CN114266292A - Operation grade determination method, device, system, equipment and medium - Google Patents


Info

Publication number
CN114266292A
CN114266292A
Authority
CN
China
Prior art keywords
grade
data
automation
functional
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111455886.5A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd filed Critical Shanghai Microport Medbot Group Co Ltd
Priority to CN202111455886.5A priority Critical patent/CN114266292A/en
Publication of CN114266292A publication Critical patent/CN114266292A/en
Priority to PCT/CN2022/135879 priority patent/WO2023098806A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Theoretical Computer Science (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pathology (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a method, a device, a system, equipment, and a medium for determining a surgery grade. The surgery grade determination method comprises the following steps: constructing an automation level classifier for surgery according to the surgical data of historical surgeries; and obtaining the automation level of the current surgery according to the surgical data of the current surgery and the automation level classifier. The surgery grade determination method can be applied to a surgical robot system to judge the automation level of an automatic surgery before it is executed, so that the automatic surgery is performed within the limits of the determined automation level, improving the safety of automatic surgery.

Description

Operation grade determination method, device, system, equipment and medium
Technical Field
The invention relates to the technical field of medical instruments, in particular to a method, a device, a system, equipment and a medium for determining a surgery grade.
Background
The design concept of surgical robots is to perform complex surgical procedures precisely and in a minimally invasive manner. Surgical robots were developed because traditional surgery faces various limitations. They break through the limits of the human eye, using three-dimensional imaging to present the organs inside the body to the operator more clearly. In narrow regions that a human hand cannot reach, a surgical robot can still steer surgical instruments to translate, swing, clamp, and rotate through 360 degrees, while avoiding hand tremor. This improves surgical accuracy and yields the advantages of smaller wounds, less bleeding, and faster postoperative recovery, greatly shortening the patient's postoperative hospital stay. Surgical robots are therefore very popular among doctors and patients and are widely used in many clinical operations.
With the development of modern medical technology, some specific surgical operations can be performed autonomously by a robot, and these autonomous operations suit certain specific surgical environments. In the prior art, however, when a robot performs a surgical operation automatically there is no safety-level restriction on the robot's autonomous operations, which easily creates safety hazards.
Disclosure of Invention
The invention aims to provide a method, a device, a system, equipment, and a medium for determining a surgery grade, so as to distinguish the conditions under which an automatic surgery may be executed and to improve the safety of automatic surgery.
To achieve the above object, the present invention provides a surgery level determination method, including:
training an automation level classifier for surgery according to the surgical data of historical surgeries; and
obtaining the target automation level of the current surgery according to the surgical data of the current surgery and the automation level classifier.
Optionally, at least one functional operation is performed in each surgery;
the training of an automation level classifier for surgery according to the surgical data of historical surgeries comprises: training an automation level classifier for each corresponding functional operation according to the target data corresponding to that functional operation in the historical surgeries;
the obtaining of the automation level of the current surgery according to the surgical data of the current surgery and the automation level classifier comprises: obtaining the target automation level of a functional operation executed in the current surgery according to the target data corresponding to that functional operation in the current surgery and the corresponding automation level classifier.
Optionally, the step of training an automation level classifier for a corresponding functional operation according to the target data corresponding to each functional operation in the historical surgeries includes:
extracting the target data corresponding to each functional operation from the surgical data of the historical surgeries, and acquiring the data features of the target data;
dividing the automation level corresponding to each functional operation in the historical surgeries to obtain an automation level division result; and
training the automation level classifier corresponding to each functional operation according to the data features of the target data corresponding to that functional operation in the historical surgeries and the automation level division result.
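The three-step training flow above can be sketched as follows. This is a minimal, hypothetical illustration, not the patent's implementation: the record layout, the field names, and the choice of a simple table-based majority-vote classifier are all assumptions.

```python
from collections import defaultdict, Counter

def train_level_classifier(historical_records, functional_op):
    """Train a per-functional-operation automation level classifier.

    Each record is assumed to hold, per functional operation, the already
    encoded data features of its target data and the automation level
    assigned during level division. The "classifier" here is simply a
    table mapping each feature vector seen in history to the automation
    level most often assigned to it.
    """
    votes = defaultdict(Counter)
    for record in historical_records:
        op = record["ops"][functional_op]
        # Data features of the target data (e.g. [2, 0, 1, 1]) vote for
        # the automation level (e.g. 1, 2, or 3) assigned to this surgery.
        votes[tuple(op["features"])][op["level"]] += 1
    return {feats: counts.most_common(1)[0][0]
            for feats, counts in votes.items()}
```

Looking up a current surgery's feature vector in the returned table would then correspond to the inference step of the claims; a real system would presumably use a trained statistical classifier that can generalize to unseen feature vectors.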
Optionally, the step of dividing the automation level corresponding to each functional operation in the historical surgeries comprises: dividing the automation level corresponding to each functional operation in the historical surgeries according to the postoperative evaluation.
Optionally, the step of dividing the automation level corresponding to each functional operation in the historical surgeries according to the postoperative evaluation includes: sorting the postoperative evaluations of the historical surgeries according to a preset rule, and dividing the automation level corresponding to each functional operation according to the sorting result.
Optionally, the step of obtaining the target automation level of a functional operation in the current surgery according to the target data corresponding to that functional operation in the current surgery and the corresponding automation level classifier includes:
extracting the target data corresponding to the functional operation from the surgical data of the current surgery, and acquiring the data features of the target data; and
inputting the data features of the target data corresponding to the functional operation in the current surgery into the corresponding automation level classifier to obtain the target automation level of the functional operation in the current surgery.
Optionally, the target data of the functional operation comprises first target data and second target data; the step of inputting the data features of the target data corresponding to the functional operation in the current surgery into the corresponding automation level classifier to obtain the target automation level of the functional operation includes:
inputting the data features of all the target data of the functional operation of the current surgery into the corresponding automation level classifier to obtain a first automation level of the functional operation of the current surgery;
inputting the data features of the second target data into the corresponding automation level classifier to obtain a second automation level of the functional operation of the current surgery; and
comparing the first automation level with the second automation level to obtain the target automation level of the functional operation of the current surgery;
wherein the first target data is data related to the automated surgery.
Optionally, the comparing of the first automation level with the second automation level to obtain the target automation level of the functional operation of the current surgery includes:
if the kinds of surgical actions allowed to be performed automatically at the first automation level are fewer than or equal to the kinds allowed at the second automation level, taking the first automation level as the target automation level of the functional operation of the current surgery; if the kinds of surgical actions allowed to be performed automatically at the first automation level are more than the kinds allowed at the second automation level, generating an error message.
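Under the numbering convention used later in the description (a higher level number permits more kinds of automatic actions: level 3 the most, level 1 the fewest), the comparison rule above reduces to a numeric check. The sketch below is an assumption about how that rule might be coded; the function name and error text are invented.

```python
def resolve_target_level(first_level: int, second_level: int) -> int:
    """Compare the two automation levels obtained from the classifier.

    Assumes a higher level number permits more kinds of automatic surgical
    actions. If the first level permits no more action kinds than the
    second, it becomes the target automation level; otherwise the error
    branch of the comparison step is taken.
    """
    if first_level <= second_level:
        return first_level
    raise ValueError("first automation level exceeds second automation level")
```

For example, `resolve_target_level(1, 2)` returns level 1, while `resolve_target_level(3, 1)` raises, corresponding to the error message in the claim.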
Optionally, the target data includes at least one of the operational difficulty of the corresponding surgery, the intraoperative bleeding volume, the complexity of the surgical environment, and the criticality of the patient's condition.
Optionally, the operation level determination method further includes: displaying the target automation level of the current surgery on a display unit.
Optionally, before the step of training the automation level classifier for surgery according to the surgical data of historical surgeries, the method further comprises: acquiring the surgical data of the historical surgeries and storing it in a structured form.
Optionally, after obtaining the target automation level of the current surgery according to the surgical data of the current surgery and the automation level classifier, the method further includes:
sending the target automation level to a surgical operation device, so that the surgical operation device executes the corresponding surgical operation according to the target automation level.
To achieve the above object, the present invention also provides a surgery level determination apparatus, comprising:
a training module for training an automation level classifier for surgery according to the surgical data of historical surgeries; and
a determining module for obtaining the target automation level of the current surgery according to the surgical data of the current surgery and the automation level classifier.
Optionally, the training module comprises:
an acquisition unit for acquiring the target data corresponding to each functional operation from the surgical data of the historical surgeries and acquiring the data features of the target data;
a dividing unit for dividing the automation level corresponding to each functional operation in the historical surgeries to obtain an automation level division result; and
a training unit for training the corresponding automation level classifier according to the data features of the target data corresponding to each functional operation in the historical surgeries and the automation level division result.
In order to achieve the above object, the present invention further provides a surgical robot system, which includes a surgical operation device and the surgical grade determination device as described above, wherein the surgical operation device is communicatively connected to the surgical grade determination device and is configured to perform a corresponding surgical operation according to the target automation grade of the current surgery.
To achieve the above object, the present invention further provides an electronic device, which includes a memory and a processor, wherein the memory stores a computer program operable on the processor, and the processor executes the computer program to implement the operation level determination method according to any one of the preceding claims.
To achieve the above object, the present invention also provides a computer-readable storage medium having a program stored thereon, which when executed, performs the operation level determination method as described in any one of the preceding items.
Compared with the prior art, the surgery grade determination method, device, system, equipment, and medium of the present invention have the following advantages:
The surgery grade determination method includes the steps of: training a classifier of the surgical automation level according to the target data of historical surgeries; and obtaining the automation level of the current surgery according to the target data of the current surgery and the classifier. When applied to a surgical robot system, the method distinguishes the conditions under which the automatic operations of the current surgery may be executed, so that the surgical operation device performs automatic operations within the limits of the corresponding automation level, improving the safety of automatic surgery.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting it. In the drawings:
FIG. 1 is a schematic view of an application scenario of a surgical robot system;
FIG. 2 is a schematic diagram of the surgeon-side control device of the surgical robot system;
FIG. 3 is a schematic structural diagram of the surgical operation device of the surgical robot system and the buzzer arranged thereon;
FIG. 4 is a schematic diagram of the image display device of the surgical robot system and the voice prompt system arranged thereon;
FIG. 5 is a flowchart of a surgery grade determination method according to an embodiment of the present invention;
FIG. 6 is a schematic view of the surgical data of historical surgeries obtained in the surgery grade determination method according to an embodiment of the present invention;
FIG. 7 is a flowchart of training a classifier of the surgical automation level in the surgery grade determination method according to an embodiment of the present invention;
FIG. 8 is a flowchart of obtaining the target data of the current surgery and acquiring the data features of the target data according to the first embodiment of the present invention;
FIG. 9 is a schematic view of the surgical data of the current surgery obtained in the surgery grade determination method according to the first embodiment of the present invention;
FIG. 10 is a schematic illustration of the processing of the surgical data of the current surgery in the surgery grade determination method according to an embodiment of the present invention;
FIG. 11 is a flowchart of determining the automation level of the current surgery from the surgical data of the current surgery according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of the automation level of the current surgery displayed by the image display device or the display of the surgeon-side control device in the surgery grade determination method according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of the structure of the control unit of a surgical robot system according to the fourth embodiment of the present invention, and of the connections between the control unit and the display unit and between the control unit and the surgical execution device;
FIG. 14 is a flowchart of a surgical robot system performing an automated surgery according to the fourth embodiment of the present invention;
FIG. 15 is a schematic illustration of the image display device or the display of the surgeon-side control device of a surgical robot system according to the fourth embodiment of the present invention displaying the current automated surgical operation.
[Description of reference numerals]:
10 - surgeon-side control device; 20 - patient-side control device; 30 - surgical operation device; 31a - tool arm; 31b - image arm; 40 - image display device; 41 - voice prompt system; 50 - control unit; 51 - training module; 51a - acquisition unit; 51b - dividing unit; 51c - training unit; 52 - determining module; 53 - data storage module; 60 - surgical instrument.
Detailed Description
The embodiments of the present invention are described below with reference to specific examples, and other advantages and effects of the present invention will be readily understood by those skilled in the art from the disclosure of this specification. The invention may also be practiced or applied through other, different embodiments, and the details in this specification may be modified or changed in various respects without departing from the spirit and scope of the present invention. It should be noted that the drawings provided with these embodiments only illustrate the basic idea of the invention: they show only the components related to the invention rather than the number, shape, and size of the components in an actual implementation, in which the type, quantity, and proportion of the components may vary freely and the layout may be more complicated.
Furthermore, each of the embodiments described below has one or more technical features; this does not mean that implementing the invention requires using all of the technical features of any one embodiment at the same time, or only some or all of the technical features of different embodiments separately. In other words, those skilled in the art may, according to the disclosure of the present invention and depending on design specifications or implementation requirements, selectively implement some or all of the technical features of any one embodiment, or combinations of some or all of the technical features of multiple embodiments, thereby increasing the flexibility of implementing the invention.
As used in this specification, the singular forms "a", "an", and "the" include plural referents, and "a plurality" means two or more, unless the content clearly dictates otherwise. The term "or" is generally employed in a sense including "and/or" unless the content clearly dictates otherwise. The terms "mounted", "connected", and "coupled" are to be construed broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; and direct, or indirect through an intervening medium or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
To further clarify the objects, advantages, and features of the present invention, a more particular description of the invention is given below with reference to the appended drawings. It should be noted that the drawings are in a greatly simplified form and not to precise scale, and are used only to aid a convenient and clear description of the embodiments of the present invention. The same or similar reference numbers in the drawings denote the same or similar elements.
<Embodiment One>
Fig. 1 shows a schematic view of an application scenario of a surgical robot system, and figs. 2 to 4 show schematic structural views of different devices in the surgical robot system. As shown in figs. 1 to 4, the surgical robot system includes a control end and an execution end. The control end includes a surgeon console and a surgeon-side control device 10 disposed on it, and the surgeon-side control device 10 has an immersive display. The execution end includes a patient-side control device 20 (labeled in fig. 13), a surgical operation device 30, an image display device 40, and the like. The surgical operation device 30 includes a plurality of robot arms, at least one of which serves as a tool arm 31a and at least one of which serves as an image arm 31b. The tool arm 31a is used to mount a surgical instrument 60 for performing surgical operations inside the patient. The image arm 31b is used to mount an image acquisition device, such as an endoscope (not shown in the figures), which enters the patient's body and acquires image information inside it.
The surgical operation device 30 is also provided with a buzzer (not shown in the figures). The image display device 40 is communicatively connected to the endoscope and is configured to receive and display the images acquired by the endoscope, so that the medical staff can observe the conditions inside the patient. The image display device 40 is further provided with a voice prompt system 41. Alternatively, the surgical robot system may be a master-slave mapping robot system; that is, the surgeon-side control device 10 further includes a master manipulator that has a predetermined mapping relationship with the surgical operation device 30, so that a master-slave relationship can be established between them and the tool arm 31a and surgical instrument 60 can move in various directions according to the motions of the master manipulator. Moreover, in some cases the master-slave relationship between the master manipulator and the surgical operation device 30 may be broken, and the surgical operation device 30 may be controlled directly or indirectly by other means to perform automatic surgical operations.
Before the surgical operation device 30 performs an automatic surgical operation, a control unit 50 (labeled in fig. 13) executes a surgery grade determination method to determine the automation level of the current surgery, so that the surgical operation device 30 performs the automatic surgical operation within the limits of the determined automation level, thereby improving the safety of the surgery.
As shown in fig. 5, the operation level determination method includes the steps of:
step S10: an automated grade classifier for the procedure is trained based on the procedure data of the historical procedure. And the number of the first and second groups,
step S20: and obtaining the target automation level of the current operation according to the operation data of the current operation and the automation level classifier.
Further, the surgery level determination method includes step S00, step S30, and step S40. Step S00 is performed before step S10 and includes: obtaining the surgical data of historical surgeries and storing it in a structured form to establish a surgical information database of historical surgeries. Step S30 is performed after step S20 and includes: displaying the automation level of the current surgery on a display unit. Step S40 may be performed in synchronization with step S30 and includes: sending the target automation level of the current surgery to the surgical operation device 30, so that the surgical operation device 30 executes the corresponding operation according to the target automation level.
Next, the operation grade determination method will be described in more detail herein.
In step S00, as shown in fig. 6, the surgical data of historical surgeries may be the relevant surgical data of each of a number of previous surgeries, including but not limited to: automated surgery information, patient information, surgery information, and operating room information. The automated surgery information includes, but is not limited to, the intervention situation of the automated surgery, the operational difficulty of the automated surgery, the usage information of the automated surgery, and the like. The patient information includes, but is not limited to, the patient's vital signs, the criticality of the patient's condition, the patient's postoperative recovery, and the like. The surgery information includes, but is not limited to, the duration of the surgery, various image information during the surgery, motion information of the tool arm 31a and/or the image arm 31b, the intraoperative bleeding volume, the postoperative evaluation, and the like. The operating room information includes, but is not limited to, the equipment and layout of the operating room.
The manner of acquiring the surgical data of historical surgeries is not particularly limited in this embodiment of the invention. For example, the data may be automatically stored in real time by the control unit 50 of the robot system during each historical surgery, or manually input into the control unit 50 by medical staff after a surgery ends, or transmitted to the control unit 50 by other control mechanisms in a wired or wireless manner.
Those skilled in the art will appreciate that each surgery performs at least one functional operation (a first functional operation, a second functional operation, a third functional operation, and so on), where a functional operation is a specific surgical operation such as cutting or stapling. Thus, step S10 actually means training the automation level classifier of each corresponding functional operation based on the target data of that functional operation in the historical surgeries: for example, training the classifier of the first functional operation on the target data of the first functional operation in the historical surgeries, the classifier of the second functional operation on the target data of the second functional operation, the classifier of the third functional operation on the target data of the third functional operation, and so on.
As shown in fig. 7, the step S10 may include the following steps:
step S11: and extracting the target data corresponding to each functional operation from the operation data of the historical operation, and acquiring the data characteristics of the target data.
Step S12: automatically ranking each of the functional operations of the historical procedure.
Step S13: and training a corresponding automatic grade classifier according to the data characteristics of the target data of each functional operation and the automatic grade of the corresponding functional operation.
Here, the target data in step S11 refers to data that can affect the automation level of a functional operation. Optionally, the target data may be at least one of the operational difficulty of the corresponding surgery, the intraoperative bleeding volume, the complexity of the surgical environment, and the criticality of the patient's condition. The target data can be determined by big-data analysis of the surgical data of all historical surgeries; the specific analysis method is known to those skilled in the art and is not described here. Of course, in some cases the target data may instead be chosen by medical personnel according to actual conditions and experience.
In an alternative embodiment, the data characteristics of the target data refer to the levels of the corresponding target data, for example, the data characteristics of the target data have three levels, i.e., high, medium, and low, and the data characteristics of different levels are labeled with different numbers. Alternatively, the target data is marked with a numeral 2 when the data characteristic is a high level, with a numeral 1 when the data characteristic is a medium level, and with a numeral 0 when the data characteristic is a low level. For example, in the embodiment of the present invention, when the operation difficulty of the automated surgery is high, the data of the operation difficulty of the automated surgery is characterized by a high level and is labeled as 2; when the operation difficulty of the automatic operation is moderate, the data characteristic of the operation difficulty of the automatic operation is of a medium level and is marked as 1; when the automation operation difficulty is small, the data of the operation difficulty of the automation operation is characterized by a low level and is marked as 0. Similarly, when the intraoperative hemorrhage amount is large, the data characteristic of the intraoperative hemorrhage amount is labeled as 2, when the intraoperative hemorrhage amount is medium, the data characteristic of the intraoperative hemorrhage amount is labeled as 1, and when the intraoperative hemorrhage amount is small, the data characteristic of the intraoperative hemorrhage amount is labeled as 0. When the surgical environment is complex, the data characteristic of the complexity of the surgical environment is labeled as 2, when the complexity of the surgical environment is moderate, the data characteristic of the complexity of the surgical environment is labeled as 1, and when the surgical environment is simple, the data characteristic of the complexity of the surgical environment is labeled as 0. 
Likewise, the data characteristic of the criticality of the patient's condition is labeled 2 when the condition is critical, 1 when it is moderately critical, and 0 when it is not critical. The skilled person can determine the grading standard of the data characteristics by performing big data analysis on the target data of all historical surgeries; the specific analysis method is known to those skilled in the art and is not described here.
In step S12, each functional operation of the historical surgeries may be assigned an automation level according to the postoperative evaluation of the historical surgery. Specifically, the automation level may be divided into three levels, i.e., level 3, level 2, and level 1; the lower the level, the fewer the types of actions that the surgical operation device 30 is allowed to perform automatically. That is, at level 3 the surgical operation device 30 is allowed to automatically perform the most types of actions, at level 2 the second most, and at level 1 the fewest. When ranking the functional operations in the historical surgeries, the control unit 50 first sorts the postoperative evaluations of all historical surgeries according to a predetermined rule, for example from good to bad for the evaluation of a designated functional operation, and then assigns automation level 3 to the designated functional operation in the historical surgeries ranked in the top 30%, level 2 to those ranked from 31% to 70%, and level 1 to those ranked from 71% to 100%.
It is to be understood that if the predetermined rule sorts the historical surgeries from bad to good for the evaluation of the designated functional operation, the designated functional operation in the historical surgeries ranked in the top 30% may be assigned automation level 1, those ranked from 31% to 70% level 2, and those ranked from 71% to 100% level 3.
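The percentile-based division described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function name, the score dictionary, and the use of a single numeric post-operative score per surgery are all assumptions.

```python
# Hypothetical sketch of the percentile split: sort surgeries by post-operative
# evaluation (best first) and assign automation levels 3 / 2 / 1 to the
# top 30%, the 31%-70% band, and the 71%-100% band respectively.

def rank_automation_levels(evaluations):
    """evaluations: {surgery_id: post-op score}. Returns {surgery_id: level}."""
    ordered = sorted(evaluations, key=evaluations.get, reverse=True)  # best first
    n = len(ordered)
    levels = {}
    for rank, surgery_id in enumerate(ordered):
        percentile = (rank + 1) / n  # fraction of surgeries at or above this rank
        if percentile <= 0.30:
            levels[surgery_id] = 3   # top 30%: most action types may be automated
        elif percentile <= 0.70:
            levels[surgery_id] = 2   # 31% to 70%
        else:
            levels[surgery_id] = 1   # 71% to 100%: fewest automated action types
    return levels

scores = {"op1": 9.1, "op2": 8.4, "op3": 7.0, "op4": 6.2, "op5": 3.5}
print(rank_automation_levels(scores))
```

Reversing the sort order (worst first) and swapping levels 3 and 1 gives the alternative rule described in the text.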
Taking as an example the case where each functional operation has four target data, namely the operation difficulty of the automated surgery, the intraoperative bleeding amount, the complexity of the surgical environment, and the criticality of the patient's condition, Table 1 below can be obtained after step S12 is completed, where the first functional operation is numbered 1, the second functional operation is numbered 2, and the third functional operation is numbered 3.
TABLE 1
[Table 1 image not reproduced in the source: it lists, per historical surgery, the data characteristics of the four target data and the assigned automation level of functional operations 1 to 3.]
As can be seen from Table 1, the same functional operation may have a different automation level from surgery to surgery (e.g., the first functional operation has an automation level of 3 in one historical surgery and an automation level of 2 in another). Therefore, the control unit 50 executes step S13, that is, the mapping relationship between the data characteristics of the target data of a functional operation and the corresponding automation level is obtained by training the automation level classifier.
For simplicity, only the first functional operation in each historical surgery is described here as an example. The data characteristics of the target data of the first functional operation in ten historical surgeries, and the automation level of the first functional operation in each of these surgeries, are given in Table 2.
TABLE 2
[Table 2 images not reproduced in the source: they list, for ten historical surgeries, the data characteristics of the four target data and the automation level of the first functional operation.]
In step S13, the automation level classifier may be trained using any suitable classification algorithm, including but not limited to a decision tree algorithm, a naive Bayes algorithm, a logistic regression algorithm, a neural network algorithm, and the like.
Here, the training of the classifier is performed using a naive Bayes algorithm. The first functional operation has K automation levels (K = 3 in this embodiment), expressed as Y = (y1, y2, …, yK), and the target data has D categories (D = 4 in this embodiment), expressed as X = (x1, x2, …, xD). According to the naive Bayes algorithm, the probability of each automation level of the first functional operation is calculated separately according to the following formula (I), i.e., the probability P(yk|X) is calculated for each of y1, y2, …, yK. Formula (I) is:
P(yk|X) = P(X|yk) · P(yk) / Σj=1..K [ P(X|yj) · P(yj) ]    (I)
wherein all the target data are assumed to be independent of each other, and P(X|yk) is calculated according to the following formula (II):
P(X|yk) = Πd=1..D P(xd|yk)    (II)
The automation level corresponding to the maximum probability value is then taken as the automation level of the functional operation, as given by formula (III):
yk* = arg max P(yk|X), yk ∈ Y    (III)
The first functional operation in all the historical surgeries in Table 2 is calculated according to formulas (I), (II), and (III) until the training of the automation level classifier of the first functional operation is completed, and the classifier is then stored. It will be appreciated that the automation level classifiers of the other functional operations may be obtained by the control unit 50 in the same way.
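Formulas (I) through (III) amount to a standard categorical naive Bayes classifier, which can be sketched from scratch as follows. This is an illustrative implementation, not the patent's code; the training samples below are invented (they are not Table 2), and Laplace smoothing is omitted for brevity, so an unseen feature value zeroes out a class.

```python
# Categorical naive Bayes in the spirit of formulas (I)-(III): features are the
# 0/1/2 data characteristics, labels are automation levels.
from collections import Counter, defaultdict

def train_nb(samples, labels):
    """samples: list of feature tuples; labels: automation levels.
    Returns class priors P(yk) and a likelihood function P(X|yk)."""
    n = len(labels)
    per_class = Counter(labels)
    priors = {y: c / n for y, c in per_class.items()}       # P(yk)
    cond = defaultdict(int)                                  # count of (d, x_d, yk)
    for x, y in zip(samples, labels):
        for d, v in enumerate(x):
            cond[(d, v, y)] += 1

    def likelihood(x, y):
        # Formula (II): P(X|yk) = prod_d P(x_d|yk), features assumed independent
        p = 1.0
        for d, v in enumerate(x):
            p *= cond[(d, v, y)] / per_class[y]
        return p

    return priors, likelihood

def predict(priors, likelihood, x):
    # Formula (III): pick the level maximizing P(yk) * P(X|yk)
    # (the numerator of formula (I); the denominator is the same for all yk)
    return max(priors, key=lambda y: priors[y] * likelihood(x, y))

samples = [(2, 2, 1, 2), (1, 1, 1, 1), (0, 0, 1, 0), (1, 2, 2, 1), (0, 1, 0, 0)]
labels = [1, 2, 3, 1, 3]
priors, lik = train_nb(samples, labels)
print(predict(priors, lik, (0, 0, 0, 0)))  # level 3: all-low features match level-3 samples
```

In practice one classifier of this kind would be trained per functional operation, as the text goes on to explain.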
As can be seen from the above description, each classifier is directed to one functional operation. Therefore, step S20 actually obtains the automation level of each functional operation of the current surgery from the target data of that functional operation and its corresponding automation level classifier. For example, the automation level of the first functional operation of the current surgery is obtained from the target data of the first functional operation and the automation level classifier of the first functional operation, the automation level of the second functional operation from the target data of the second functional operation and the automation level classifier of the second functional operation, and so on.
Thus, as shown in fig. 8, the step S20 may specifically include the following steps:
step S21: surgical data is acquired for a current procedure.
Step S22: extract the target data of each functional operation from the surgical data of the current surgery and acquire the data characteristics of the target data; and,
step S23: input the data characteristics of the target data of each functional operation of the current surgery into the corresponding automation level classifier to obtain the corresponding automation level of each functional operation.
As shown in fig. 9 and 10, in step S21 the surgical data of the current surgery includes preoperative data and/or intraoperative data. The preoperative data includes patient information, surgery information, automated-surgery information, and operating room information: the patient information includes the patient's physical signs and the criticality of the patient's condition, the surgery information includes the estimated surgery duration and the estimated intraoperative bleeding amount, and the automated-surgery information includes the estimated difficulty of the automated operation. The intraoperative data is real-time surgical data, including image information of the inside of the patient's body collected by the endoscope, motion information of the mechanical arm of the surgical operation device 30, automated-surgery information, the intraoperative bleeding amount, and the like. The surgical information for the current surgery may be obtained in any suitable manner. Those skilled in the art will appreciate that determining the automation level of the current surgery from preoperative data is actually a prediction, made before the surgery, of the automation level of a functional operation.
The target data in step S21 is the same as the target data in step S11, i.e., at least one of the operation difficulty of the automated surgery, the intraoperative bleeding amount, the complexity of the surgical environment, and the criticality of the patient's condition.
Referring back to fig. 8, the step S22 actually includes:
step S22 a: cleaning and denoising operation data of the current operation to remove non-target data and keep the target data of the functional operation of the current operation. And the number of the first and second groups,
step S22 b: and obtaining and storing the data characteristics of the target data of the current operation through big data analysis.
In a preferred embodiment, the target data of the current surgery is divided into two parts, namely first target data and second target data, where the first target data is the automated-surgery-related data. Which data are automated-surgery-related may be determined by medical staff according to actual conditions; in some embodiments, when the target data includes the operation difficulty of the automated surgery, the intraoperative bleeding amount, the complexity of the surgical environment, and the like, the automated-surgery-related data is the operation difficulty of the automated surgery.
On this basis, as shown in fig. 11, the step S23 may include the following steps:
step S23 a: inputting data characteristics of all target data (namely the first target data and the second target data) of the functional operation of the current operation into a corresponding automatic classifier, and obtaining a first automation level of the functional operation of the current operation.
Step S23 b: inputting the data characteristics of the second target data of the functional operation of the current operation into a corresponding automatic grade classifier, and obtaining a second automatic grade of the functional operation of the current operation.
Step S23 c: comparing the first automation level and the second automation level and obtaining an automation level of the functional operation of the current procedure.
In step S23c, if the number of types of surgical actions that the surgical operation device 30 is allowed to perform automatically at the first automation level is less than or equal to the number allowed at the second automation level, the automation level of the functional operation of the current surgery is determined to be the first automation level; if it is greater, an error message is generated to prompt the medical staff that the functional operation cannot be performed at the first automation level. The purpose of this is to assess, from multiple dimensions, the safety of automatically executing the functional operation of the current surgery.
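Since a lower level number permits fewer automatically executed action types in this scheme, step S23c reduces to a simple numeric comparison. The function name and error text below are illustrative assumptions, not from the patent.

```python
# Sketch of step S23c: accept the first automation level (predicted from all
# target data) only when it is no more permissive than the second automation
# level (predicted after dropping the automated-surgery-related data).

def resolve_automation_level(first_level, second_level):
    """Levels are 1..3; a lower number allows fewer automated action types."""
    if first_level <= second_level:  # fewer or equal automated action types
        return first_level
    raise ValueError(
        "functional operation cannot run at level %d: more permissive than "
        "level %d derived from the second target data" % (first_level, second_level)
    )

print(resolve_automation_level(2, 2))  # both predictions agree
```

A raised error here would correspond to the error message that prompts medical staff to intervene.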
For example, assume that the data characteristics of the four target data of the first functional operation in the current surgery are all 1, i.e., the data characteristic of the operation difficulty of the automated surgery is 1, that of the intraoperative bleeding amount is 1, that of the complexity of the surgical environment is 1, and that of the criticality of the patient's condition is 1. The automation level classifier of the first functional operation is trained from the historical surgical data shown in Table 2, and the automation level of the first functional operation in the current surgery is then calculated with this classifier. The process is as follows:
First, the probability of the first functional operation at each automation level is calculated:
[Probability calculation images not reproduced in the source; the resulting probabilities are stated in the following paragraph.]
That is, in the current surgery, the probabilities that the first automation level of the first functional operation is level 1, level 2, and level 3 are 0.444, 0.579, and 0, respectively, from which it can be determined that the first automation level of the first functional operation is level 2.
Next, the target data of the operation difficulty of the automated surgery is removed, and the probability of the first functional operation at each automation level is calculated again:
[Probability calculation images not reproduced in the source; the resulting probabilities are stated in the following paragraph.]
That is, in the current surgery, the probabilities that the second automation level of the first functional operation is level 1, level 2, and level 3 are 0.296, 0.926, and 0, respectively, from which it can be determined that the second automation level of the first functional operation is level 2. Thus, the target automation level of the first functional operation in the current surgery is level 2.
In step S30, the display unit includes the image display device 40 and/or the immersive display of the doctor-side control device 10; that is, as shown in fig. 12, the image display device 40 displays the automation level of the functional operation of the current surgery, or the immersive display of the doctor-side control device 10 displays it (not shown in the figure). In addition, other information, such as an automated-surgery level function switch, may be displayed on the display unit; when this switch is turned on before or during the surgery, the control unit 50 determines the automation level of the corresponding functional operation from the surgical data of the current surgery (a prediction from preoperative data, or a determination from intraoperative data).
In this embodiment, before the surgical robot system performs the automatic surgery, the surgical grade determination method is performed first, and the target automation grade of the current surgery can be determined in advance, so that the surgical operation device can perform the corresponding surgical operation at the determined target automation grade, thereby improving the safety and reliability of the automatic surgery, and avoiding unnecessary injuries to the patient.
< example two >
The present embodiment provides a surgery level determining apparatus, as shown in fig. 13, the surgery level determining apparatus includes a training module 51 and a determining module 53, and further preferably includes a storage module 52, the training module 51 is configured to train an automation level classifier of the surgery, the storage module 52 is configured to store the automation level classifier, and the determining module 53 is configured to obtain a target automation level of the current surgery according to surgery data of the current surgery and the automation level classifier. Alternatively, the surgical grade determination device may be integrated into a control mechanism, such as the control unit 50 of the surgical robotic system.
Further, the training module 51 includes an obtaining unit 51a, a dividing unit 51b, and a training unit 51 c.
The obtaining unit 51a is configured to obtain target data corresponding to each functional operation from operation data of a historical operation, and obtain data characteristics of the target data.
The dividing unit 51b is configured to divide the automation level corresponding to each functional operation in the historical surgeries. In more detail, the dividing unit 51b divides the automation level corresponding to each functional operation according to the postoperative evaluation: for example, the postoperative evaluations of the historical surgeries may be sorted according to a predetermined rule, and the dividing unit 51b may then divide the automation level corresponding to each functional operation according to the sorting result.
The training unit 51c is configured to train the corresponding automation level classifier according to the data characteristics of the target data corresponding to each functional operation in the historical surgeries and the automation level division result. After the obtaining unit 51a obtains the data characteristics of the target data and the training unit 51c trains the corresponding automation level classifier, the determining module 53 obtains the target automation level of a functional operation of the current surgery by inputting the data characteristics of the target data corresponding to that functional operation into the corresponding automation level classifier.
Optionally, the target data includes first target data and second target data, and may specifically include the operation difficulty of the corresponding surgery, the intraoperative bleeding amount, the complexity of the surgical environment, the criticality of the patient's condition, and the like, where the first target data is the automated-surgery-related data, for example, the operation difficulty of the corresponding surgery.
When determining the target automation level of a functional operation of the current surgery, the determining module 53 first obtains a first automation level of the functional operation by inputting the data characteristics of all the target data of that functional operation into the corresponding automation level classifier, then obtains a second automation level by inputting only the second target data into the corresponding automation level classifier, and finally compares the first automation level and the second automation level to obtain the target automation level of the functional operation.
Furthermore, the determining module 53 may be communicatively connected to the display unit (e.g., the image display device 40 and/or the doctor-side control device 10) and to the surgical operation device 30, so that after determining the target automation level of the current surgery it can transmit the target automation level to the display unit for display and to the surgical operation device 30, which then performs the corresponding surgery at the determined target automation level.
Furthermore, the obtaining unit 51a is also used for acquiring historical surgical data. Meanwhile, the storage module 52 is communicatively connected to the obtaining unit 51a for structured storage of the surgical data of the historical surgeries.
< example three >
The present embodiment provides a surgical robot system including not only a control terminal and an execution terminal as shown in fig. 1 but also a surgical grade determination device as shown in fig. 13. The surgical grade determination means may be integrated in the control unit 50 of the surgical robotic system.
It should be noted that the specific arrangement of the control unit 50 is not limited in the embodiment of the present invention: the control unit 50 may be disposed entirely in the doctor-side control device 10, entirely in the patient-side control device 20, partly in each, or independently of both, as long as it can perform the corresponding functions. When the control unit 50 is independent of the patient-side control device 20 and the doctor-side control device 10, it is also communicatively connected to both.
It should be noted that, since the current surgery also becomes a historical surgery for the next surgery, after the current surgery is finished, the obtaining unit 51a is further configured to obtain postoperative surgical data of the current surgery to update the surgical data information base (as shown in fig. 9 and 10) of the historical surgery, wherein the postoperative surgical data includes postoperative evaluation, postoperative recovery of the patient, and the like.
In addition, while the surgical robot system performs the automated surgery, information on starting or ending the automated surgery may be broadcast by the voice prompt system, and the control unit 50, the doctor-side control device, or the patient-side control device may further determine whether the current automated surgical action is safe and its corresponding danger level, and give an alarm through the voice prompt system and/or a buzzer.
Fig. 14 shows a flow chart of the surgical robotic system performing an automated procedure. After determining to perform the automatic surgery, as shown in fig. 14, the flow of the automatic surgery includes:
step S1: the voice prompt system performs voice broadcast to prompt medical personnel to perform automatic surgical operation by the surgical operation device 30 next. The content of the voice broadcast may be, for example, "an automatic operation is to be performed at a certain level" or the like.
Step S2: the surgical manipulation device 30 performs automated surgical manipulations within the limits of the determined level of automation.
Step S3: determine in real time whether the current surgical action is safe; if so, the buzzer does not give an alarm; if not, execute step S4.
Step S4: determine the danger degree of the current surgical action; if the danger degree is high, the buzzer gives a first alarm at a first frequency to prompt intervention, and if the danger degree is low, the buzzer gives a second alarm at a second frequency.
Step S5: the medical staff intervenes manually and decides whether to use the emergency stop function to exit the automated surgery; if not, return to step S2; if so, execute step S6. The activation switch of the emergency stop function may be displayed on the immersive display of the doctor-side control device 10.
Step S6: exit the automated surgery and perform a voice broadcast.
Steps S3 and S4 may be performed using safety protection measures of prior-art surgical robot systems, such as determining in step S3 whether the current surgical action is safe by checking whether the movement speed of the surgical instrument 60 exceeds a limit, and determining in step S4 the risk level of the current surgical action according to the degree by which the movement speed of the surgical instrument 60 exceeds that limit.
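One concrete form of the speed-based check just mentioned can be sketched as follows. The numeric speed limit and the 1.5x high-risk factor are assumed values chosen for illustration, not parameters from the patent.

```python
# Sketch of steps S3-S4: flag the action as unsafe when instrument speed
# exceeds a limit, and grade the risk by how far the limit is exceeded.

SPEED_LIMIT_MM_S = 50.0  # assumed instrument speed limit

def assess_action(speed_mm_s):
    """Returns ('safe', None) or ('unsafe', 'low'|'high')."""
    if speed_mm_s <= SPEED_LIMIT_MM_S:
        return "safe", None            # step S3: no alarm
    # step S4: grade the over-limit; 'high' maps to the first alarm frequency,
    # 'low' to the second alarm frequency
    risk = "high" if speed_mm_s > 1.5 * SPEED_LIMIT_MM_S else "low"
    return "unsafe", risk

print(assess_action(40.0))
print(assess_action(60.0))
print(assess_action(90.0))
```

A real system would combine several such checks (force, workspace bounds, etc.) rather than speed alone.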
In addition, as shown in fig. 15, during the automatic operation, the display unit, such as the image display device 40, may also display the image information of the patient's body collected by the endoscope, so that the medical staff can visually observe the current automatic operation.
< example four >
The present embodiment provides a computer-readable storage medium on which a program is stored, which, when executed, performs the procedure level determination method as provided in the first embodiment.
The computer-readable storage medium may include, but is not limited to, a portable disk, a hard disk, random access memory, read-only memory, erasable programmable read-only memory, optical storage, magnetic storage, or any suitable combination of the foregoing.
< example five >
The present embodiment provides an electronic device, which includes a memory and a processor, where the memory stores a computer program executable on the processor, and the processor is configured to execute the computer program to implement the operation level determination method according to the first embodiment.
The components of the electronic device include, but are not limited to, at least one memory and at least one processor, and may also include a display unit and a bus connecting these components.
Wherein the bus comprises a data bus, an address bus and a control bus.
The memory may include volatile memory, such as Random Access Memory (RAM) and/or cache memory, and may further include read-only memory (ROM). The memory may also include a program/utility having a set (at least one) of program modules, including but not limited to an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination may include an implementation of a network environment.
The processor executes various functional applications and data processing, such as a surgery level determination method provided in the first embodiment of the present invention, by executing the computer program stored in the memory.
The electronic device may also communicate with one or more external devices, such as a keyboard, pointing device, etc. Such communication may be through an input/output (I/O) interface. Also, the electronic device may communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, through a network adapter. In addition, other hardware and/or software modules may be used in conjunction with the electronic device, including but not limited to microcode, device drivers, redundant processors, external disk drive arrays, disk array systems, tape drives, and data backup storage systems.
It should be noted that although the above detailed description mentions several units/modules or sub-units/modules of the electronic device, this division is only exemplary and not mandatory. Indeed, according to embodiments of the invention, the features and functionality of two or more units/modules described above may be embodied in one unit/module; conversely, the features and functions of one unit/module described above may be further divided among a plurality of units/modules.
Although the present invention is disclosed above, it is not limited thereto. Various modifications and alterations of this invention may be made by those skilled in the art without departing from the spirit and scope of this invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (17)

1. A method of determining a procedure grade, comprising:
training an automation level classifier of the surgery according to surgical data of historical surgeries; and,
and obtaining the target automation level of the current operation according to the operation data of the current operation and the automation level classifier.
2. The procedure grade determination method according to claim 1, wherein at least one functional operation is performed in each procedure;
the automated level classifier for training a procedure based on procedure data of a historical procedure comprises: training an automatic grade classifier of corresponding functional operation according to target data corresponding to each functional operation in the historical operation;
the obtaining of the automation level of the current operation according to the operation data of the current operation and the automation level classifier comprises: and obtaining the target automation level of the functional operation executed by the current operation according to the target data corresponding to the functional operation in the current operation and the corresponding automation level classifier.
3. The method of claim 2, wherein the step of training an automated level classifier of each functional operation in the historical procedure based on the target data corresponding to the functional operation comprises:
extracting the target data corresponding to each functional operation from the operation data of the historical operation, and acquiring the data characteristics of the target data;
dividing the automation level corresponding to each functional operation in the historical surgeries to obtain an automation level division result; and,
and training the automatic grade classifier corresponding to the corresponding functional operation according to the data characteristics of the target data corresponding to each functional operation in the historical operation and the automatic grade classification result.
4. The procedure grade determination method according to claim 3, wherein the step of ranking the automation grade corresponding to each of the functional operations in the historical procedures comprises: and dividing the automation grade corresponding to each functional operation in the historical operation according to the postoperative evaluation.
5. The procedure grade determination method according to claim 4, wherein the step of ranking the automation grade corresponding to each of the functional operations in the historical procedure according to the post-operative assessment comprises: and sequencing the postoperative evaluation of the historical operation according to a preset rule, and dividing the automation level corresponding to each functional operation according to a sequencing result.
6. The method for determining the operation level according to claim 2, wherein the step of obtaining the target automation level of the functional operation in the current operation according to the target data corresponding to the functional operation in the current operation and the corresponding automation level classifier comprises:
extracting the target data corresponding to the functional operation from the surgical data of the current surgery, and acquiring the data characteristics of the target data; and,
inputting the data characteristics of the target data corresponding to the functional operation in the current operation into the corresponding automatic grade classifier so as to obtain the target automatic grade of the functional operation in the current operation.
7. The procedure grade determination method according to claim 6, wherein the target data of the functional operation comprise first target data and second target data, and wherein the step of inputting the data features of the target data corresponding to the functional operation in the current procedure into the corresponding automation grade classifier to obtain the target automation grade of the functional operation comprises:
inputting the data features of all the target data of the functional operation of the current procedure into the corresponding automation grade classifier to obtain a first automation grade of the functional operation of the current procedure;
inputting the data features of the second target data into the corresponding automation grade classifier to obtain a second automation grade of the functional operation of the current procedure; and
comparing the first automation grade with the second automation grade to obtain the target automation grade of the functional operation of the current procedure;
wherein the first target data are data related to automated surgery.
8. The procedure grade determination method according to claim 7, wherein comparing the first automation grade with the second automation grade to obtain the target automation grade of the functional operation of the current procedure comprises:
determining the first automation grade to be the target automation grade of the functional operation of the current procedure if the types of surgical action allowed to be performed automatically at the first automation grade are fewer than or equal to the types allowed at the second automation grade; and generating an error message if the types of surgical action allowed to be performed automatically at the first automation grade exceed the types allowed at the second automation grade.
9. The procedure grade determination method according to claim 2, wherein the target data comprise at least one of: the operative difficulty of the corresponding procedure, the amount of intraoperative bleeding, the complexity of the surgical environment, and the criticality of the patient's condition.
10. The procedure grade determination method according to claim 1, further comprising: displaying the target automation grade of the current procedure on a display unit, or announcing the target automation grade by voice prompt.
11. The procedure grade determination method according to claim 1, further comprising, before the step of training the automation grade classifier of a procedure according to the procedure data of historical procedures: acquiring the procedure data of the historical procedures, and storing the procedure data of the historical procedures in a structured form.
12. The procedure grade determination method according to claim 1, further comprising, after obtaining the target automation grade of the current procedure according to the procedure data of the current procedure and the automation grade classifier:
sending the target automation grade to a surgical operation device, so that the surgical operation device performs the corresponding surgical action according to the target automation grade.
13. A procedure grade determination apparatus, comprising:
a training module configured to train an automation grade classifier of a procedure according to procedure data of historical procedures; and
a determination module configured to obtain a target automation grade of a current procedure according to procedure data of the current procedure and the automation grade classifier.
14. The procedure grade determination apparatus according to claim 13, wherein the training module comprises:
an acquisition unit configured to acquire target data corresponding to each functional operation from the procedure data of the historical procedures, and to acquire data features of the target data;
a division unit configured to divide the automation grade corresponding to each functional operation in the historical procedures to obtain an automation grade division result; and
a training unit configured to train the corresponding automation grade classifier according to the data features of the target data corresponding to each functional operation in the historical procedures and the automation grade division result.
15. A surgical robotic system, comprising a surgical operation device and the procedure grade determination apparatus according to claim 13 or 14, wherein the surgical operation device is communicatively connected to the procedure grade determination apparatus and is configured to perform the corresponding surgical procedure according to the target automation grade of the current procedure.
16. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program executable on the processor, and wherein the processor, when executing the computer program, implements the procedure grade determination method according to any one of claims 1 to 12.
17. A computer-readable storage medium having stored thereon a program which, when executed, performs the procedure grade determination method according to any one of claims 1 to 12.
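Claims 6 to 8 above describe a two-pass classification: a first automation grade is obtained from the features of all target data, a second from the second target data alone, and the lower-permission grade wins, with an error raised otherwise. The following Python sketch exercises that reconciliation logic only; the toy threshold "classifier", the numeric feature values, and all identifiers are illustrative assumptions and do not come from the patent's trained classifier.

```python
# Hypothetical sketch of the grade-comparison step in claims 7-8.
# The threshold rule stands in for the trained automation grade classifier.
from dataclasses import dataclass


@dataclass
class AutomationLevel:
    grade: int            # higher grade permits more types of surgical action
    allowed_actions: int  # number of action types allowed to run automatically


def classify(features, thresholds):
    """Toy classifier: average the feature scores and count how many
    thresholds the average meets; that count is the automation grade."""
    score = sum(features) / len(features)
    grade = sum(1 for t in thresholds if score >= t)
    return AutomationLevel(grade=grade, allowed_actions=grade + 1)


def target_level(first: AutomationLevel, second: AutomationLevel):
    """Claim 8: accept the first grade only if it permits no more action
    types than the second; otherwise signal an error."""
    if first.allowed_actions <= second.allowed_actions:
        return first
    raise ValueError("first automation grade exceeds second: error message")


# First grade from all target data, second from the second target data
# alone, per claim 7 (feature values are made up for illustration).
thresholds = [0.2, 0.5, 0.8]
first = classify([0.6, 0.7], thresholds)    # grade 2, allows 3 action types
second = classify([0.9, 0.8], thresholds)   # grade 3, allows 4 action types
result = target_level(first, second)        # 3 <= 4, so first grade wins
```

The trained classifier of the claims is replaced here by a fixed threshold rule purely so the comparison step of claim 8 can be run end to end.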
CN202111455886.5A 2021-12-01 2021-12-01 Operation grade determination method, device, system, equipment and medium Pending CN114266292A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111455886.5A CN114266292A (en) 2021-12-01 2021-12-01 Operation grade determination method, device, system, equipment and medium
PCT/CN2022/135879 WO2023098806A1 (en) 2021-12-01 2022-12-01 Level determining method and apparatus for surgery, system, device, and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111455886.5A CN114266292A (en) 2021-12-01 2021-12-01 Operation grade determination method, device, system, equipment and medium

Publications (1)

Publication Number Publication Date
CN114266292A true CN114266292A (en) 2022-04-01

Family

ID=80826303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111455886.5A Pending CN114266292A (en) 2021-12-01 2021-12-01 Operation grade determination method, device, system, equipment and medium

Country Status (2)

Country Link
CN (1) CN114266292A (en)
WO (1) WO2023098806A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023098806A1 (en) * 2021-12-01 2023-06-08 上海微创医疗机器人(集团)股份有限公司 Level determining method and apparatus for surgery, system, device, and medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101806195B1 (en) * 2012-07-10 2018-01-11 큐렉소 주식회사 Surgical Robot System and Method for Controlling Surgical Robot
CN110163251A (en) * 2019-04-15 2019-08-23 深圳市中电数通智慧安全科技股份有限公司 A kind of Optimum Identification Method of fire hazard rating, device and terminal device
CN112289447B (en) * 2020-10-30 2022-03-08 四川大学华西医院 Surgical incision healing grade discrimination system
CN112712878B (en) * 2020-12-30 2024-09-06 四川桑瑞思环境技术工程有限公司 Digital operating room system and control method
CN113378968A (en) * 2021-06-28 2021-09-10 北京辰安科技股份有限公司 Building fire classification method and device based on K-means clustering
CN114266292A (en) * 2021-12-01 2022-04-01 上海微创医疗机器人(集团)股份有限公司 Operation grade determination method, device, system, equipment and medium

Also Published As

Publication number Publication date
WO2023098806A1 (en) 2023-06-08

Similar Documents

Publication Publication Date Title
US20210157403A1 (en) Operating room and surgical site awareness
US12114949B2 (en) Surgical system with training or assist functions
US12016644B2 (en) Artificial intelligence guidance system for robotic surgery
CN107847283B (en) Configuring a surgical system using a surgical procedure atlas
Jacob et al. Gestonurse: a robotic surgical nurse for handling surgical instruments in the operating room
CN110062608A (en) Remote operation surgery systems with the positioning based on scanning
CN109996509A Remote operation surgery systems with the instrument control based on surgeon's level of skill
JP2024521719A (en) Simulation-Based Surgical Analysis System
CN114266292A (en) Operation grade determination method, device, system, equipment and medium
US20230263587A1 (en) Systems and methods for predicting and preventing bleeding and other adverse events
US20220395334A1 (en) Systems and methods for guiding surgical procedures
JP2024521110A (en) Instruction-driven, simulation-based surgical development system
CN113925607B (en) Operation robot operation training method, device, system, medium and equipment
WO2024020223A1 (en) Changing mode of operation of an instrument based on gesture detection
CN118902618A (en) Surgical operation machine system based on artificial intelligence and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination