CN113889224B - Training of operation prediction model and operation indication method - Google Patents

Training of operation prediction model and operation indication method

Info

Publication number
CN113889224B
CN113889224B (application CN202111484118.2A)
Authority
CN
China
Prior art keywords
surgical
historical
condition
training
scheme
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111484118.2A
Other languages
Chinese (zh)
Other versions
CN113889224A (en)
Inventor
张瑞康
方翔
王伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Kangduo Robot Co ltd
Original Assignee
Suzhou Kangduo Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Kangduo Robot Co ltd filed Critical Suzhou Kangduo Robot Co ltd
Priority to CN202111484118.2A priority Critical patent/CN113889224B/en
Publication of CN113889224A publication Critical patent/CN113889224A/en
Application granted granted Critical
Publication of CN113889224B publication Critical patent/CN113889224B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Pathology (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Urology & Nephrology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Instructional Devices (AREA)

Abstract

The invention provides a training method for a surgical operation estimation model and a surgical operation indication method. The training method comprises the following steps: acquiring historical surgical operation schemes; screening the historical surgical operation schemes according to a preset rule, wherein the preset rule is generated according to at least one of surgical operation time, surgical bleeding amount, wound size and postoperative rehabilitation time; acquiring historical surgical environment conditions matched with the screened historical surgical operation schemes; and training an initial model according to the screened historical surgical operation schemes and the matched historical surgical environment conditions to obtain the surgical operation estimation model. The invention provides doctors with a better-founded surgical operation scheme, makes the surgical process more reasonable, and helps achieve better surgical results.

Description

Training of operation prediction model and operation indication method
Technical Field
The invention relates to the technical field of medical equipment, and in particular to a training method for a surgical operation prediction model and a surgical operation indication method.
Background
Surgical robots are now in wide clinical use. For example, by manipulating the control system in a doctor console, a doctor controls the endoscope system of a surgical robot to observe a diseased part inside the patient and performs surgery on that part through a mechanical arm system and distal end instruments. Performing surgery with a surgical robot is therefore of great help for minimally invasive and precise operation, for reducing the number of operating personnel, and for reducing the working strength of surgeons. However, in controlling a surgical robot the surgeon generally operates the master hand to drive the slave manipulator and the distal end instruments to operate on the patient, so the actual operation process and the final surgical effect are unstable, varying with each surgeon's experience and proficiency in operating the surgical robot.
Disclosure of Invention
The invention aims to solve, at least to some extent, the technical problems in the related art. To this end, the invention provides a training method for a surgical operation estimation model, comprising the following steps:
acquiring a historical surgical operation scheme;
screening the historical operation scheme according to a preset rule, wherein the preset rule is generated according to at least one of operation time, operation bleeding amount, wound size and postoperative rehabilitation time;
acquiring historical surgical environment conditions matched with the screened historical surgical operation schemes;
training the initial model according to the screened historical operation scheme and the screened historical operation environment condition to obtain an operation estimation model.
According to the training method of the surgical operation estimation model, the surgical operation estimation model is obtained by training an initial model on historical surgical operation schemes and historical surgical environment conditions. With the trained model, when the surgical operation is estimated in real time, the surgical environment condition obtained in real time can be input into the model, and a real-time surgical operation is given for the doctor's reference. The historical surgical operation schemes used for training are screened schemes: they are screened according to a preset rule, which is formed from one or more of surgical operation time, surgical bleeding amount, wound size and postoperative rehabilitation time, quantities that characterize the surgical process and/or its result. The screened historical surgical operation schemes therefore support better model training, and the trained model can in turn estimate a more reasonable surgical operation scheme.
Further, the preset rule comprises a surgical evaluation function; the screening the historical surgical operation scheme according to the preset rule comprises the following steps:
scoring the historical surgical procedure plan according to the surgical evaluation function;
and screening out the historical surgical operation scheme of which the score meets the preset score condition.
Further, the surgical evaluation function includes:
F(T_ope, V_ble, L_wou) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou), and/or
F(T_ope, V_ble, L_wou, T_recov) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou + C_recov*T_recov);
wherein F(T_ope, V_ble, L_wou) and F(T_ope, V_ble, L_wou, T_recov) represent the score, T_ope represents the surgical operation time, V_ble represents the surgical bleeding amount, L_wou represents the wound size, T_recov represents the postoperative rehabilitation time, and C_t, C_v, C_L and C_recov represent weight coefficients.
Further, the obtaining of the historical surgical environment condition matched with the filtered historical surgical operation scheme includes:
and performing cluster analysis on a plurality of historical surgical environment conditions and a plurality of screened historical surgical operation schemes, and matching the historical surgical environment conditions with the historical surgical operation schemes, wherein each historical surgical environment condition is matched with a plurality of screened historical surgical operation schemes, and each screened historical surgical operation scheme is matched with a plurality of historical surgical environment conditions.
The invention also provides a surgical operation indicating method, which comprises the following steps:
acquiring the condition of the operation environment;
and inputting the current surgical environment condition into a surgical operation estimation model to obtain the current surgical operation scheme for indicating the surgical operation, wherein the surgical operation estimation model is obtained by training with the above training method of the surgical operation estimation model.
The surgical operation indication method is an application of the surgical operation estimation model obtained with the above training method: during a doctor's surgery, the current surgical operation scheme that can guide the doctor's operation is output according to the surgical operation estimation model and the current surgical environment condition. Its beneficial effects are similar to those of the training method of the surgical operation estimation model and are not repeated here.
Further, the surgical operation indication method further includes:
acquiring a real-time operation condition and acquiring a preset contrast operation condition corresponding to the current operation environment condition;
and generating early warning information according to the comparison condition of the real-time operation condition and the preset comparison operation condition.
Further, the real-time surgical condition includes a real-time surgical operation, and the preset contrast surgical condition includes a low-score operation scheme, wherein the low-score operation scheme is obtained according to a score of a surgical evaluation function.
Further, the surgical operation indication method further includes:
generating display information for displaying a picture of the surgical environment and/or a phantom of the patient, and
generating indication information according to the operation scheme, wherein the indication information is suitable for indicating operation at the operation environment picture and/or the patient human body model.
The invention also provides a surgical robot comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the above training method of the surgical operation estimation model and/or the above surgical operation indication method.
The surgical robot in the present invention has a similar technical effect to the training method of the surgical operation prediction model and the surgical operation indication method, and is not described herein again.
Further, the surgical robot further comprises a position sensor, the position sensor is arranged at a joint of the surgical robot, and the position sensor is used for generating a historical surgical operation scheme in a training method of the surgical operation estimation model and/or generating a real-time surgical operation in the surgical operation indication method.
Drawings
Fig. 1 is a schematic flow chart of a training method of a surgical operation prediction model and a surgical operation indication method in an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
It is noted that the terms first, second and the like in the description and in the claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein.
In the description herein, references to the terms "an embodiment," "one embodiment," and "one implementation," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or implementation is included in at least one embodiment or example implementation of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or implementation. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or implementations.
Referring to fig. 1, an embodiment of the present invention provides a method for training a surgical operation prediction model, including the steps of:
s1, obtaining a historical operation scheme;
s2, screening the historical operation scheme according to a preset rule, wherein the preset rule is generated according to at least one of operation time, operation bleeding amount, wound size and postoperative rehabilitation time;
s3, acquiring historical surgical environment conditions matched with the screened historical surgical operation schemes;
and S4, training the initial model according to the screened historical operation scheme and the screened historical operation environment condition to obtain an operation estimation model.
The training method of the surgical operation prediction model in the embodiment of the invention is used to train the surgical operation prediction model, so that the trained model can estimate a surgical operation scheme during surgery and that scheme can guide the doctor's surgical operation. The surgical operation prediction model is obtained by training an initial model on historical surgical operation schemes and historical surgical environment conditions. It can therefore be understood that, when a surgical operation is estimated in real time with the trained model, the real-time surgical environment condition can be input into the model to give a real-time surgical operation for the doctor's reference. Because the model is trained and verified on a large amount of historical data, it can give a more accurate and uniform surgical operation scheme for a given surgical environment, so that when the doctor operates according to this scheme, the whole surgical process is more stable and a better surgical effect is obtained. The historical surgical operation schemes used for training are screened schemes; specifically, they can be screened according to a preset rule. For example, during the collection of historical surgical operation schemes, different schemes under the same pathological condition may correspond to different surgical processes and surgical results. In the embodiment of the invention, the preset rule is therefore formed from one or more of surgical operation time, surgical bleeding amount, wound size and postoperative rehabilitation time, quantities that characterize the surgical process and/or its result, and the historical surgical operation schemes are screened accordingly. It can be understood that this screening removes relatively poor surgical operation schemes, so the screened historical surgical operation schemes support better model training, and the trained model can estimate a more reasonable surgical operation scheme.
It can be understood that training the model requires two sets of training data matched with each other. In the embodiment of the invention these are the screened historical surgical operation schemes and the historical surgical environment conditions matched with them; for example, a historical surgical environment condition may be a historical lesion condition. When the model is used, the surgical operation scheme corresponding to the current surgical environment condition can be obtained from the real-time surgical environment condition.
The training method of the surgical operation estimation model and the subsequent use of the model in the embodiment of the invention can be applied to a surgical robot. Accordingly, the surgical operation data in a historical surgical operation scheme can be motion data of the surgical robot obtained while a doctor operates it, for example operation data extracted and processed from real-time operation pictures captured by an image sensing device, or operation data obtained by processing position information of the surgical robot's mechanical arm detected by sensors arranged on the robot. In some application scenarios the doctor may also need to perform part of the surgical operation directly, in addition to using the surgical robot, so the operation data can also be a combination of surgical robot motion data and surgeon motion data, such as a combination of operation data formed from the image data and the position data described above.
The historical surgical environment condition used for training the model, and the surgical environment condition used when applying the model, can be lesion features. For a historical surgical environment condition, for example, a historical endoscopic lesion image of the surgical site is obtained with the endoscope, and feature extraction is performed on this image to obtain the historical surgical environment condition, which can be one or more of the position, size and shape features of the lesion.
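As an illustration of the feature-extraction step above, the following is a minimal Python sketch that derives position, size and shape features from a binary lesion mask, assuming such a mask has already been obtained from the endoscopic image; the patent does not specify a segmentation method, and the OpenCV-based feature set here is only one possible choice.

```python
import cv2
import numpy as np

def lesion_features(mask: np.ndarray) -> dict:
    """Extract position, size and shape features from a binary lesion mask.

    `mask` is a uint8 image where lesion pixels are non-zero; how the mask is
    produced (e.g. by a segmentation network) is outside the scope of this sketch.
    """
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {}
    c = max(contours, key=cv2.contourArea)           # largest lesion region
    area = cv2.contourArea(c)                        # size feature
    x, y, w, h = cv2.boundingRect(c)                 # position feature (bounding box)
    perimeter = cv2.arcLength(c, True)
    circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-6)  # shape feature
    return {
        "center": (x + w / 2, y + h / 2),
        "area": area,
        "aspect_ratio": w / max(h, 1),
        "circularity": circularity,
    }
```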
Before the surgical operation prediction model is trained, a database of historical surgical operation schemes and historical surgical environment conditions can be built to collect and store them, classify them, and match them with each other. When model training is performed, the data in the database are screened and selected, which makes obtaining the historical surgical operation schemes and historical surgical environment conditions more convenient and accurate and facilitates model training.
The surgical operation prediction model in the embodiment of the invention can be a model based on a neural network. The training process can be: input the historical surgical environment conditions into the initial model to obtain predicted surgical operation schemes; determine the value of a loss function from the predicted schemes and the historical surgical operation schemes corresponding to those environment conditions; and adjust the parameters of the initial model according to this value until convergence, completing training of the initial model and yielding the surgical operation prediction model.
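The following is a minimal PyTorch-style sketch of the training loop just described, assuming the surgical environment conditions and operation schemes have been encoded as fixed-length feature vectors; the network architecture, loss function and hyperparameters are illustrative assumptions, not part of the patent.

```python
import torch
from torch import nn

class OperationPredictor(nn.Module):
    """Illustrative initial model: maps an encoded surgical environment
    condition to an encoded surgical operation scheme."""
    def __init__(self, env_dim: int, plan_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(env_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, plan_dim),
        )

    def forward(self, env):
        return self.net(env)

def train(model, loader, epochs: int = 50, lr: float = 1e-3):
    """`loader` yields (env_condition, historical_plan) pairs from the screened data."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()            # loss between predicted and historical scheme
    for _ in range(epochs):
        for env, plan in loader:
            pred = model(env)
            loss = loss_fn(pred, plan)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()          # adjust parameters until convergence
    return model
```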
In an optional embodiment of the present invention, the preset rule comprises a surgical evaluation function; the screening the historical surgical operation scheme according to the preset rule comprises the following steps:
scoring the historical surgical procedure plan according to the surgical evaluation function;
and screening out historical surgical operation schemes with scores meeting preset score conditions.
In this embodiment, the preset rule can be a surgical evaluation function capable of accurately measuring the surgical process and the surgical result. The historical surgical operation schemes are scored uniformly according to this function and then screened: specifically, the score of each historical surgical operation scheme is compared with the preset score condition. It can be understood that poor or unsuitable historical surgical operation schemes are screened out, while schemes that yield better surgical results or are easy to perform are retained. By establishing a uniform surgical evaluation function and screening the historical surgical operation schemes by score, more reasonable model training is made possible.
The preset score condition can be a score threshold: a score below the threshold means the surgical process or the surgical result was poor, so that part of the historical surgical operation schemes is screened out, avoiding surgical accidents caused by a poor scheme influencing model training. In other embodiments, the preset score condition can be a preset score range. For example, a historical surgical operation whose score is below the lower limit of the range is a poor operation, while an operation whose score is above the upper limit may bring a better surgical process and result but be too demanding to perform, so it is also screened out. A relatively suitable set of historical surgical operation schemes is thus obtained, and when the trained model is used to indicate the doctor's surgical operation, the indicated scheme is also a more general one that most doctors can conveniently perform.
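A minimal sketch of this screening step, assuming each historical scheme can be scored by a `score_fn` (such as the evaluation function given below); the threshold and range semantics follow the description above, while the data representation is an assumption.

```python
def screen_plans(plans, score_fn, low, high=None):
    """Keep historical schemes whose evaluation score satisfies the preset condition.

    With only `low`, schemes scoring below the threshold are discarded;
    with both bounds, schemes outside the [low, high] range are discarded.
    """
    kept = []
    for plan in plans:
        s = score_fn(plan)
        if s < low:
            continue                      # poor surgical process or result
        if high is not None and s > high:
            continue                      # too demanding to perform, also discarded
        kept.append(plan)
    return kept
```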
In an alternative embodiment of the present invention, the surgical evaluation function includes:
F(T_ope, V_ble, L_wou) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou), and/or
F(T_ope, V_ble, L_wou, T_recov) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou + C_recov*T_recov);
wherein F(T_ope, V_ble, L_wou) and F(T_ope, V_ble, L_wou, T_recov) represent the score, T_ope represents the surgical operation time, V_ble represents the surgical bleeding amount, L_wou represents the wound size, T_recov represents the postoperative rehabilitation time, and C_t, C_v, C_L and C_recov represent weight coefficients.
In this embodiment the surgical evaluation function can be a function of the intraoperative bleeding amount, the wound size and the operation time, a function that additionally includes the rehabilitation time, or a combined comparison and judgment using both evaluation functions, so that it better evaluates historical operations by considering both the surgical process and the surgical result. For example, when one evaluation function is used, a higher score means a better historical surgical operation scheme that meets the model training requirements, and a lower score means the scheme is screened out. When the two evaluation functions are combined, a scheme is screened out when both scores are below a certain value, or retained when both are above a certain value, so that the schemes meeting the model training requirements are obtained, the training effect is better, and a more reasonable model estimation result can be obtained.
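Written out as code, the two evaluation functions and one possible way of combining them might look as follows; the dictionary field names and the use of a single common threshold are assumptions made for illustration, and the weight coefficients are left as inputs since the patent does not fix their values.

```python
def score_basic(t_ope, v_ble, l_wou, c_t, c_v, c_l):
    """F(T_ope, V_ble, L_wou): higher score = shorter operation, less bleeding, smaller wound."""
    return 1.0 / (c_t * t_ope + c_v * v_ble + c_l * l_wou)

def score_with_recovery(t_ope, v_ble, l_wou, t_recov, c_t, c_v, c_l, c_recov):
    """F(T_ope, V_ble, L_wou, T_recov): additionally penalises long postoperative recovery."""
    return 1.0 / (c_t * t_ope + c_v * v_ble + c_l * l_wou + c_recov * t_recov)

def keep_plan(plan, weights, threshold):
    """Combined use of both functions: retain a scheme only if both scores clear the threshold."""
    s1 = score_basic(plan["t_ope"], plan["v_ble"], plan["l_wou"],
                     weights["c_t"], weights["c_v"], weights["c_l"])
    s2 = score_with_recovery(plan["t_ope"], plan["v_ble"], plan["l_wou"], plan["t_recov"],
                             weights["c_t"], weights["c_v"], weights["c_l"], weights["c_recov"])
    return s1 >= threshold and s2 >= threshold
```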
In an optional embodiment of the present invention, the obtaining of the historical surgical environment condition matched with the filtered historical surgical operation scheme includes:
performing cluster analysis on a plurality of historical surgical environment conditions and a plurality of filtered historical surgical operation schemes, and matching the historical surgical environment conditions with the historical surgical operation schemes, wherein each historical surgical environment condition is matched with a plurality of filtered historical surgical operation schemes, and each filtered historical surgical operation scheme is matched with a plurality of historical surgical environment conditions.
A historical surgical operation scheme can be the whole process of a surgery, or a process unit obtained by splitting the surgery according to the actual situation. Even when the result of a surgery is good, this does not necessarily mean that every part of the process was performed well. This embodiment therefore uses a finer process subdivision: cluster analysis is performed on the historical surgical environment conditions and the screened historical surgical operation schemes, and the schemes are matched with the environment conditions, so that a better training set for the model is obtained and the model is trained on it.
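A minimal sketch of the cluster-based many-to-many matching described above, using scikit-learn's KMeans; embedding environment conditions and operation schemes into a shared feature space, and the number of clusters, are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def match_by_clustering(env_features, plan_features, n_clusters=10):
    """Cluster environment conditions and screened operation schemes in a shared
    feature space and match them cluster-wise (many-to-many).

    `env_features` and `plan_features` are arrays of shape (n_env, d) and (n_plan, d);
    how the two are embedded into the same d-dimensional space is an assumption here.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    km.fit(np.vstack([env_features, plan_features]))
    env_labels = km.predict(env_features)
    plan_labels = km.predict(plan_features)

    pairs = []
    for c in range(n_clusters):
        envs = np.where(env_labels == c)[0]
        plans = np.where(plan_labels == c)[0]
        # each environment condition in the cluster is matched with every scheme in it
        pairs.extend((e, p) for e in envs for p in plans)
    return pairs
```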
Referring to fig. 1, a method for indicating a surgical operation according to another embodiment of the present invention includes the steps of:
s5, obtaining the condition of the operation environment;
and S6, inputting the condition of the operation environment into an operation estimation model to obtain an operation scheme of the operation for indicating the operation, wherein the operation estimation model is obtained by training by adopting the training method of the operation estimation model.
The surgical operation indication method of the invention is an application of the surgical operation estimation model obtained with the above training method. During a doctor's surgery, the model input, namely the current surgical environment condition, can be obtained from equipment such as an endoscope, and the current surgical operation scheme that can guide the doctor's operation is then output by the surgical operation estimation model. The beneficial effects of the surgical operation indication method in this embodiment are similar to those of the training method of the surgical operation estimation model and are not repeated here.
The surgical operation scheme can be presented visually or audibly as appropriate, for example output through a display and an audio device, so that the doctor can perform the surgical operation according to the prompt information.
In an alternative embodiment of the present invention, after obtaining the surgical procedure plan, display information for displaying a surgical environment picture and/or a patient phantom may be generated, and
generating indication information according to the operation scheme, wherein the indication information is suitable for indicating operation at the operation environment picture and/or the patient human body model.
In this embodiment, when the doctor operates according to the surgical operation scheme, a surgical environment picture and a patient body model can be generated as the picture background for indicating the scheme, and the generated scheme is converted into indication information suitable for visual display on that background, for example operation animations and text describing the planned path and the sequence of operations, so that the doctor is guided more precisely and the operation is easier to perform.
The operation environment picture and the patient human body model can be obtained by three-dimensional modeling based on the medical image of the patient.
In an optional embodiment of the present invention, the surgical procedure indication method further comprises the steps of:
acquiring a real-time operation condition and acquiring a preset contrast operation condition corresponding to the operation environment condition;
and generating early warning information according to the comparison condition of the real-time operation condition and the preset comparison operation condition.
In this embodiment, the real-time surgical condition can be obtained while the doctor operates, for example sensor position information from the position sensors of the surgical robot and real-time endoscope pictures from which the real-time surgical operation can be extracted. The real-time surgical condition is compared with the preset contrast surgical condition, and if a problem with the current operation is determined, the doctor is warned, making it easy to stop or change the operation and reducing the surgical risk. In related embodiments, the real-time surgical condition can also be the disease condition after a surgical operation in the current procedure, and the corresponding preset contrast surgical condition can be a disease condition set in advance.
The preset contrast surgical condition corresponding to the surgical environment condition may be a better or a poorer operation under that environment condition, used for comparison with the real-time operation. In this embodiment the preset contrast surgical condition includes a low-score operation scheme, i.e. a historical surgical operation scheme evaluated with a low score, obtained according to the score of the surgical evaluation function, which includes:
F(T_ope, V_ble, L_wou) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou), and/or
F(T_ope, V_ble, L_wou, T_recov) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou + C_recov*T_recov);
wherein F(T_ope, V_ble, L_wou) and F(T_ope, V_ble, L_wou, T_recov) represent the score, T_ope represents the surgical operation time, V_ble represents the surgical bleeding amount, L_wou represents the wound size, T_recov represents the postoperative rehabilitation time, and C_t, C_v, C_L and C_recov represent weight coefficients.
In the embodiment of the invention, when the historical surgical operation schemes are screened, the schemes with lower scores according to the surgical evaluation function can also be screened out and retained separately; these schemes likewise correspond to surgical environment conditions. When the current surgical environment condition is determined, the corresponding low-score historical surgical operation schemes can be found by feature analysis and used as the preset contrast surgical condition to be compared with the real-time surgical operation: if the real-time surgical operation is judged to be highly similar to a low-score historical surgical operation scheme, an early warning is output to the doctor.
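A minimal sketch of this early-warning comparison, assuming the real-time operation and the low-score historical schemes are encoded as feature vectors; cosine similarity and the 0.9 threshold are illustrative choices rather than values given in the patent.

```python
import numpy as np

def should_warn(realtime_op, low_score_plans, threshold=0.9):
    """Raise an early warning when the real-time operation resembles any
    low-score historical scheme for the current surgical environment."""
    v = np.asarray(realtime_op, dtype=float)
    for plan in low_score_plans:
        p = np.asarray(plan, dtype=float)
        sim = float(v @ p / (np.linalg.norm(v) * np.linalg.norm(p) + 1e-9))
        if sim >= threshold:
            return True                   # too similar to a poorly rated operation
    return False
```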
In another embodiment of the present invention, a surgical robot includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements a training method of a surgical operation prediction model as described above, and/or a surgical operation indication method as described above.
The surgical robot in this embodiment has a similar technical effect to the training method of the surgical operation prediction model and the surgical operation indication method, and will not be described herein again.
Further, the surgical robot further comprises a position sensor, the position sensor is arranged at a joint of the surgical robot, and the position sensor is used for generating a historical surgical operation scheme in a training method of the surgical operation estimation model and/or generating a real-time surgical operation in the surgical operation indication method.
In this embodiment, each joint of the surgical robot is provided with a position sensor to collect information such as coordinates and angles during the robot's motion. From this information the motion trajectory of the tip of the surgical robot's instrument can be generated, yielding surgical operation data such as the historical surgical operation schemes in the training method of the surgical operation estimation model and the real-time surgical operation in the surgical operation indication method, so that model training and surgical operation indication can be realized conveniently and quickly.
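To illustrate how joint position readings become an instrument-tip trajectory, here is a toy forward-kinematics sketch for a planar serial arm; a real surgical robot would use its own kinematic model (for example DH parameters), so this is only a simplified illustration.

```python
import numpy as np

def planar_forward_kinematics(joint_angles, link_lengths):
    """Toy forward kinematics for a planar serial arm: joint angles (from the
    position sensors at each joint) to the instrument-tip position."""
    x = y = 0.0
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle
        x += length * np.cos(theta)
        y += length * np.sin(theta)
    return x, y

def tip_trajectory(joint_log, link_lengths):
    """Convert a time series of joint readings into the instrument-tip trajectory."""
    return [planar_forward_kinematics(sample, link_lengths) for sample in joint_log]
```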
Although the present disclosure has been described above, the scope of the present disclosure is not limited thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the spirit and scope of the present disclosure, and these changes and modifications are intended to be within the scope of the present disclosure.

Claims (8)

1. A training method of a surgical operation prediction model is characterized by comprising the following steps:
acquiring a historical surgical operation scheme;
screening the historical operation scheme according to a preset rule, wherein the preset rule is generated according to at least one of operation time, operation bleeding amount, wound size and postoperative rehabilitation time;
acquiring historical surgical environment conditions matched with the screened historical surgical operation schemes;
training the initial model according to the screened historical operation scheme and the screened historical operation environment condition to obtain an operation estimation model;
the preset rule comprises a surgery evaluation function; the screening the historical surgical operation scheme according to the preset rule comprises the following steps:
scoring the historical surgical procedure plan according to the surgical evaluation function;
screening out the historical surgical operation schemes whose scores meet a preset score condition, comprising: removing the historical surgical operation schemes whose scores are lower than the lower limit of a preset score range, and removing the historical surgical operation schemes whose scores are higher than the upper limit of the preset score range;
the surgical evaluation function includes:
F(T_ope, V_ble, L_wou) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou), and
F(T_ope, V_ble, L_wou, T_recov) = 1 / (C_t*T_ope + C_v*V_ble + C_L*L_wou + C_recov*T_recov);
wherein F(T_ope, V_ble, L_wou) and F(T_ope, V_ble, L_wou, T_recov) represent the score, T_ope represents the surgical operation time, V_ble represents the surgical bleeding amount, L_wou represents the wound size, T_recov represents the postoperative rehabilitation time, and C_t, C_v, C_L and C_recov represent weight coefficients.
2. The method for training the surgical operation prediction model according to claim 1, wherein the obtaining the historical surgical environment condition matched with the filtered historical surgical operation scheme comprises:
performing cluster analysis on a plurality of historical surgical environment conditions and a plurality of filtered historical surgical operation schemes, and matching the historical surgical environment conditions with the historical surgical operation schemes, wherein each historical surgical environment condition is matched with a plurality of filtered historical surgical operation schemes, and each filtered historical surgical operation scheme is matched with a plurality of historical surgical environment conditions.
3. A surgical procedure indication method, comprising:
acquiring the condition of the operation environment;
inputting the condition of the operation environment into an operation estimation model to obtain an operation scheme for indicating the operation, wherein the operation estimation model is obtained by training with the training method of the operation estimation model as claimed in claim 1 or 2.
4. The surgical operation indication method according to claim 3, further comprising:
acquiring a real-time operation condition and acquiring a preset contrast operation condition corresponding to the operation environment condition;
and generating early warning information according to the comparison condition of the real-time operation condition and the preset comparison operation condition.
5. The method of claim 4, wherein the real-time surgical condition comprises a real-time surgical procedure, and the predetermined contrast surgical condition comprises a low-score surgical plan, wherein the low-score surgical plan is scored according to a surgical evaluation function.
6. The surgical procedure indication method according to any one of claims 3 to 5, further comprising:
generating display information for displaying a picture of the surgical environment and/or a phantom of the patient, and
generating indication information according to the operation scheme, wherein the indication information is suitable for indicating operation at the operation environment picture and/or the patient human body model.
7. A surgical robot comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program when executed by the processor implementing a method of training a surgical procedure prediction model as claimed in claim 1 or 2 and/or a method of indicating a surgical procedure as claimed in any one of claims 3 to 6.
8. A surgical robot as claimed in claim 7, further comprising position sensors disposed at joints of the surgical robot for generating historical surgical procedure plans in the training method of the surgical procedure prediction model and/or for generating real-time surgical procedures in the surgical procedure indication method.
CN202111484118.2A 2021-12-07 2021-12-07 Training of operation prediction model and operation indication method Active CN113889224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111484118.2A CN113889224B (en) 2021-12-07 2021-12-07 Training of operation prediction model and operation indication method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111484118.2A CN113889224B (en) 2021-12-07 2021-12-07 Training of operation prediction model and operation indication method

Publications (2)

Publication Number Publication Date
CN113889224A CN113889224A (en) 2022-01-04
CN113889224B 2022-10-21

Family

ID=79015822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111484118.2A Active CN113889224B (en) 2021-12-07 2021-12-07 Training of operation prediction model and operation indication method

Country Status (1)

Country Link
CN (1) CN113889224B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112967819A (en) * 2021-03-08 2021-06-15 樊学海 Preoperative evaluation method and system for neurosurgery

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320645B (en) * 2018-01-19 2020-02-07 中南大学湘雅二医院 Medical simulation training method
CN108682456B (en) * 2018-03-27 2020-12-22 青岛市市立医院 Operation simulation training method based on virtual reality technology
CN108777173A (en) * 2018-06-05 2018-11-09 四川大学 A kind of harelip operation clinic auxiliary system
CN110415826A (en) * 2019-06-17 2019-11-05 水瓶屿(上海)智能科技有限公司 A kind of operation and quality of anesthesia assessment based on perioperative big data and instruct system
EP3986315A1 (en) * 2019-07-15 2022-04-27 Surgical Theater, Inc. System and method for recommending parameters for a surgical procedure
CN110444264B (en) * 2019-08-16 2023-04-14 宜昌市中心人民医院 DRGs-based medical record homepage intelligent filling and training system
CN113160681A (en) * 2021-05-08 2021-07-23 福州大学 AR technology-based dental implantation surgery training system and method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112967819A (en) * 2021-03-08 2021-06-15 樊学海 Preoperative evaluation method and system for neurosurgery

Also Published As

Publication number Publication date
CN113889224A (en) 2022-01-04

Similar Documents

Publication Publication Date Title
US11596483B2 (en) Motion execution of a robotic system
US11819188B2 (en) Machine-learning-based visual-haptic system for robotic surgical platforms
US20210345893A1 (en) Indicator system
KR20180068336A (en) Surgical system with training or auxiliary functions
KR20150004726A (en) System and method for the evaluation of or improvement of minimally invasive surgery skills
KR20180006939A (en) Surgical Procedure Composition of a surgical system with atlas
JP2020034849A (en) Work support device, work support method, and work support program
CN112043397B (en) Surgical robot and motion error detection method and detection device thereof
WO2020213484A1 (en) Surgery evaluation system
EP3414753A1 (en) Autonomic goals-based training and assessment system for laparoscopic surgery
CN114760903A (en) Method, apparatus, and system for controlling an image capture device during a surgical procedure
CN106156398A (en) For the operating equipment of area of computer aided simulation and method
CN112602157A (en) Hybrid simulation model for simulating medical procedures
JP7194889B2 (en) Computer program, learning model generation method, surgery support device, and information processing method
Long et al. Integrating artificial intelligence and augmented reality in robotic surgery: An initial dvrk study using a surgical education scenario
CN113889224B (en) Training of operation prediction model and operation indication method
Kil et al. Surgical suturing with depth constraints: Image-based metrics to assess skill
Batmaz Speed, precision and grip force analysis of human manual operations with and without direct visual input
Nicolaou et al. A Study of saccade transition for attention segregation and task strategy in laparoscopic surgery
Weede et al. Movement analysis for surgical skill assessment and measurement of ergonomic conditions
CN113925607B (en) Operation robot operation training method, device, system, medium and equipment
Zirino Vision-based deep reinforcement learning for autonomous target reaching in minimally invasive robotic surgery
Karimyan et al. Spatial awareness in Natural Orifice Transluminal Endoscopic Surgery (NOTES) navigation
Sun et al. Haptic modeling of stomach for real-time property and force estimation
Rahbar Visual Intelligence for Robotic and Laparoscopic Surgery: A Real-Time System for Bleeding Detection and Prediction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant