CN117218922A - Auxiliary training and evaluating method and device for interventional operation robot


Info

Publication number
CN117218922A
Authority
CN
China
Prior art keywords: user, training, interventional, robot, target
Prior art date
Legal status
Granted
Application number
CN202311475395.6A
Other languages
Chinese (zh)
Other versions
CN117218922B (en)
Inventor
黄韬
解菁
杨贺
Current Assignee
Beijing Wemed Medical Equipment Co Ltd
Original Assignee
Beijing Wemed Medical Equipment Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Wemed Medical Equipment Co Ltd
Priority to CN202311475395.6A
Publication of CN117218922A
Application granted
Publication of CN117218922B
Active (current legal status)
Anticipated expiration


Landscapes

  • Instructional Devices (AREA)

Abstract

The application provides an auxiliary training and evaluation method and device for an interventional operation robot. The auxiliary training and evaluation method is used in cooperation with a user operating part and the interventional operation robot. The method includes, with at least one processor: receiving user identification information, a target interventional operation robot, and a target interventional operation type input by a user; prompting the user to select a training mode; presenting corresponding training content according to the target interventional operation robot and the target interventional operation type, and guiding the user to perform training operations; collecting the user's training operation data and providing training feedback information to the user according to that data; receiving the level to be authenticated input by the user; and authenticating the user's level according to the user's comprehensive evaluation operation data. The method can simultaneously meet the user's need for systematic training in operating the interventional operation robot and for assessment of operating capability.

Description

Auxiliary training and evaluating method and device for interventional operation robot
Technical Field
The application relates to the technical field of auxiliary training and evaluation of interventional operation robots, in particular to an auxiliary training and evaluation method and device of an interventional operation robot.
Background
Minimally invasive interventional therapy is a primary treatment for cardiovascular and cerebrovascular diseases. Compared with traditional open surgery, it has clear advantages such as a small incision and a short postoperative recovery time. In a cardiovascular or cerebrovascular interventional procedure, a doctor manually delivers instruments such as a catheter, a guide wire, and a stent into the patient to carry out treatment.
Interventional procedures present two problems. First, because the DSA system emits X-rays during the procedure, the doctor's physical strength declines rapidly, attention and stability also decrease, and operating precision drops; accidents caused by improper pushing force, such as injury to the vascular intima or vessel perforation and rupture, are then likely to occur and endanger the patient's life. Second, the long-term accumulation of ionizing-radiation injury greatly increases a doctor's chances of developing leukemia, cancer, and acute cataracts. The continuous radiation exposure that doctors accumulate from interventional operations has become a non-negligible problem that harms their professional careers and restricts the development of interventional surgery.
Robotic technology can effectively address these problems: it can greatly improve the accuracy and stability of the operation while effectively reducing the radiation exposure of the interventional doctor and the probability of intraoperative accidents. However, the interventional robot is still operated manually, so a doctor must be trained before clinical use and needs a certain amount of operating experience and skill; otherwise operating errors, and hence procedure failure, can easily occur. Before operating the robot, a doctor should therefore pass certain examinations to obtain an operating license, so that surgical safety is ensured.
There are several problems in the training and evaluation of interventional surgical robots in China: (1) no dedicated training system for interventional surgical robots has been established, so doctors can only practice on their own and lack a complete, systematic learning process; (2) when doctors use an interventional robot in clinical procedures, they are often insufficiently trained, which reduces procedure efficiency; (3) during training on the interventional surgical robot, doctors lack guidance and correct standards, so training efficiency is low; (4) there are no assessment and evaluation criteria for robot operating skill, so doctors cannot judge their own operating level; (5) there is no correspondence between the classification of operating levels and the access opened to the robot's function modules.
Disclosure of Invention
In view of the above technical problems in the prior art, the application provides an auxiliary training and evaluation method and device for an interventional operation robot, which can solve the current problems of lacking a dedicated training system for interventional operation robots, users being unskilled in operating the interventional operation robot, lacking a complete and systematic training process, lacking correct operation guidance during training, and lacking a correspondence between operating level and the robot functions opened to the user.
In a first aspect, an embodiment of the present invention provides an auxiliary training and evaluation method for an interventional surgical robot. The auxiliary training and evaluation method is used in cooperation with a user operating part and the interventional surgical robot, and includes performing steps S101 to S108 with at least one processor. Step S101: receiving user identification information, a target interventional operation robot, and a target interventional operation type input by a user. Step S102: prompting the user to select a training mode once the system has been adapted to the target interventional operation robot. Step S103: in response to the training mode selected by the user, presenting corresponding training content according to the target interventional operation robot and the target interventional operation type, and guiding the user to perform training operations. Step S104: collecting the user's training operation data and providing training feedback information to the user according to that data. Step S105: receiving the level to be authenticated input by the user. Step S106: providing corresponding comprehensive evaluation content according to the level the user wishes to authenticate, and guiding the user to perform the comprehensive evaluation operation. Step S107: authenticating the user's level according to the user's comprehensive evaluation operation data. Step S108: opening the operating functions of the target interventional operation robot at the level corresponding to the user's authenticated level.
In a second aspect, an embodiment of the present invention further provides an auxiliary training and evaluation device for an interventional surgical robot. The auxiliary training and evaluation device is used in cooperation with a user operating part and the interventional surgical robot, and includes a processor configured to: receive user identification information, a target interventional operation robot, and a target interventional operation type input by a user; prompt the user to select a training mode once the system has been adapted to the target interventional operation robot; in response to the training mode selected by the user, present corresponding training content according to the target interventional operation robot and the target interventional operation type, and guide the user to perform training operations; collect the user's training operation data and provide training feedback information to the user according to that data; receive the level to be authenticated input by the user; provide corresponding comprehensive evaluation content according to the level the user wishes to authenticate, and guide the user to perform the comprehensive evaluation operation; authenticate the user's level according to the user's comprehensive evaluation operation data; and open the operating functions of the target interventional operation robot at the level corresponding to the user's authenticated level.
Compared with the prior art, the embodiments of the application have the following beneficial effects. By receiving the user identification information, the target interventional operation robot, and the target interventional operation type input by the user, the method can guide the user through training operations, collect the user's training operation data, provide training feedback information, authenticate the user's level, and open the operating functions of the target interventional operation robot at the corresponding level. It thereby satisfies, at the same time, the user's need for systematic training in operating the interventional operation robot and the need to assess operating capability. The user can practice repeatedly at any time and place, and by collecting the user's training operation data the system can quickly identify the user's operating level and problems and arrange targeted training content, thereby improving the user's training efficiency and the success rate of the user's actual operations.
Drawings
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. The accompanying drawings illustrate various embodiments by way of example in general and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. Such embodiments are illustrative and not intended to be exhaustive or exclusive of the present apparatus or method.
FIG. 1 is a first flowchart of a method for assisting in training and evaluating an interventional surgical robot according to an embodiment of the present application.
FIG. 2 is a second flowchart of a method of assisting in training and evaluating an interventional surgical robot in accordance with an embodiment of the present application.
FIG. 3 is a third flowchart of a method of assisting training and evaluation of an interventional surgical robot according to an embodiment of the present application.
FIG. 4 is a fourth flowchart of a method for assisting in training and evaluating an interventional surgical robot according to an embodiment of the present application.
FIG. 5 is a fifth flowchart of a method for assisting in training and evaluating an interventional surgical robot according to an embodiment of the present application.
FIG. 6 is a sixth flowchart of a method of assisting in training and evaluating an interventional surgical robot according to an embodiment of the present application.
FIG. 7 is a block diagram of the auxiliary training and evaluation device of the interventional operation robot according to the embodiment of the present application.
Detailed Description
It should be understood that various modifications may be made to the embodiments of the application herein. Therefore, the above description should not be taken as limiting, but merely as exemplification of the embodiments. Other modifications within the scope and spirit of the application will occur to persons of ordinary skill in the art.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and, together with a general description of the application given above, and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the application will become apparent from the following description of a preferred form of embodiment, given as a non-limiting example, with reference to the accompanying drawings.
It is also to be understood that, although the application has been described with reference to some specific examples, those skilled in the art can certainly realize many other equivalent forms of the application.
The above and other aspects, features and advantages of the present application will become more apparent in light of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application will be described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the application, which may be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail, to avoid obscuring the application with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not intended to be limiting, but merely serve as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," each of which may refer to one or more of the same or different embodiments in accordance with the application.
The embodiment of the invention provides an auxiliary training and evaluation method for an interventional operation robot. The method can be applied to an auxiliary training and evaluation system of the interventional operation robot; the system may include a training module and an evaluation module, which may be two independent modules, so that a user can use the training module for auxiliary training and the evaluation module for training evaluation. The auxiliary training and evaluation method is used in cooperation with the user operating part and the interventional operation robot, so that the user obtains realistic training with the user operating part and the interventional operation robot during the training process, which enriches the user experience as much as possible and improves the user's operating skill.
As shown in fig. 1, the auxiliary training and evaluation method includes performing steps S101 to S108 using at least one processor.
Step S101: and receiving user identification information, a target interventional operation robot and a target interventional operation type which are input by a user.
Optionally, the processor may be configured with a multi-user management mode in which each user logs in to a respective account. The memory of the auxiliary training and evaluation system of the interventional surgical robot may store the information of different users separately, and the processor may manage each user's information; the specific management content includes, but is not limited to, managing the interventional operation robot operating functions opened to each user at the corresponding level.
Optionally, the user identification information is used to uniquely identify a user. It may be related to the account with which the user logs in to the auxiliary training and evaluation system of the interventional operation robot, or to biometric features of the user such as facial information or fingerprint information; the present application does not specifically limit this, provided that a unique user can be determined from the user identification information.
Alternatively, the target interventional surgical robot may be understood as the interventional surgical robot to be trained on and evaluated, selected by the user from a plurality of different types of interventional surgical robots. It should be noted that the auxiliary training and evaluation system supports a plurality of different interventional surgical robots; before use, the system can match the data and parameters of the different types of interventional surgical robots, and once the software and hardware are configured, training and evaluation for the corresponding type of robot can begin.
Alternatively, the target interventional operation type may be understood as the interventional operation type to be trained on and evaluated, selected by the user from a plurality of different interventional operation types. Different levels of operation difficulty can be provided for each interventional operation type, so that the user can obtain more detailed training and evaluation.
By receiving the user identification information, the target interventional operation robot, and the target interventional operation type input by the user, training content matching these inputs can be determined for the user, so that the user can be trained in a targeted manner.
Optionally, the processor may further receive department information input by the user, so as to further accurately determine the training content in combination with the department information where the user is located.
Step S102: in case of having been adapted to the target interventional surgical robot, the user is prompted to select a training mode.
Alternatively, the training modes described above may be divided, in a fixed manner, into a plurality of different training modes. After the user identification information is determined, at least one training mode matching the user identification information can be suggested to the user from among the plurality of training modes.
Step S103: and responding to the training mode selected by the user, presenting corresponding training contents according to the target interventional operation robot and the target interventional operation type, and guiding the user to execute training operation.
Optionally, the training content is related to the training mode selected by the user, i.e. different training modes correspond to different training content.
Step S104: and acquiring training operation data of the user, and providing training feedback information for the user according to the training operation data of the user.
Alternatively, the user's training operation data may include at least data reflecting the accuracy and proficiency of the user's training operations, for example the user's operation time and whether the operation sequence is correct.
Optionally, the training feedback information provided to the user after the training operation data is collected may at least be used to present the result of the current training operation intuitively. The training result may include one or more of the following: training progress, training time, and error positions; it may also include a direct score for the training operation, so that the user can see the result of the training directly.
Optionally, the feedback information may further include training planning information determined based on training operation data, where the training planning information is used to provide a planning reference for a next training operation of the user, so that the user can effectively plan the training operation based on the feedback information.
Step S105: and receiving the level to be authenticated input by the user.
Alternatively, the level may be understood as an operating level related to the user's operating ability, and authenticating it enables an effective assessment of the user's operating capability. A user who passes the level assessment can obtain an operating license of the corresponding level.
Step S106: and providing corresponding comprehensive evaluation content according to the authentication level of the user, and guiding the user to execute comprehensive evaluation operation.
Optionally, the provided comprehensive evaluation content is detected and updated periodically, so that the user can perform level authentication according to the updated comprehensive evaluation content.
Step S107: and authenticating the grade of the user according to the comprehensive evaluation operation data of the user.
Optionally, the user's operation can be scored directly according to the user's comprehensive evaluation operation data. The score can be a comprehensive assessment of operation time, operation accuracy, operation completion, and the like; the system finally determines whether the user has reached the operating capability required for the level, and the level authentication is granted once that capability is reached.
Optionally, the comprehensive evaluation content differs between levels, and comprehensive evaluation content of the operation difficulty corresponding to the level to be authenticated is provided, so that the user can choose a level suitable for authentication according to their own operating level.
Illustratively, if the user passes authentication based on the comprehensive evaluation operation data, it is determined that the user has reached the operating level corresponding to the level to be authenticated. If authentication is not passed, it is determined that the user has not reached that operating level; in this case, the positions where the user made errors can be determined from the comprehensive evaluation operation data, and training operations requiring reinforced practice can be matched to the user, so that the user clearly knows which training content needs further practice.
Step S108: and opening the operation functions of the corresponding level of the target interventional operation robot according to the authentication level of the user.
Thus, the operation function of the target interventional operation robot corresponding to the authentication level can be opened for the user who obtains the authentication level.
Optionally, after the user's level is authenticated, a score may be given according to the user's completion of the evaluation, and an authentication result may be provided. The authentication result can reflect information such as the user's operation accuracy and error positions. After passing the capability authentication for a level, the user obtains the corresponding level qualification and can use the operating functions of the corresponding interventional operation robot to perform interventional procedures within that level.
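Illustratively, the scoring and pass/fail decision described above can be sketched in Python as follows. The weighting of operation time, accuracy, and completion, the passing threshold, and the data layout are assumptions made for illustration only; the application does not specify how these factors are combined.

```python
def authenticate_level(eval_data: dict, pass_score: float = 80.0) -> dict:
    """Hypothetical scoring of comprehensive-evaluation operation data.

    eval_data is assumed to look like:
      {"time_score": 0-100, "accuracy": 0-100, "completion": 0-100,
       "errors": [{"step": "guidewire advance", "type": "excessive force"}]}
    """
    # Assumed weighting; the disclosure only states that time, accuracy,
    # and completion are considered, not how they are combined.
    score = (0.2 * eval_data["time_score"]
             + 0.5 * eval_data["accuracy"]
             + 0.3 * eval_data["completion"])
    passed = score >= pass_score

    result = {"score": round(score, 1), "passed": passed}
    if not passed:
        # On failure, report the steps at which errors occurred so that
        # matching reinforcement training can be arranged (as described above).
        result["reinforce"] = sorted({e["step"] for e in eval_data["errors"]})
    return result
```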
According to the application, by receiving the user identification information, the target interventional operation robot, and the target interventional operation type input by the user, the system can guide the user through training operations, collect the user's training operation data, provide training feedback information, authenticate the user's level, and open the operating functions of the target interventional operation robot at the corresponding level. It thereby satisfies, at the same time, the user's need for systematic training in operating the interventional operation robot and the need to assess operating capability. The user can practice repeatedly at any time and place, and by collecting the user's training operation data the system can quickly identify the user's operating level and problems and arrange targeted training content, thereby improving the user's training efficiency and the success rate of the user's actual operations.
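By way of a non-limiting overview, the control flow of steps S101 to S108 can be summarized in the following Python sketch. The class name, the callable parameters standing in for the interactive parts (mode selection, guided training, guided evaluation), and the passing score are all hypothetical and only illustrate how a processor might orchestrate the method.

```python
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class TrainingSession:
    # S101 inputs: who is training, on which robot, for which operation type
    user_id: str
    robot_model: str
    operation_type: str
    records: list = field(default_factory=list)   # one entry per training run
    certified_level: int = 0                      # 0 = not yet certified


def run_session(session: TrainingSession,
                adapted_robots: set,
                choose_mode: Callable[[], str],
                train: Callable[[str], dict],
                evaluate: Callable[[int], dict],
                requested_level: int,
                pass_score: float = 80.0) -> TrainingSession:
    """Sketch of the S101-S108 control flow; the callables stand in for the
    interactive parts (mode selection, guided training, guided evaluation)."""
    # S102: proceed only if the system has been adapted to the target robot
    if session.robot_model not in adapted_robots:
        raise RuntimeError("target interventional robot not adapted")

    # S103-S104: run training in the chosen mode and keep the operation data
    mode = choose_mode()                      # "test", "teaching" or "custom"
    run_data = train(mode)                    # e.g. {"time_s": ..., "errors": [...]}
    session.records.append(run_data)

    # S105-S107: comprehensive evaluation for the requested level
    eval_data = evaluate(requested_level)     # e.g. {"score": 86.5}
    if eval_data.get("score", 0.0) >= pass_score:
        session.certified_level = requested_level

    # S108: the caller opens robot functions up to session.certified_level
    return session
```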
In some embodiments, providing training feedback information to the user according to the user's training operation data in step S104 specifically includes: recording errors in the user's training operations, and prompting the user to reinforce practice of the training operations in which errors occurred.
In this way, the user can be prompted based on the recorded errors in the training operations, can review them at any time, and can train in a targeted manner against those errors, which helps the user improve operating capability as quickly as possible.
Optionally, after recording the error in the training operation of the user, training content corresponding to the error can be generated based on the error in the training operation, that is, customized training content is generated for the user according to the error in the training operation, so that the user can train the error in the previous training operation in a targeted manner.
Optionally, the errors in the user training operation may include information such as error type, error frequency, etc. to summarize the error-prone training operation for the user, so as to facilitate the improvement of the operation capability of the user.
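As a simple illustration of recording errors and prompting reinforcement by error type and frequency, the following sketch counts the errors across recorded training runs; the record layout and the example error descriptions are assumptions for illustration only.

```python
from collections import Counter


def summarize_errors(training_runs: list) -> list:
    """Count error descriptions across recorded training runs and return
    them most-frequent first, as a basis for reinforcement prompts."""
    counts = Counter(err for run in training_runs for err in run.get("errors", []))
    return counts.most_common()


# Example: two recorded runs and their operation errors
runs = [{"errors": ["wrong catheter speed", "skipped flushing step"]},
        {"errors": ["wrong catheter speed"]}]
for error, freq in summarize_errors(runs):
    print(f"practice again: {error} (occurred {freq} times)")
```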
In some embodiments, providing training feedback information to the user according to the user's training operation data in step S104 specifically includes: recording training operation data, including time and progress, for each training session as that session's training record; and, when the user next logs in, providing the user with the training records from a predetermined period, so that the user can continue the previous training or review each training record.
In this way, when the user next logs in, the previous training records can be presented intuitively, so that the user can review past operation training and plan subsequent training.
Alternatively, the training records may be presented on the display interface as a list of bars in chronological order, and after the user selects one of the records, its detailed information may be presented.
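Illustratively, storing each session's time and progress and listing the records from a predetermined period at the next login might look like the following sketch; the record fields and the thirty-day period are assumptions for illustration.

```python
from datetime import datetime, timedelta


def records_for_login(records: list, period_days: int = 30) -> list:
    """Return the training records from the last `period_days`, newest first,
    so they can be listed when the user next logs in.  Each record is assumed
    to carry a `timestamp`, a `duration_s`, and a `progress` fraction."""
    cutoff = datetime.now() - timedelta(days=period_days)
    recent = [r for r in records if r["timestamp"] >= cutoff]
    return sorted(recent, key=lambda r: r["timestamp"], reverse=True)


records = [
    {"timestamp": datetime.now() - timedelta(days=2), "duration_s": 1500, "progress": 0.6},
    {"timestamp": datetime.now() - timedelta(days=90), "duration_s": 1800, "progress": 1.0},
]
for r in records_for_login(records):
    print(f"{r['timestamp']:%Y-%m-%d}: {r['progress']:.0%} complete in {r['duration_s']} s")
```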
In some embodiments, as shown in fig. 2, step S102, in case of having been adapted to the target interventional surgical robot, prompting the user to select the training mode specifically includes steps S201 to S204.
Step S201: the user is presented with options for test mode, teaching mode, and custom mode.
Step S202: and responding to the selection of the test mode by the user, providing simulation test content for the user, guiding the user to execute simulation test operation, and providing an evaluation report for the user according to the simulation test operation data of the user.
Step S203: in response to the user selecting the teaching mode, a learning course associated with the target interventional surgical robot and the target interventional surgical type is provided to the user.
Step S204: in response to the user selecting the customization mode, the user is prompted to enter a customization requirement entry, and customization training content is provided to the user according to the customization requirement entry entered by the user.
In this way, through the multiple training modes presented to the user, the user can selectively practice as needed; and because the user can train through multiple modes, the operation process can be learned and mastered more quickly, so the user's operating capability improves faster.
Optionally, the test mode may be understood as follows: after the user completes the simulated test, an evaluation report may be produced according to the user's performance, and the report may point out operations that need to be strengthened, so that the user can clearly identify where their operating ability is insufficient.
Optionally, in the teaching mode, operation of the interventional operation robot can be learned comprehensively and systematically by functional chapter, and operating knowledge can be mastered more quickly by combining teaching in various forms, such as video and pictures, with exercises after each lesson.
Alternatively, after the user selects the teaching mode, the user may select the chapters to learn as required, the chapters being divided according to the existing usage standards for the current content. Each chapter provides different learning and exercise content, explained through a combination of video, pictures, and text. Content explanation may also be combined with other forms such as online live broadcast; corresponding exercise questions and chapter test questions are arranged after each lesson, and study of a chapter is considered complete once its test is finished. After the user completes a chapter, the learning progress may be saved so that the user can study over several sessions. The content of each chapter can be updated on a regular cycle, so that users learn from the updated chapter content.
Alternatively, the customization mode may be understood as letting the user train the operations they wish to practice according to their own needs, and the corresponding training content may be generated after the customization requirement entry input by the user is received. In addition, real-time guidance can be provided while the user performs the training operations, the places where the user tends to make mistakes can be counted, and the user can later be prompted to strengthen those operations.
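Illustratively, the dispatch among the test, teaching, and customization modes of steps S201 to S204 can be sketched as follows; the returned dictionaries and their contents are placeholders, not the actual training content.

```python
from typing import List, Optional


def start_training(mode: str, robot_model: str, operation_type: str,
                   custom_entries: Optional[List[str]] = None) -> dict:
    """Dispatch on the selected training mode (steps S201-S204); the returned
    dictionaries only illustrate what each branch provides."""
    if mode == "test":
        # S202: simulated test, followed by an evaluation report
        return {"content": f"simulated {operation_type} test on {robot_model}",
                "produces": "evaluation report"}
    if mode == "teaching":
        # S203: chapter-based course for this robot and operation type
        return {"content": f"course: {robot_model} / {operation_type}",
                "produces": "chapter tests and saved progress"}
    if mode == "custom":
        # S204: training content generated from the user's requirement entries
        return {"content": [f"drill: {entry}" for entry in (custom_entries or [])],
                "produces": "customized training content"}
    raise ValueError(f"unknown training mode: {mode}")
```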
In some embodiments, the training aid and evaluation method further comprises: and when the user selects the customization mode or the teaching mode, marking the operation error of the user and prompting the correct operation mode.
In this way, the user can be shown intuitively where an operation went wrong and can immediately correct it using the correct operation mode, so that the user learns of errors in time, acquires the normal operation mode in time, and improves operating capability quickly.
In some embodiments, as shown in fig. 3, in response to the user selecting the customization mode, step S204, prompting the user to input the customization requirement entry specifically includes steps S301 to S302.
Step S301: in response to a user selecting the customization mode, candidates for the customization requirement entry and an editable box of the customization requirement entry are presented to the user in accordance with operational errors in the training records of the user at various times over a predetermined period.
Step S302: in response to a user's selection of a candidate for a customization requirement entry and editing the customization requirement entry in an editable box, customization training content is provided to the user in accordance with the user's selection of the candidate for the customization requirement entry and the edited customization requirement entry.
In this way, customized training can target the operation errors in the training records within the predetermined period, so that the user can repeatedly practice the operations that are still weak and rapidly improve operating capability.
Alternatively, the candidates may be related to the operation errors in the training records; in particular, the candidates may be ordered by the frequency with which the operation errors occur, for example, the more frequently an error occurs, the earlier the corresponding candidate appears.
Optionally, the editable box may present items that the user can adjust according to their own needs, increasing the match between the customized training content and the user.
Optionally, after the candidates of the customization requirement entry and the editable box of the customization requirement entry are presented to the user, if the candidates do not meet the user's needs, a re-analysis function may be provided; when the user triggers it, candidates different from the previous ones may be generated for the user, so as to better meet the user's operation-training needs.
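Illustratively, building the candidate customization requirement entries from the operation errors recorded over the predetermined period, ordered by frequency, might look like the following sketch; imitating the re-analysis function by simply reshuffling the candidates is an assumption for illustration.

```python
import random
from collections import Counter


def candidate_entries(training_records: list, reshuffle: bool = False) -> list:
    """Build customization-requirement candidates from the operation errors
    recorded over the predetermined period, most frequent first (step S301).
    `reshuffle=True` stands in for the optional re-analysis function by
    returning the candidates in a different order."""
    errors = [e for rec in training_records for e in rec.get("errors", [])]
    ordered = [err for err, _ in Counter(errors).most_common()]
    if reshuffle:
        random.shuffle(ordered)
    return ordered
```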
In some embodiments, as shown in fig. 4, the step S204 of providing the custom training content to the user according to the custom requirement entry input by the user specifically includes steps S401 to S402.
Step S401: and providing the preliminary custom training content for the user according to the custom requirement entry input by the user.
Step S402: and providing the preliminary customized training content as final customized training content under the condition that confirmation of the user on the preliminary customized training content is received, otherwise, receiving a supplementary customized entry of the user on the preliminary customized training content, and adjusting the preliminary customized training content according to the supplementary customized entry for confirmation of the user.
In this way, by providing preliminary customized training content and letting the user supplement it with additional customization entries, customized training content that better matches the user's needs is obtained, allowing the user to quickly improve operating capability according to that content.
If the user confirms the preliminary customized training content, it can be understood to contain the training content the user needs, and it is provided as the final customized training content. If the user's confirmation of the preliminary customized training content is not received, the preliminary content can be understood to lack training content the user needs; in that case, the user's supplementary customization entry for the preliminary content is received, and the preliminary content is adjusted according to that entry, based on the user's own needs, to generate the final customized training content.
Alternatively, the supplementary customization entry may be understood as an entry different from those used to provide the preliminary customized training content; it may be entered manually by the user or selected by the user from a plurality of provided entries.
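Illustratively, the confirm-or-supplement logic of steps S401 and S402 can be reduced to the following sketch; treating "adjustment" as appending drills for the supplementary entries is an assumption made only for illustration.

```python
from typing import Iterable, List


def finalize_custom_content(preliminary: List[str], confirmed: bool,
                            supplementary_entries: Iterable[str] = ()) -> List[str]:
    """Steps S401-S402 in miniature: return the preliminary customized
    training content as final if the user confirms it, otherwise extend it
    with the user's supplementary customization entries for re-confirmation."""
    if confirmed:
        return list(preliminary)
    # Adjust the preliminary content according to the supplementary entries;
    # here "adjust" simply means appending a drill for each extra entry.
    return list(preliminary) + [f"drill: {entry}" for entry in supplementary_entries]
```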
In some embodiments, as shown in fig. 5, step S104 of providing training feedback information to the user according to training operation data of the user specifically includes steps S501 to S502.
Step S501: and prompting the user of a suggestion level for authentication according to the target intervention operation type, the operation time, the accuracy and the completion degree corresponding to the training operation data of the user.
Step S502: in the case that the user performs a confirmation operation on the authenticated advice level and selects the test mode, the user is provided with simulated test contents corresponding to the advice level, the user is guided to perform a simulated test operation, and an assessment report and a training advice for the advice level are provided to the user according to the simulated test operation data of the user.
In this way, a level to authenticate can be recommended according to the user's training performance, and simulated test content can be provided as practice for the subsequent real level authentication; the user is reminded in time according to their real-time operating capability and is helped to apply more smoothly for authentication at a higher level.
Optionally, the number of the suggested levels may be one or more, and the difficulty and the comprehensive evaluation content corresponding to each suggested level may be presented to the user, so that the user may comprehensively understand each suggested level and then make a selection.
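Illustratively, mapping the training metrics to a suggested level (step S501) might look like the following sketch, with the target interventional operation type assumed fixed; the 0-100 scales, the metric weights, and the level thresholds are illustrative assumptions only.

```python
from typing import Optional


def suggest_level(op_time_s: float, accuracy: float, completion: float,
                  thresholds: Optional[dict] = None) -> int:
    """Map training-operation metrics to a suggested certification level.
    accuracy and completion are assumed to be on 0-100 scales."""
    thresholds = thresholds or {1: 50.0, 2: 70.0, 3: 85.0}
    time_score = max(0.0, 100.0 - op_time_s / 60.0)   # assumed: faster is better
    score = 0.2 * time_score + 0.5 * accuracy + 0.3 * completion
    # Highest level whose threshold the score reaches; 0 means "train more".
    return max([lvl for lvl, t in thresholds.items() if score >= t], default=0)


print(suggest_level(op_time_s=1200, accuracy=90, completion=95))  # prints 3 under the assumed thresholds
```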
In some embodiments, as shown in fig. 6, the step S106 provides the corresponding comprehensive evaluation content according to the level of authentication to be performed by the user, and the operation of guiding the user to perform the comprehensive evaluation specifically includes steps S601 to S602.
Step S601: a question bank is formed that includes various levels of difficulty of operating content for various targeted interventional surgical robots and multiple chapters of a targeted interventional surgical type.
Step S602: and randomly extracting the operation content of the corresponding difficulty level of part of chapters from the question library as corresponding comprehensive evaluation content according to the authentication level of the user, and guiding the user to execute the comprehensive evaluation operation.
Therefore, the randomness of the generation of the comprehensive evaluation content can be increased, so that more accurate evaluation can be obtained according to the operation performance of the user, and the rationality of the grade authentication process is ensured.
Optionally, different levels of operational difficulty may be provided for different interventional surgical robots, as well as different comprehensive evaluation content. The user can select the corresponding interventional operation robots by himself as required, so that the user can conveniently switch between different interventional operation robots, and the user can be comprehensively trained and evaluated.
Alternatively, the question bank may be updated periodically.
Optionally, the operation content in the question bank can be classified by category, and during random extraction the operation content under each category can be sampled separately, which ensures that the evaluation is comprehensive and balanced and avoids evaluating only one category of operation content.
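Illustratively, per-category random extraction from the question bank (step S602) can be sketched as follows; the bank layout, the robot name "VasRobot-A", and the sample items are hypothetical and used only to show the sampling.

```python
import random
from typing import Optional


def draw_evaluation(question_bank: dict, robot_model: str, level: int,
                    per_category: int = 2, seed: Optional[int] = None) -> list:
    """Randomly extract operation content of the requested difficulty level,
    sampling each category separately so the evaluation covers more than one
    kind of operation.  Assumed layout: {robot: {level: {category: [items]}}}."""
    rng = random.Random(seed)
    drawn = []
    for category, items in question_bank[robot_model][level].items():
        drawn.extend(rng.sample(items, min(per_category, len(items))))
    return drawn


bank = {"VasRobot-A": {1: {"catheter": ["advance to aortic arch", "withdraw safely"],
                           "guidewire": ["cross a simple stenosis", "exchange the wire"]}}}
print(draw_evaluation(bank, "VasRobot-A", level=1, seed=7))
```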
The embodiment of the invention also provides an auxiliary training and evaluating device 100 of the interventional operation robot, and the auxiliary training and evaluating device 100 of the interventional operation robot is used cooperatively with the user operating part and the interventional operation robot. As shown in fig. 7, the interventional surgical robot's auxiliary training and evaluation device 100 comprises a processor 101, the processor 101 being configured to: receiving user identification information, a target interventional operation robot and a target interventional operation type input by a user; prompting a user to select a training mode in the case of having been adapted to the target interventional surgical robot; responding to a training mode selected by a user, and presenting corresponding training contents according to the target interventional operation robot and the target interventional operation type to guide the user to execute training operation; acquiring training operation data of a user, and providing training feedback information for the user according to the training operation data of the user; receiving a grade to be authenticated input by a user; providing corresponding comprehensive evaluation content according to the authentication level of the user, and guiding the user to execute comprehensive evaluation operation; authenticating the grade of the user according to the comprehensive evaluation operation data of the user; and opening the operation functions of the corresponding level of the target interventional operation robot according to the authentication level of the user.
According to the application, by receiving the user identification information, the target interventional operation robot, and the target interventional operation type input by the user, the device can guide the user through training operations, collect the user's training operation data, provide training feedback information, authenticate the user's level, and open the operating functions of the target interventional operation robot at the corresponding level. It thereby satisfies, at the same time, the user's need for systematic training in operating the interventional operation robot and the need to assess operating capability. The user can practice repeatedly at any time and place, and by collecting the user's training operation data the device can quickly identify the user's operating level and problems and arrange targeted training content, thereby improving the user's training efficiency and the success rate of the user's actual operations.
In some embodiments, the processor 101 is further configured to: errors in the training operation of the user are recorded, and the user is prompted to strengthen the exercise of the training operation with the errors.
In some embodiments, the processor 101 is further configured to: training operation data is recorded for each training, wherein the training operation data comprises time and progress, and the time and the progress are used as training records of each time; at the next log-in of the user, each training record is provided to the user for a predetermined period of time, so that the user continues the previous training or looks up each training record.
In some embodiments, the processor 101 is further configured to: presenting options of a test mode, a teaching mode and a customization mode to a user; responding to the selection of the test mode by the user, providing simulation test content for the user, guiding the user to execute simulation test operation, and providing an evaluation report for the user according to the simulation test operation data of the user; responsive to the user selecting the teaching mode, providing a learning course associated with the target interventional surgical robot and the target interventional surgical type to the user; in response to the user selecting the customization mode, the user is prompted to enter a customization requirement entry, and customization training content is provided to the user according to the customization requirement entry entered by the user.
In some embodiments, the processor 101 is further configured to: and when the user selects the customization mode or the teaching mode, marking the operation error of the user and prompting the correct operation mode.
In some embodiments, the processor 101 is further configured to: in response to a user selecting a customization mode, presenting candidates of customization requirement entries and editable boxes of customization requirement entries to the user according to operational errors in the training records of the user for each time within a predetermined period; in response to a user's selection of a candidate for a customization requirement entry and editing the customization requirement entry in an editable box, customization training content is provided to the user in accordance with the user's selection of the candidate for the customization requirement entry and the edited customization requirement entry.
In some embodiments, the processor 101 is further configured to: providing preliminary custom training content for a user according to custom requirement items input by the user; and providing the preliminary customized training content as final customized training content under the condition that confirmation of the user on the preliminary customized training content is received, otherwise, receiving a supplementary customized entry of the user on the preliminary customized training content, and adjusting the preliminary customized training content according to the supplementary customized entry for confirmation of the user.
In some embodiments, the processor 101 is further configured to: prompting a suggestion level for authentication to a user according to a target intervention operation type, operation time, accuracy and completion degree corresponding to training operation data of the user; in the case that the user performs a confirmation operation on the authenticated advice level and selects the test mode, the user is provided with simulated test contents corresponding to the advice level, the user is guided to perform a simulated test operation, and an assessment report and a training advice for the advice level are provided to the user according to the simulated test operation data of the user.
In some embodiments, the processor 101 is further configured to: forming a question bank including various difficulty level operation contents of various target interventional operation robots and various chapters of the target interventional operation types; and randomly extracting the operation content of the corresponding difficulty level of part of chapters from the question library as corresponding comprehensive evaluation content according to the authentication level of the user, and guiding the user to execute the comprehensive evaluation operation.
The embodiment of the application further provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the auxiliary training and evaluation method for an interventional operation robot.
Note that the various units in the various embodiments of the application may be implemented as computer-executable instructions stored in a memory which, when executed by a processor, implement the corresponding steps; they may also be implemented as hardware with the corresponding logic computing capability, or as a combination of software and hardware (firmware). In some embodiments, the processor may be implemented as any of an FPGA, an ASIC, a DSP chip, an SoC (system on a chip), an MPU (such as, without limitation, a Cortex processor), and the like. The processor may be communicatively coupled to the memory and configured to execute the computer-executable instructions stored therein. The memory may include read-only memory (ROM), flash memory, random access memory (RAM), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM, static memory (e.g., flash memory, static random access memory), and the like, on which the computer-executable instructions are stored in any format. The computer-executable instructions may be accessed by the processor, read from the ROM or any other suitable memory location, and loaded into the RAM for execution by the processor, so as to implement the auxiliary training and evaluation method according to various embodiments of the application.
It should be noted that, among the components of the system of the present application, the components thereof are logically divided according to functions to be implemented, but the present application is not limited thereto, and the components may be re-divided or combined as needed, for example, some components may be combined into a single component, or some components may be further decomposed into more sub-components.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in a system according to embodiments of the present application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form. Furthermore, the application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
Furthermore, although exemplary embodiments have been described herein, the scope thereof includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of the various embodiments across), adaptations or alterations as pertains to the present application. The elements in the claims are to be construed broadly based on the language employed in the claims and are not limited to examples described in the present specification or during the practice of the application, which examples are to be construed as non-exclusive.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. For example, other embodiments may be used by those of ordinary skill in the art upon reading the above description. In addition, in the above detailed description, various features may be grouped together to streamline the application. This is not to be interpreted as an intention that the disclosed features not being claimed are essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with one another in various combinations or permutations. The scope of the application should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above embodiments are only exemplary embodiments of the present application and are not intended to limit the present application, the scope of which is defined by the claims. Various modifications and equivalent arrangements of this application will occur to those skilled in the art, and are intended to be within the spirit and scope of the application.

Claims (10)

1. An auxiliary training and evaluation method for an interventional surgical robot, the auxiliary training and evaluation method being used in cooperation with a user manipulation part and the interventional surgical robot, the auxiliary training and evaluation method comprising, with at least one processor:
receiving user identification information, a target interventional operation robot and a target interventional operation type input by a user;
prompting a user to select a training mode in the case of having been adapted to the target interventional surgical robot;
responding to a training mode selected by a user, and presenting corresponding training contents according to the target interventional operation robot and the target interventional operation type to guide the user to execute training operation;
acquiring training operation data of a user, and providing training feedback information for the user according to the training operation data of the user;
receiving a grade to be authenticated input by a user;
providing corresponding comprehensive evaluation content according to the authentication level of the user, and guiding the user to execute comprehensive evaluation operation;
authenticating the grade of the user according to the comprehensive evaluation operation data of the user;
and opening the operation functions of the corresponding level of the target interventional operation robot according to the authentication level of the user.
2. The method for assisting in training and evaluating an interventional procedure robot according to claim 1, wherein providing training feedback information to a user according to training operation data of the user specifically comprises:
errors in the training operation of the user are recorded, and the user is prompted to strengthen the exercise of the training operation with the errors.
3. The method for assisting in training and evaluating an interventional procedure robot according to claim 1, wherein providing training feedback information to a user according to training operation data of the user specifically comprises:
training operation data is recorded for each training, wherein the training operation data comprises time and progress, and the time and the progress are used as training records of each time;
at the next log-in of the user, each training record is provided to the user for a predetermined period of time, so that the user continues the previous training or looks up each training record.
4. The method for assisting training and evaluating an interventional surgical robot according to claim 1, characterized in that said prompting the user to select a training mode in case of having been adapted to a target interventional surgical robot comprises in particular:
presenting options of a test mode, a teaching mode and a customization mode to a user;
responding to the selection of the test mode by the user, providing simulation test content for the user, guiding the user to execute simulation test operation, and providing an evaluation report for the user according to the simulation test operation data of the user;
responsive to the user selecting the teaching mode, providing a learning course associated with the target interventional surgical robot and the target interventional surgical type to the user;
in response to the user selecting the customization mode, the user is prompted to enter a customization requirement entry, and customization training content is provided to the user according to the customization requirement entry entered by the user.
5. The method of assisting in training and evaluating an interventional procedure robot according to claim 4, further comprising: and when the user selects the customization mode or the teaching mode, marking the operation error of the user and prompting the correct operation mode.
6. The method of assisted training and evaluation of an interventional procedure robot of claim 4, wherein in response to a user selecting a customization mode, prompting the user to enter a customization requirement entry specifically comprises:
in response to the user selecting the customization mode, presenting to the user candidate customization requirement entries and an editable box for customization requirement entries, according to the operation errors in the user's training records within a predetermined period;
in response to the user selecting a candidate customization requirement entry and editing a customization requirement entry in the editable box, providing customized training content to the user according to the selected candidate and the edited customization requirement entry.
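One possible way to derive candidate customization requirement entries from the operation errors in recent training records, as claim 6 describes, is sketched below; the error labels and the top-N cutoff are assumptions, and the free-text "editable box" would be supplied by the interface layer.

from collections import Counter
from typing import Dict, List

def candidate_entries(recent_records: List[Dict], top_n: int = 3) -> List[str]:
    # Count error types across the records within the predetermined period and
    # propose the most frequent ones as candidate customization requirement entries.
    counts = Counter(err for rec in recent_records for err in rec.get("errors", []))
    return [f"reinforce: {err}" for err, _ in counts.most_common(top_n)]

# Example: two sessions with guidewire-handling errors yield two candidates.
history = [{"errors": ["guidewire overshoot", "excess contrast"]},
           {"errors": ["guidewire overshoot"]}]
print(candidate_entries(history))   # ['reinforce: guidewire overshoot', 'reinforce: excess contrast']
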
7. The method for assisting in training and evaluating an interventional procedure robot according to claim 4, wherein providing the customized training content to the user according to the customization requirement entries entered by the user specifically comprises:
providing preliminary customized training content to the user according to the customization requirement entries input by the user;
and, if the user's confirmation of the preliminary customized training content is received, providing the preliminary customized training content as the final customized training content; otherwise, receiving the user's supplementary customization entries for the preliminary customized training content, and adjusting the preliminary customized training content according to the supplementary customization entries for the user to confirm.
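Claim 7's confirm-or-supplement loop could be sketched as follows; the confirmation and supplement callbacks are assumed stand-ins for the user interaction, and the round limit is an illustrative choice.

from typing import Callable, List

def finalize_content(preliminary: List[str],
                     confirm: Callable[[List[str]], bool],
                     get_supplement: Callable[[], List[str]],
                     max_rounds: int = 3) -> List[str]:
    # Offer the preliminary customized content; on confirmation it becomes final,
    # otherwise adjust it with the user's supplementary entries and offer it again.
    content = list(preliminary)
    for _ in range(max_rounds):
        if confirm(content):
            return content
        content += get_supplement()
    return content
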
8. The method for assisting in training and evaluating an interventional procedure robot according to claim 4, wherein providing training feedback information to a user according to training operation data of the user specifically comprises:
prompting the user with a suggested level for authentication according to the target interventional operation type, the operation time, the accuracy and the degree of completion corresponding to the user's training operation data;
when the user confirms the suggested level for authentication and selects the test mode, providing the user with simulated test content corresponding to the suggested level, guiding the user to perform the simulated test operation, and providing the user with an assessment report and training suggestions for the suggested level according to the user's simulated test operation data.
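Claim 8 maps operation time, accuracy and completion degree to a suggested authentication level; a hypothetical scoring rule is sketched below, with weights and thresholds that are pure assumptions rather than values taken from the patent.

def suggest_level(op_time_s: float, accuracy: float, completion: float,
                  reference_time_s: float = 600.0) -> int:
    # Normalize time against a reference duration, blend the three metrics,
    # and map the blended score onto discrete levels.
    time_score = min(reference_time_s / max(op_time_s, 1.0), 1.0)
    score = 0.2 * time_score + 0.5 * accuracy + 0.3 * completion
    if score >= 0.85:
        return 3
    if score >= 0.7:
        return 2
    return 1

print(suggest_level(op_time_s=540, accuracy=0.9, completion=0.95))   # -> 3
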
9. The method for assisting in training and evaluating an interventional operation robot according to claim 1, wherein providing the corresponding comprehensive evaluation content according to the level to be authenticated by the user and guiding the user to perform the comprehensive evaluation operation specifically comprises:
forming a question bank including operation content of various difficulty levels for various target interventional operation robots and various chapters of the target interventional operation types;
and, according to the level to be authenticated by the user, randomly extracting from the question bank the operation content of the corresponding difficulty level for some of the chapters as the corresponding comprehensive evaluation content, and guiding the user to perform the comprehensive evaluation operation.
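Claim 9's question bank and random extraction could be indexed and sampled as in the sketch below; the (robot, chapter, difficulty) keying and the number of chapters drawn are assumptions.

import random
from typing import Dict, List, Tuple

QuestionBank = Dict[Tuple[str, str, int], List[str]]   # (robot, chapter, difficulty) -> items

def draw_evaluation(bank: QuestionBank, robot: str, difficulty: int, n_chapters: int = 2) -> List[str]:
    # Pick a random subset of chapters at the requested difficulty for this robot
    # and return their operation content as the comprehensive evaluation.
    chapters = [ch for (r, ch, d) in bank if r == robot and d == difficulty]
    picked = random.sample(chapters, k=min(n_chapters, len(chapters)))
    return [item for ch in picked for item in bank[(robot, ch, difficulty)]]

bank = {("robot-A", "guidewire basics", 1): ["advance to target vessel"],
        ("robot-A", "stent deployment", 1): ["position and deploy stent"]}
print(draw_evaluation(bank, "robot-A", difficulty=1))
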
10. An auxiliary training and evaluation device for an interventional surgical robot, the auxiliary training and evaluation device being used cooperatively with a user control part and the interventional surgical robot, the auxiliary training and evaluation device comprising a processor configured to:
receive user identification information, a target interventional operation robot and a target interventional operation type input by a user;
prompt the user to select a training mode when adaptation to the target interventional operation robot has been completed;
in response to the training mode selected by the user, present corresponding training content according to the target interventional operation robot and the target interventional operation type, and guide the user to perform the training operation;
acquire the user's training operation data, and provide training feedback information to the user according to the training operation data;
receive a level to be authenticated input by the user;
provide corresponding comprehensive evaluation content according to the level to be authenticated by the user, and guide the user to perform the comprehensive evaluation operation;
authenticate the level of the user according to the user's comprehensive evaluation operation data;
and enable the operation functions of the target interventional operation robot at the corresponding level according to the authenticated level of the user.
CN202311475395.6A 2023-11-08 2023-11-08 Auxiliary training and evaluating method and device for interventional operation robot Active CN117218922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311475395.6A CN117218922B (en) 2023-11-08 2023-11-08 Auxiliary training and evaluating method and device for interventional operation robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311475395.6A CN117218922B (en) 2023-11-08 2023-11-08 Auxiliary training and evaluating method and device for interventional operation robot

Publications (2)

Publication Number Publication Date
CN117218922A true CN117218922A (en) 2023-12-12
CN117218922B CN117218922B (en) 2024-02-02

Family

ID=89051429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311475395.6A Active CN117218922B (en) 2023-11-08 2023-11-08 Auxiliary training and evaluating method and device for interventional operation robot

Country Status (1)

Country Link
CN (1) CN117218922B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140057996A (en) * 2012-11-05 2014-05-14 한국과학기술원 Simulator for training needle interventional operation and interface apparatus for the same
US20190156690A1 (en) * 2017-11-20 2019-05-23 Medical Realities Limited Virtual reality system for surgical training
CN112015574A (en) * 2020-08-27 2020-12-01 北京博医时代教育科技有限公司 Remote medical education training method, device, equipment and storage medium
US11315438B1 (en) * 2016-06-24 2022-04-26 Verily Life Sciences Llc Surgical training systems and methods
CN114898626A (en) * 2022-06-29 2022-08-12 华中科技大学同济医学院附属同济医院 Laparoscopic surgery teaching training platform based on 3D printing technology and use method thereof
CN115331531A (en) * 2021-12-02 2022-11-11 上海市第六人民医院 Teaching device and method for full-true simulation arthroscopic surgery

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王亚冰; 焦力群; 谌燕飞; 马妍; 吉训明; 朱凤水; 李慎茂: "Application of a vascular interventional surgery simulation training system in neurointerventional teaching and training for ischemic cerebrovascular disease", 转化医学杂志, no. 04 *

Also Published As

Publication number Publication date
CN117218922B (en) 2024-02-02

Similar Documents

Publication Publication Date Title
Dickison et al. Integrating the National Council of State Boards of Nursing clinical judgment model into nursing educational frameworks
Lee et al. Gap between college education and clinical practice: Experience of newly graduated nurses
CN111144191B (en) Font identification method, font identification device, electronic equipment and storage medium
US20110165550A1 (en) Management system for online test assessment and method thereof
Havrilla et al. Exploring predictors of NCLEX-RN success: One school's search for excellence
US20160314702A1 (en) Individually customized online learning system
US20110172499A1 (en) Remote patient management system adapted for generating an assessment content element
Pérez-Rosas et al. Building a motivational interviewing dataset
WO2004001665A3 (en) Method and system for detecting and analyzing clinical pictures and the causes thereof and for determining proposals for appropriate therapy
JP5686183B2 (en) Questioning apparatus and questioning method
CN117218922B (en) Auxiliary training and evaluating method and device for interventional operation robot
Storkel Minimal, maximal, or multiple: Which contrastive intervention approach to use with children with speech sound disorders?
KR100532225B1 (en) A study apparatus using network and the method thereof
CN112017748A (en) Rehabilitation training system, method, equipment and storage medium for language disorder patient
CN117407682A (en) Medical model evaluation method, device, electronic equipment and storage medium
KR100299656B1 (en) Driving license examination system and driving license training system applied a computer tereminal and method of an advertisement using said training system and compact disc recorded said training program
Feuerle Testing interpreters: Developing, administering, and scoring court interpreter certification exams
Zahid et al. Surgical supervisor feedback affects performance: a blinded randomized study
Vargas et al. Trainee operative autonomy in plastic surgery
JP6930754B2 (en) Learning support device and questioning method
Capogna et al. Compuflo®‐Assisted Training vs Conventional Training for the Identification of the Ligamentum Flavum with an Epidural Simulator: A Brief Report
CN114860772A (en) Test paper generation method and device, electronic equipment and storage medium
CN117242507A (en) Guiding method and device for improving reading and writing capabilities
JP2008026583A (en) Adaptive test system and its method
DeMark et al. Using statistical natural language processing for understanding complex responses to free-response tasks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant