WO2020213484A1 - Surgical evaluation system - Google Patents

Surgical evaluation system

Info

Publication number
WO2020213484A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
model
markers
calculation unit
evaluation system
Prior art date
Application number
PCT/JP2020/015709
Other languages
English (en)
Japanese (ja)
Inventor
山田 敏之
貴士 光部
大造 林田
松本 直樹
伊藤 正博
Original Assignee
学校法人慶應義塾
Jsr株式会社
Priority date
Filing date
Publication date
Application filed by 学校法人慶應義塾, Jsr株式会社 filed Critical 学校法人慶應義塾
Priority to JP2021514899A priority Critical patent/JPWO2020213484A1/ja
Publication of WO2020213484A1 publication Critical patent/WO2020213484A1/fr

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 - Teaching not covered by other main groups of this subclass
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 - Simulators for teaching or training purposes

Definitions

  • An embodiment of the present invention relates to a surgical evaluation system.
  • evaluation by a skilled doctor requires labor and is difficult to quantify because it reflects the evaluating doctor's own subjectivity. Evaluation by multiple people may be performed to reduce the influence of subjectivity, but the labor involved makes it difficult to carry out continuously.
  • An object of the present invention is to provide a surgical evaluation system capable of quantitatively evaluating surgical skills.
  • the surgical evaluation system 1 includes a surgical instrument detection unit 12 and a calculation unit 201.
  • the surgical instrument detection unit 12 detects, over time, the positions of the surgical instrument markers 40A and 41A installed on the medical surgical instruments 40 and 41 used by the user.
  • the calculation unit 201 calculates motion data representing the motions of the medical surgical instruments 40 and 41 based on the changes over time in the positions of the surgical instrument markers 40A and 41A.
  • FIG. 1 is a diagram showing an example of a schematic configuration of a surgical training system according to an embodiment.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the information processing apparatus according to the embodiment.
  • FIG. 3 is a diagram showing an example of an instrument for surgical training according to the embodiment.
  • FIG. 4 is a diagram showing an example of an instrument for surgical training according to the embodiment.
  • FIG. 5 is a diagram showing an example of a biological model according to the embodiment.
  • FIG. 6 is a diagram showing an example of a biological model according to the embodiment.
  • FIG. 7 is a diagram showing an example of processing of the calculation unit according to the embodiment.
  • FIG. 8 is a diagram showing an example of processing of the calculation unit according to the embodiment.
  • FIG. 9 is a diagram showing an example of processing of the calculation unit according to the embodiment.
  • FIG. 10 is a diagram showing an example of processing of the evaluation unit according to the embodiment.
  • FIG. 11 is a flowchart showing an operation example of the surgical training system according to the embodiment.
  • FIG. 12 is a diagram showing an example of an instrument for surgical training according to the second embodiment.
  • FIG. 13 is a diagram showing an example of comparison results by the evaluation unit according to the second embodiment.
  • FIG. 14 is a flowchart showing an operation example of the surgical training system according to the second embodiment.
  • FIG. 15 is a diagram showing an example of a medical surgical instrument according to another embodiment.
  • FIG. 1 is a diagram showing an example of a schematic configuration of the surgical training system 1 according to the embodiment.
  • the surgical training system 1 includes a model sensor 11, a surgical instrument sensor 12, and an information processing device 20.
  • Each device illustrated in FIG. 1 can communicate with the others, directly or indirectly, over a network such as a LAN (Local Area Network) or a WAN (Wide Area Network).
  • the surgical training system 1 is an example of a surgical evaluation system.
  • the model sensor 11 is a sensor that detects the positions of the model markers 60 to 68 arranged on the biological model 50 used for surgical training over time.
  • the model sensor 11 is a camera and detects an AR marker for an image recognition type AR (Augmented Reality) system.
  • the model sensor 11 is an example of a model detection unit.
  • the model sensor 11, the biological model 50, and the model markers 60 to 68 will be described later.
  • the surgical tool sensor 12 is a sensor that detects the positions of the surgical tool markers 40A and 41A installed on the medical surgical tools 40 and 41 used for surgical training over time.
  • the surgical instrument sensor 12 is a camera and detects an AR marker for an image recognition type AR system.
  • the surgical tool sensor 12 is an example of a surgical tool detection unit.
  • the surgical instrument sensor 12, the medical surgical instrument 40, 41, and the surgical instrument marker 40A, 41A will be described later.
  • the model sensor 11 and the surgical tool sensor 12 can be any sensors, such as magnetic sensors, as long as they can detect the position information of the biological model 50 or the medical surgical tools 40 and 41.
  • the information processing device 20 is, for example, a computer such as a personal computer or a workstation, and analyzes a user's technique in surgical training based on the position information detected by the model sensor 11 and the surgical instrument sensor 12.
  • the information processing device 20 includes a calculation unit 201 and an evaluation unit 202.
  • the functions of the information processing device 20 are not limited to the calculation unit 201 and the evaluation unit 202.
  • the calculation unit 201 and the evaluation unit 202 will be described later.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 20 according to the embodiment.
  • the information processing device 20 includes a CPU (Central Processing Unit) 21, a ROM (Read Only Memory) 22, a RAM (Random Access Memory) 23, an auxiliary storage device 24, an input device 25, a display device 26, and an external I/F (Interface) 27.
  • the CPU 21 is a processor (processing circuit) that comprehensively controls the operation of the information processing device 20 by executing a program and realizes various functions of the information processing device 20. Various functions of the information processing device 20 will be described later.
  • the ROM 22 is a non-volatile memory and stores various data (information written at the manufacturing stage of the information processing device 20) including a program for activating the information processing device 20.
  • the RAM 23 is a volatile memory having a working area of the CPU 21.
  • the auxiliary storage device 24 stores various data such as a program executed by the CPU 21.
  • the auxiliary storage device 24 is composed of, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or the like.
  • the input device 25 is a device for an operator using the information processing device 20 to perform various operations.
  • the input device 25 is composed of, for example, a mouse, a keyboard, a touch panel, or hardware keys.
  • the operator corresponds to, for example, a medical person such as a doctor.
  • the display device 26 displays various information.
  • the display device 26 displays image data, model data, a GUI (Graphical User Interface) for receiving various operations from an operator, a medical image, and the like.
  • the display device 26 is composed of, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, or a cathode ray tube display.
  • the input device 25 and the display device 26 may be integrally configured, for example, in the form of a touch panel.
  • the external I/F 27 is an interface for communicating with an external device such as the medical image diagnostic device 10 or a modeling device.
  • FIGS. 3 and 4 are diagrams showing an example of an instrument for surgical training according to the embodiment.
  • FIG. 3 shows a side view of the housing 30 included in the surgical training set
  • FIG. 4 shows a front view of the housing 30.
  • the configurations shown in FIGS. 3 and 4 are merely examples, and various structures and their dimensions can be arbitrarily changed as long as they do not affect the functions according to the present embodiment.
  • the housing 30 is formed of a bottom plate 31, a front plate 32, a rear plate 33, and an upper plate 34.
  • the front plate 32 is provided with holes 32A through which a user (for example, a doctor) performing surgical training can visually recognize the inside of the housing 30 and insert the medical surgical instruments 40, 41 (forceps). Further, the front plate 32 is fixed at an arbitrary angle so that the user can easily see the inside of the housing 30.
  • a side wall may be provided on the side surface of the housing 30.
  • the housing 30 includes a model sensor 11, a support member 35, and a biological model 50.
  • the model sensor 11 is installed at a position where the back surface of the biological model 50 can be photographed (for example, inside the rear plate 33). Further, it is preferable that the model sensor 11 is installed so that the line-of-sight direction of the camera is perpendicular to the biological model 50.
  • the support member 35 is a member for supporting the biological model 50.
  • the biological model 50 is an alternative structure that imitates a part of the living body to be operated on, such as skin and organs.
  • FIGS. 5 and 6 are diagrams showing an example of the biological model 50 according to the embodiment.
  • FIG. 5 shows the front surface of the biological model 50 for suturing training
  • FIG. 6 shows the back surface of the biological model 50.
  • the front surface of the biological model 50 is a surface that the user visually recognizes during training
  • the back surface is a surface opposite to the front surface.
  • the configurations shown in FIGS. 5 and 6 are merely examples, and the types, shapes, dimensions, etc. of the biological model 50 and the model markers 60 to 68 can be arbitrarily changed as long as they do not affect the functions according to the present embodiment.
  • the biological model 50 is a surface-shaped structure that imitates a part of the heart, and a cruciform wound (cut 50A) is formed.
  • the thickness of the surface-shaped structure that imitates a part of the heart is, for example, 0.1 mm or more and 10 mm or less.
  • the biological model 50 is formed of a soft material such as silicone rubber.
  • a plurality of model markers 60 to 68 are arranged.
  • the model markers 60 to 68 are, for example, AR markers, and are discretely arranged on the surface (back surface) opposite the surface (front surface) on which the surgical training is performed.
  • the user can therefore perform surgical training in an environment close to actual surgery without visually recognizing the model markers 60 to 68.
  • the model marker 60 is a reference marker for reference. The function of the reference marker will be described later.
  • the model markers 60 to 68 may be printed directly on the biological model 50, or may be printed on stickers that are then attached.
  • the surface is an example of the first surface.
  • the back surface is an example of the second surface.
  • Next, the processing by which the surgical training system 1 detects the positions of the various surgical training instruments and provides a training environment for high-quality surgical skills will be described.
  • the model sensor 11 photographs, for example, a plurality of model markers 60 to 68 arranged on the back surface of the biological model 50 over time.
  • the model sensor 11 analyzes the captured image by an image recognition technique, and detects the coordinate information of each model marker 60 to 68 in the image over time.
  • the model sensor 11 outputs the detected coordinate information of each model marker 60 to 68 to the information processing device 20. In this way, the model sensor 11 detects the positions of the plurality of model markers 60 to 68 arranged on the biological model 50.
  • the surgical instrument sensor 12 is arranged at a position where, for example, the medical surgical instruments 40, 41 and the surgical instrument markers 40A, 41A can be photographed, and the surgical instrument markers 40A, 41A are photographed over time.
  • the surgical instrument sensor 12 analyzes the captured image by an image recognition technique, and detects the positions and orientations of the surgical instrument markers 40A and 41A in the image over time.
  • the surgical instrument sensor 12 outputs the positions and orientations of the detected surgical instrument markers 40A and 41A to the information processing device 20. In this way, the surgical instrument sensor 12 detects the positions of the surgical instrument markers 40A and 41A installed on the medical surgical instruments 40 and 41.
  • As the image recognition technique for detecting the coordinate information of the model markers 60 to 68 and the surgical tool markers 40A and 41A, a known technique can be appropriately selected and applied. Further, the analysis by the image recognition technique may be executed inside the information processing device 20.
  • the calculation unit 201 calculates deformation data representing the deformation of the biological model 50 based on the time-dependent changes in the positions of the plurality of model markers 60 to 68. For example, the calculation unit 201 calculates the deformation data based on the difference between the positions of the plurality of model markers 61 to 68 and the positions of the reference markers (model markers 60). That is, the difference from the position of the reference marker can be used to cancel the overall movement component of the biological model 50 in surgical training.
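The reference-marker subtraction described above can be sketched as a short computation. This is a minimal illustration under assumed conventions (2-D image coordinates, NumPy arrays); the function name and array layout are not from the patent:

```python
import numpy as np

def relative_displacements(marker_positions, reference_positions):
    """Express each model marker's position relative to the reference marker.

    marker_positions: (T, N, 2) array of N marker positions over T frames.
    reference_positions: (T, 2) array of the reference marker over T frames.
    Returns a (T, N, 2) array of positions relative to the reference marker.
    """
    marker_positions = np.asarray(marker_positions, dtype=float)
    reference_positions = np.asarray(reference_positions, dtype=float)
    # Subtracting the reference marker cancels rigid motion of the whole
    # model, so it does not appear as apparent deformation.
    return marker_positions - reference_positions[:, None, :]
```

With this, a translation of the entire model moves every marker and the reference marker equally, and the relative positions stay constant.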
  • the reference marker is preferably placed at a position that is not easily affected by the deformation of the biological model caused by the surgical training.
  • the calculation unit 201 calculates, as deformation data, at least one of the strain of the model markers 61 to 68, the acceleration of the strain, the jerk of the strain, the direction of the strain, the positional relationship among the model markers 61 to 68, and the shapes of the model markers 61 to 68.
  • the "strain" corresponds to the amount of movement (displacement) of the model markers 61 to 68 from the origin (initial position).
  • FIGS. 7 to 9 are diagrams showing an example of processing of the calculation unit 201 according to the embodiment.
  • the marker position 61' and the marker position 63' are the positions of the model marker 61 and the model marker 63 after t seconds, respectively.
  • the "position after t seconds" is, for example, the position in the image of the immediately following frame, but is not limited to this, and may be the position in the image several frames later.
  • the calculation unit 201 calculates the distance between the model marker 61 and the marker position 61' as the "strain". Then, the calculation unit 201 calculates the time derivative of the strain as the "velocity". Further, the calculation unit 201 calculates the time derivative of the velocity as the "acceleration". Further, the calculation unit 201 calculates the time derivative of the acceleration as the "jerk". Further, the calculation unit 201 calculates the moving direction of the model marker 61 (the right direction in the drawing) as the "direction of strain".
  • the calculation unit 201 calculates the distance between the model marker 61 and the model marker 63 as the "positional relationship" between the model marker 61 and the model marker 63. This positional relationship changes to the distance between the marker position 61' and the marker position 63' after t seconds.
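The chain of time derivatives (strain, then velocity, acceleration, and jerk) can be approximated with finite differences over the sampled marker positions. A minimal sketch, assuming 2-D positions at a fixed sampling interval; the function name and array layout are illustrative, not from the patent:

```python
import numpy as np

def deformation_series(positions, origin, dt):
    """Compute strain and its successive time derivatives for one marker.

    positions: (T, 2) marker positions over time.
    origin: the marker's initial position.
    dt: sampling interval in seconds.
    Returns (strain, velocity, acceleration, jerk), each of length T.
    """
    positions = np.asarray(positions, dtype=float)
    # Strain: displacement magnitude from the initial position.
    strain = np.linalg.norm(positions - np.asarray(origin, dtype=float), axis=1)
    # Successive finite-difference time derivatives.
    velocity = np.gradient(strain, dt)
    acceleration = np.gradient(velocity, dt)
    jerk = np.gradient(acceleration, dt)
    return strain, velocity, acceleration, jerk
```

For a marker moving at constant speed, the computed velocity is constant and the acceleration and jerk are zero, as expected.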
  • the calculation unit 201 calculates the strain in the depth direction of the model markers 60 to 68 based on the change in their enlargement ratio. For example, as shown in FIG. 8, when the model marker 61 appears enlarged after t seconds, the calculation unit 201 calculates the strain in the depth direction according to the change in the enlargement ratio. In this case, the correspondence between the depth-direction strain of the model markers 60 to 68 and the amount of change in the enlargement ratio is measured in advance.
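The pre-measured correspondence between enlargement ratio and depth-direction strain can be applied by simple interpolation. The calibration values below are invented placeholders for illustration, not measurements from the patent:

```python
import numpy as np

# Hypothetical calibration table measured in advance: a marker's
# enlargement ratio vs. its displacement toward the camera (mm).
CALIB_RATIO = np.array([1.0, 1.1, 1.25, 1.5])
CALIB_DEPTH_MM = np.array([0.0, 2.0, 5.0, 10.0])

def depth_strain(apparent_size, initial_size):
    """Convert a marker's change in apparent size into depth-direction
    strain by interpolating the pre-measured correspondence."""
    ratio = apparent_size / initial_size
    return float(np.interp(ratio, CALIB_RATIO, CALIB_DEPTH_MM))
```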
  • the calculation unit 201 calculates the shapes of the model markers 60 to 68. As shown in FIG. 9, when the model marker 61 is crushed in the vertical direction after t seconds, it can be seen that the model marker 61 receives external pressure from the vertical direction. In this case, the calculation unit 201 calculates the rate of change of the model marker 61 in the vertical direction as the "shape" of the model marker 61.
  • the "shape" is not limited to the "crushing" of the model marker; the "elongation" of the model marker can also be used.
  • the calculation unit 201 also calculates, as deformation data, the number of times at least one of the strain, the acceleration of the strain, the jerk of the strain, the direction of the strain, the positional relationship of the model markers 61 to 68, and the shapes of the model markers 61 to 68 satisfies a predetermined condition. For example, the calculation unit 201 calculates the "number of sudden accelerations" by counting the number of times the acceleration exceeds a predetermined threshold during surgical training.
  • the calculation unit 201 can likewise calculate the number of times a threshold is exceeded and/or the number of times a value falls below a threshold for the strain, the velocity of the strain, the jerk of the strain, and the positional relationship of the model markers 61 to 68. Further, the "number of times a predetermined condition is satisfied" is not limited to comparison with a threshold; a condition on a change in shape, such as "the number of times a model marker is crushed in the vertical direction", can also be set.
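Counting how often the acceleration exceeds a threshold (the "number of sudden accelerations") can be sketched as follows. Counting each upward crossing once, rather than every sample above the threshold, is one reasonable reading of the text and is assumed here:

```python
import numpy as np

def count_sudden_accelerations(acceleration, threshold):
    """Count how many times the acceleration series rises above the
    threshold; each upward crossing counts once."""
    acc = np.asarray(acceleration, dtype=float)
    above = acc > threshold
    # A "sudden acceleration" starts at a sample that exceeds the
    # threshold while the previous sample did not.
    starts = above & ~np.concatenate(([False], above[:-1]))
    return int(starts.sum())
```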
  • the calculation unit 201 calculates motion data representing the motions of the medical surgical instruments 40 and 41 based on the changes over time in the positions of the surgical instrument markers 40A and 41A. For example, the positional relationship between the medical surgical instrument 40 and the surgical instrument marker 40A is known. Therefore, the calculation unit 201 converts the position and orientation of the surgical instrument marker 40A into the position and orientation of the medical surgical instrument 40 based on this positional relationship. Similarly, the calculation unit 201 converts the position and orientation of the surgical instrument marker 41A into the position and orientation of the medical surgical instrument 41.
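The conversion from a detected marker pose to the instrument pose via the known marker-to-instrument relationship can be sketched as a rigid transform. The 3-D representation, the fixed tip offset, and the identity mounting rotation are illustrative assumptions, not details from the patent:

```python
import numpy as np

def marker_to_tool_pose(marker_position, marker_rotation, tool_offset):
    """Convert a detected marker pose into the instrument-tip pose using
    the known, fixed transform between marker and instrument.

    marker_position: (3,) marker position in camera coordinates.
    marker_rotation: (3, 3) rotation matrix of the marker.
    tool_offset: (3,) tip position expressed in the marker's frame
                 (known from how the marker is mounted on the instrument).
    """
    marker_position = np.asarray(marker_position, dtype=float)
    marker_rotation = np.asarray(marker_rotation, dtype=float)
    tip = marker_position + marker_rotation @ np.asarray(tool_offset, dtype=float)
    # The instrument orientation equals the marker orientation up to the
    # fixed mounting rotation (identity assumed here for simplicity).
    return tip, marker_rotation
```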
  • the calculation unit 201 calculates, as motion data, at least one of the movement amount of the medical surgical instruments 40 and 41, the acceleration of the movement amount, the jerk of the movement amount, the direction of movement, and the change in orientation of the medical surgical instruments 40 and 41.
  • the calculation unit 201 calculates the deformation data of the biological model 50 and the operation data of the medical instruments 40 and 41, respectively. Then, the calculation unit 201 outputs the calculated deformation data and operation data to the evaluation unit 202.
  • the calculation unit 201 can calculate various parameters other than the above parameters.
  • the calculation unit 201 may calculate the start time and the end time of the surgical training by using a predetermined operation as a trigger.
  • the calculation unit 201 calculates the time when the two medical surgical instruments 40 and 41 enter the housing 30 through the holes 32A as the "start time", and the time when the two medical surgical instruments 40 and 41 exit the housing 30 through the holes 32A as the "end time".
  • the calculated start time and end time can be used for specifying the range of the image to be evaluated, measuring the time required for the procedure, and the like.
  • In the above description, each parameter is calculated for each of the model markers 61 to 68, but the embodiment is not limited to this.
  • the calculation unit 201 may set virtual points such as the midpoints of the markers 61 to 68 for each model, and calculate each of the above-mentioned parameters for the set virtual points.
  • the evaluation unit 202 evaluates the surgical skill of the user who executes the surgical training based on the deformation data and the motion data. For example, the evaluation unit 202 evaluates the user's surgical skill for each determination item set according to the surgical scenario to be evaluated in the surgical training.
  • FIG. 10 is a diagram showing an example of processing of the evaluation unit 202 according to the embodiment.
  • the evaluation unit 202 evaluates the "tissue protection", "right hand forceps", and "left hand forceps" of the user's procedure on a five-grade scale from "A" to "E" for each of the two determination items "horizontal direction" and "vertical direction". The highest evaluation value is "A", and the evaluation value decreases in the order of "B", "C", "D", and "E".
  • "horizontal direction" is a determination item related to horizontal suturing in the surgical scenario "suture of a cruciform wound".
  • "vertical direction" is a determination item related to vertical suturing in the surgical scenario "suture of a cruciform wound".
  • tissue protection is an evaluation result indicating the lightness of the load applied to the living tissue of the patient by the procedure.
  • the "right hand forceps” is an evaluation result showing the smoothness of the movement of the forceps (medical surgical tool 40) operated by the user's right hand.
  • the “left hand forceps” is an evaluation result showing the stability of the movement of the forceps (medical surgical tool 41) operated by the user's left hand.
  • the evaluation unit 202 calculates the evaluation value of "tissue protection" so that the smaller the total value of the strains of the model markers 61 to 68, the higher the evaluation value. This is because the smaller the strain, the lighter the load on the patient's living tissue is considered to be.
  • the evaluation unit 202 calculates the evaluation value of the "right hand forceps" so that the smoother the trajectory of the medical surgical instrument 40 and the more constant its moving speed, the higher the evaluation value. This is based on the finding that a skilled doctor moves the right-hand forceps smoothly, whereas an inexperienced doctor does not.
  • the evaluation unit 202 calculates the evaluation value of the "left hand forceps" so that the smaller the deviation in the position and orientation of the medical surgical instrument 41, the higher the evaluation value. This is based on the finding that a skilled doctor shows less blurring (positional change) of the left-hand forceps, whereas an inexperienced doctor shows larger blurring.
  • the evaluation unit 202 evaluates the user's surgical skill based on the deformation data and the motion data. Then, the evaluation unit 202 displays the evaluation result on the display device 26 and stores it in various storage devices.
  • the evaluation unit 202 can calculate the evaluation value by appropriately selecting or combining various parameters calculated by the calculation unit 201.
  • the evaluation unit 202 may calculate the evaluation value of "tissue protection" so that the smaller the number of sudden accelerations of the model markers 61 to 68, the higher the evaluation value.
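A mapping from a deformation statistic to the five-grade scale could look like the following sketch. The threshold values are invented placeholders, since the patent does not specify how the grades are bounded:

```python
def tissue_protection_grade(total_strain, thresholds=(10.0, 20.0, 40.0, 80.0)):
    """Map the total strain of the model markers to a five-grade score:
    the smaller the total strain, the higher the grade. The threshold
    values are illustrative placeholders, not values from the patent."""
    for grade, limit in zip("ABCD", thresholds):
        if total_strain <= limit:
            return grade
    return "E"
```

The same shape of mapping could be applied to other parameters, such as the number of sudden accelerations.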
  • FIG. 11 is a flowchart showing an operation example of the surgical training system 1 according to the embodiment. Since the specific contents of each step are as described above, detailed description thereof will be omitted as appropriate.
  • the model sensor 11 and the surgical tool sensor 12 detect the positions of the model markers 60 to 68 and the positions of the surgical tool markers 40A and 41A over time (step S101). Subsequently, the calculation unit 201 calculates the deformation data of the biological model 50 based on the change over time in the positions of the model markers 60 to 68 (step S102). Further, the calculation unit 201 calculates the operation data of the medical surgical instruments 40 and 41 based on the time-dependent changes in the positions of the surgical instrument markers 40A and 41A (step S103). Then, the evaluation unit 202 evaluates the user's surgical skill based on the deformation data and the motion data (step S104). Then, the evaluation unit 202 outputs the evaluation result (step S105), and ends the process.
  • processing procedure shown in FIG. 11 is merely an example, and is not limited to the contents shown in the figure.
  • other processing procedures can be added (inserted) or the order of each processing can be changed as long as there is no contradiction in the processing contents.
  • the model sensor 11 detects the position of the model marker placed on the biological model used for the surgical training over time.
  • the calculation unit 201 calculates deformation data representing the deformation of the biological model 50 based on the change over time in the position of the model marker. According to this, the surgical training system 1 can provide a training environment for high-quality surgical skills.
  • Since the surgical training is performed using the biological model 50, the user can obtain the tactile feeling associated with the procedure.
  • Since the procedure using the biological model 50 is automatically detected and physical quantities are calculated and evaluated from the detected position information, a highly quantitative evaluation can be performed.
  • (Modification 1 of the first embodiment) In the first embodiment, the case where two cameras (the model sensor 11 and the surgical instrument sensor 12) are used has been illustrated, but the embodiment is not limited to this. For example, there may be only one camera. In that case, for example, the model markers 60 to 68 are discretely arranged on the surface (front surface) on which the surgical training is performed. Then, the surgical tool markers 40A and 41A and the model markers 60 to 68 are detected over time by the single camera. The method of analyzing each detected marker is the same as in the first embodiment.
  • the device configuration can be simplified and the manufacturing cost can be suppressed.
  • FIG. 12 is a diagram showing an example of an instrument for surgical training according to the second embodiment.
  • FIG. 12 shows a side view of the housing 30 included in the surgical training set.
  • the configuration shown in FIG. 12 is merely an example, and various structures and their dimensions can be arbitrarily changed as long as they do not affect the functions according to the present embodiment.
  • the surgical training instrument shown in FIG. 12 is basically the same as the surgical training instrument shown in FIG. 3, except that the camera 13 is provided.
  • the camera 13 is a camera for capturing an image of a target site for surgery.
  • the camera 13 provides an image corresponding to an image from an endoscope in an actual operation. That is, the user can perform the training in the same environment as the endoscopic surgery by performing the procedure while viewing the image captured by the camera 13.
  • FIG. 12 also shows the surgical instrument sensor 12, in order to clarify that the surgical instrument sensor 12 is arranged at a position where the positions of the surgical instrument markers 40A and 41A can be detected.
  • the configuration of the surgical instrument sensor 12 is the same as the configuration of the surgical instrument sensor 12 shown in FIG. That is, the surgical instrument sensor 12 detects the position of the surgical instrument marker installed on the medical instrument used by the user over time.
  • the model sensor 11 shown in FIG. 12 does not have to be installed.
  • the information processing device 20 includes a calculation unit 201 and an evaluation unit 202.
  • the calculation unit 201 can execute the same processing as the calculation unit 201 shown in FIG. 1, but it is not necessary to execute the processing for calculating the deformation data.
  • the evaluation unit 202 can execute the same process as the evaluation unit 202 shown in FIG. 1, but it is not necessary to execute the process of evaluating based on the deformation data.
  • the calculation unit 201 calculates motion data representing the motions of the medical instruments 40 and 41 based on the changes over time in the positions of the markers 40A and 41A for the surgical instruments. For example, the calculation unit 201 calculates statistical values representing changes over time in physical quantities with respect to the positions of the surgical instrument markers 40A and 41A as motion data.
  • the calculation unit 201 calculates at least one of physical quantities such as "elapsed time", "total distance per unit time", "forceps rotation", "forceps hand position", "forceps direction", "velocity", "velocity vector", "acceleration", and "acceleration vector".
  • the “elapsed time” represents the time elapsed since the procedure was started.
  • the “total distance per unit time” represents the distance that the center position (or marker position) of the forceps (medical instruments 40, 41) has moved at an arbitrary time interval (for example, sampling interval).
  • “Forceps rotation” represents a position (rotation angle) of the forceps in the rotation direction. The “rotation” corresponds to the rotation of the medical instruments 40 and 41 with the main axis as the rotation axis.
  • the “forceps hand position” represents the position (coordinates) of the forceps handle.
  • “Forceps direction” represents the direction of forceps.
  • “Velocity” represents the time derivative of the change in the center position (or marker position) of the forceps (scalar quantity).
  • “Velocity vector” represents the velocity and its direction (vector quantity).
  • “Acceleration” represents the time derivative of the velocity (scalar quantity).
  • “Acceleration vector” represents the acceleration and its direction (vector quantity).
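The physical quantities above can be derived from the sampled marker positions by finite differences. The following is a minimal illustrative sketch, not part of the original disclosure; the function name, the sampling-interval argument `dt`, and the returned structure are all assumptions:

```python
import numpy as np

def motion_quantities(positions, dt):
    """Per-sample physical quantities from a marker position time series.

    positions: (N, 3) sequence of marker coordinates sampled every dt seconds.
    Returns the total moved distance, the speed (scalar), the velocity
    vectors, the acceleration (scalar), and the acceleration vectors.
    """
    positions = np.asarray(positions, dtype=float)
    steps = np.diff(positions, axis=0)               # displacement per interval
    dist_per_interval = np.linalg.norm(steps, axis=1)
    velocity_vec = steps / dt                        # vector quantity
    speed = dist_per_interval / dt                   # scalar quantity
    accel_vec = np.diff(velocity_vec, axis=0) / dt   # vector quantity
    accel = np.linalg.norm(accel_vec, axis=1)        # scalar quantity
    return {
        "total_distance": dist_per_interval.sum(),
        "speed": speed,
        "velocity_vec": velocity_vec,
        "accel": accel,
        "accel_vec": accel_vec,
    }
```

A marker moving 1 unit per sample in a straight line, for example, yields a constant speed and zero acceleration.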
  • the calculation unit 201 calculates a statistical value representing a change over time in the physical quantity as motion data.
  • the calculation unit 201 calculates statistical values such as an average value, a variance value, and a moving variance value for each of the above physical quantities.
  • the calculation unit 201 may calculate the statistical value after performing a predetermined calculation (time differentiation, integration, etc.) on each of the above physical quantities.
  • the calculation unit 201 calculates, as motion data, at least one of the following statistical values: the standard deviation of the differential values of the rotation angles (forceps rotation) of the medical instruments 40 and 41, the average of the standard deviations of the hand positions (forceps hand positions) of the medical instruments 40 and 41, and the standard deviation of those standard deviations. Further, the calculation unit 201 calculates the average acceleration of the medical instruments 40 and 41 as motion data.
  • the calculation unit 201 can calculate not only the above-mentioned statistical values but also arbitrary statistical values as motion data.
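The mean, variance, and moving variance mentioned above can be sketched as follows. This is an illustrative implementation, not part of the disclosure; the function name and window size are assumptions:

```python
import numpy as np

def summarize(x, window=5):
    """Mean, variance, and moving variance of a physical-quantity series.

    window: number of consecutive samples used for each moving-variance
    value (an assumed default, not specified in the disclosure).
    """
    x = np.asarray(x, dtype=float)
    moving_var = np.array([
        x[i:i + window].var() for i in range(len(x) - window + 1)
    ])
    return {"mean": x.mean(), "var": x.var(), "moving_var": moving_var}
```

A constant series, for instance, gives a zero variance and all-zero moving variances.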
  • the evaluation unit 202 evaluates the user's surgical skill based on the motion data. For example, the evaluation unit 202 evaluates at least one of the following evaluation values (items) of the user's surgical skill: “overall efficiency/impression”, “force sensitivity (delicateness)”, “left-right coordination”, “tissue handling”, “thread management”, “needle entry angle”, and “needle pulling action”.
  • “Overall efficiency/impression” represents the efficiency and impression of movement throughout the procedure.
  • Force sensitivity (delicateness) represents the delicacy of the procedure.
  • Left-right coordination represents the coordination of the left and right medical instruments 40 and 41.
  • “Tissue handling” represents the smooth handling of living tissue (or the biological model 50).
  • Thread management represents the appropriateness of suture management.
  • the “needle entry angle” represents the appropriateness of the angle at which the suture needle is inserted.
  • the “needle pulling action” represents the smoothness of the suture needle pulling action.
  • the evaluation unit 202 evaluates the delicacy of the user's procedure as a surgical skill based on three parameters: the standard deviation of the differential value of the forceps rotation, the average of the standard deviations of the forceps hand position, and the standard deviation of those standard deviations. Specifically, the evaluation unit 202 calculates the average of these three parameters, and then calculates the evaluation value “force sensitivity (delicateness)” so that the smaller the calculated average, the higher the value.
  • the evaluation unit 202 evaluates left-right coordination as a surgical skill based on the average acceleration of the forceps. Specifically, the evaluation unit 202 calculates the evaluation value “left-right coordination” so that the smaller the average acceleration of the forceps, the higher the value.
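The two scorings above only require a monotonically decreasing mapping from the underlying parameters to the evaluation value. The sketch below uses an assumed `1 / (1 + x)` mapping; the disclosure does not specify a particular formula, so the function names and the mapping are illustrative:

```python
def force_sensitivity(rot_diff_std, hand_std_mean, hand_std_std):
    """Score delicacy: higher when the average of the three parameters
    (std of forceps-rotation differentials, mean and std of the
    hand-position stds) is smaller."""
    avg = (rot_diff_std + hand_std_mean + hand_std_std) / 3.0
    return 1.0 / (1.0 + avg)   # maps [0, inf) onto (0, 1], decreasing

def left_right_coordination(mean_acceleration):
    """Score coordination: higher when the mean forceps acceleration
    is smaller."""
    return 1.0 / (1.0 + mean_acceleration)
```

Any other decreasing mapping (e.g. a negated z-score) would satisfy the same "smaller parameter, higher value" requirement.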
  • the evaluation unit 202 evaluates the user's surgical skill by calculating the evaluation value based on the motion data.
  • the evaluation value calculated by the evaluation unit 202 may be presented to the user as a numerical value as it is, or may be presented to the user as a determination result determined in several stages based on the magnitude of the numerical value.
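The staged determination can be expressed as a simple threshold mapping onto the three stages used in FIG. 13. The thresholds and the assumed 0-to-1 score range are illustrative, not values from the disclosure:

```python
def to_grade(score, thresholds=(0.7, 0.4)):
    """Map a numeric evaluation value onto three stages, "A" being
    highest. Threshold values are illustrative assumptions."""
    hi, lo = thresholds
    if score >= hi:
        return "A"
    if score >= lo:
        return "B"
    return "C"
```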
  • the evaluation unit 202 evaluates the surgical skills of each of a plurality of users, compares them, and outputs the comparison result. For example, the evaluation unit 202 displays the evaluation results of the surgical skills of the plurality of users side by side.
  • FIG. 13 is a diagram showing an example of the comparison result by the evaluation unit 202 according to the second embodiment.
  • FIG. 13 illustrates a case where surgical skills are evaluated and compared for two doctors, doctor A and doctor B.
  • a case where the surgical skill is evaluated in three stages, “A”, “B”, and “C”, is illustrated. In this case, “A” is the highest evaluation, followed by “B” and then “C”.
  • the evaluation unit 202 displays a list of evaluation results of the surgical skills of doctor A and doctor B.
  • this list includes the evaluation results of doctor A and doctor B for each of the seven items: “overall efficiency/impression”, “force sensitivity (delicateness)”, “left-right coordination”, “tissue handling”, “thread management”, “needle entry angle”, and “needle pulling action”.
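A side-by-side listing like FIG. 13 can be sketched as plain text rows; the item names follow the seven items above, while the function name, column widths, and result layout are assumptions:

```python
ITEMS = [
    "Overall efficiency/impression", "Force sensitivity (delicateness)",
    "Left-right coordination", "Tissue handling", "Thread management",
    "Needle entry angle", "Needle pulling action",
]

def comparison_table(results):
    """Format {doctor_name: {item: grade}} into rows: a header line
    followed by one line per evaluation item."""
    names = list(results)
    rows = ["Item".ljust(34) + "  ".join(n.ljust(10) for n in names)]
    for item in ITEMS:
        rows.append(item.ljust(34) +
                    "  ".join(results[n].get(item, "-").ljust(10)
                              for n in names))
    return rows
```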
  • the processing content of the evaluation unit 202 described above is merely an example, and is not limited to this.
  • the evaluation unit 202 need not evaluate all seven items of “overall efficiency/impression”, “force sensitivity (delicateness)”, “left-right coordination”, “tissue handling”, “thread management”, “needle entry angle”, and “needle pulling action”. The evaluation unit 202 can evaluate any subset of these seven items.
  • the evaluation unit 202 need not use all three parameters (the standard deviation of the differential value of the forceps rotation, the average of the standard deviations of the forceps hand position, and the standard deviation of those standard deviations) to calculate “force sensitivity (delicateness)”. For example, the evaluation unit 202 can calculate “force sensitivity (delicateness)” based on any one or two of these three parameters.
  • the comparison result presented by the evaluation unit 202 is not limited to the list shown in FIG. 13.
  • the comparison result may be presented by a graph or numerical values. Further, the comparison result may be presented as a difference in the evaluation value between the doctor A and the doctor B.
  • the output destination of the evaluation result and the comparison result by the evaluation unit 202 is not limited to the display device 26.
  • the evaluation unit 202 can also store the evaluation result and the comparison result in a storage device (ROM 22, RAM 23, auxiliary storage device 24, etc.), or transfer them to an external device via the external I/F 27 and have them displayed at the transfer destination.
  • FIG. 14 is a flowchart showing an operation example of the surgical training system 1 according to the second embodiment. Since the specific contents of each step are as described above, detailed description thereof will be omitted as appropriate.
  • the surgical instrument sensor 12 detects the positions of the surgical instrument markers 40A and 41A over time (step S201). Subsequently, the calculation unit 201 calculates the motion data of the medical instruments 40 and 41 based on the changes over time in the positions of the surgical instrument markers 40A and 41A (step S202). The evaluation unit 202 then evaluates the user's surgical skill based on the motion data (step S203), outputs the evaluation result (step S204), and ends the process.
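The flow of steps S201 through S204 can be expressed as a simple pipeline. The callable interfaces below are assumptions for illustration, not the disclosed implementation:

```python
def run_evaluation(sensor, calc, evaluator, output):
    """Sketch of the FIG. 14 flow: detect marker positions (S201),
    calculate motion data (S202), evaluate skill (S203), output (S204)."""
    marker_positions = sensor()           # S201: marker position time series
    motion_data = calc(marker_positions)  # S202: motion data
    result = evaluator(motion_data)       # S203: skill evaluation
    output(result)                        # S204: output the result
    return result
```

Each stage can be swapped independently, e.g. replacing `output` with a store-to-disk or transfer-to-external-device step as described above.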
  • the processing procedure shown in FIG. 14 is merely an example, and is not limited to the contents shown in the figure.
  • other processing steps may be added (inserted), or the order of the steps may be changed, as long as the processing contents remain consistent.
  • the surgical instrument sensor 12 detects the position of the surgical instrument marker installed on the medical instrument used by the user over time.
  • the calculation unit 201 calculates motion data representing the motion of the medical instrument based on the change over time in the position of the surgical instrument marker. The surgical training system 1 can thereby evaluate surgical skill quantitatively.
  • when the surgical training system 1 evaluates the surgical skills of each of a plurality of users, it compares those skills and outputs the comparison result. This is expected to foster competition among doctors and motivate them to improve their surgical skills.
  • the surgical training system 1 does not necessarily have to include the evaluation unit 202.
  • the surgical training system 1 may present the physical quantities calculated by the calculation unit 201 to the user as they are. The user can then view these highly quantitative physical quantities and perform a self-evaluation of the surgical training.
  • FIG. 15 is a diagram showing an example of a medical surgical instrument according to another embodiment.
  • FIG. 15 illustrates a medical device provided with a finger hole and a finger hook.
  • the medical instrument 70 has a plurality of surgical instrument markers 81, 82, 83, 84, and 85 in a region 80 of its main shaft.
  • the plurality of surgical instrument markers 81, 82, 83, 84, and 85 are placed at mutually different positions in the rotational direction of the medical instrument 70. Specifically, the markers differ from one another both in their positions in the rotational direction and in their positions in the longitudinal direction of the medical instrument 70.
  • when any one of the plurality of surgical instrument markers 81, 82, 83, 84, and 85 is detected by the surgical instrument sensor 12, the position and orientation of the medical instrument 70 can be detected based on the detected position of that marker and the known positional relationship between the marker and the medical instrument 70.
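One possible reading of this scheme: because each marker sits at a known, unique longitudinal offset and roll angle, the identity of the detected marker reveals the instrument's roll, and the marker position plus the known offset yields a reference point on the shaft. The marker table, its offset values, and the assumption that the shaft direction is available from tracking are all illustrative, not from the disclosure:

```python
import numpy as np

# Illustrative (assumed) marker table: for each marker id, the longitudinal
# offset along the shaft axis [m] and the roll angle at which that marker
# faces the sensor [deg].
MARKERS = {
    81: {"offset": 0.00, "roll_deg": 0.0},
    82: {"offset": 0.01, "roll_deg": 72.0},
    83: {"offset": 0.02, "roll_deg": 144.0},
    84: {"offset": 0.03, "roll_deg": 216.0},
    85: {"offset": 0.04, "roll_deg": 288.0},
}

def instrument_pose(marker_id, marker_pos, shaft_dir):
    """Infer a reference point on the shaft and the roll angle from one
    detected marker, given the (assumed known) unit shaft direction."""
    m = MARKERS[marker_id]
    ref_point = np.asarray(marker_pos, float) \
        - m["offset"] * np.asarray(shaft_dir, float)
    return ref_point, m["roll_deg"]
```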
  • this method of arranging the plurality of surgical instrument markers 81, 82, 83, 84, and 85 is also applicable to medical instruments that are not provided with finger holes or finger hooks.
  • the present invention is not limited to this.
  • the final state (suture state) of the suture applied to the patient (or biological model) may be imaged with a camera or the like and evaluated on the image.
  • the programs executed by the information processing devices of the above-described embodiments may be recorded and provided, as files in an installable or executable format, on a computer-readable recording medium such as a CD-ROM, flexible disk (FD), CD-R, DVD, or USB (Universal Serial Bus) memory, or may be provided or distributed via a network such as the Internet. The various programs may also be provided by being incorporated in advance in a non-volatile storage medium such as a ROM.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Instructional Devices (AREA)

Abstract

According to one embodiment, a surgical evaluation system (1) includes a surgical instrument detection unit (12) and a calculation unit (201). The surgical instrument detection unit (12) detects, over time, the positions of surgical instrument markers (40A, 41A) installed on medical surgical instruments (40, 41) used by a user. The calculation unit (201) calculates motion data representing the motions of the medical surgical instruments (40, 41) based on the changes over time in the positions of the surgical instrument markers (40A, 41A).
PCT/JP2020/015709 2019-04-19 2020-04-07 Système d'évaluation d'intervention chirurgicale WO2020213484A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021514899A JPWO2020213484A1 (fr) 2019-04-19 2020-04-07

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019079830 2019-04-19
JP2019-079830 2019-04-19

Publications (1)

Publication Number Publication Date
WO2020213484A1 true WO2020213484A1 (fr) 2020-10-22

Family

ID=72837843

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/015709 WO2020213484A1 (fr) 2019-04-19 2020-04-07 Système d'évaluation d'intervention chirurgicale

Country Status (2)

Country Link
JP (1) JPWO2020213484A1 (fr)
WO (1) WO2020213484A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003265616A (ja) * 2002-03-13 2003-09-24 Terumo Corp ガイドワイヤ
JP2009236963A (ja) * 2008-03-25 2009-10-15 Panasonic Electric Works Co Ltd 内視鏡手術用トレーニング装置、内視鏡手術用技能評価方法
JP2012081136A (ja) * 2010-10-13 2012-04-26 Toshiba Corp ステントおよびx線診断装置
JP2012521568A (ja) * 2009-03-20 2012-09-13 ジョンズ ホプキンス ユニバーシティ 技術的技能を定量化する方法及びシステム
JP2015196075A (ja) * 2014-04-03 2015-11-09 学校法人産業医科大学 内視鏡用訓練装置および内視鏡用訓練プログラム
JP2016036515A (ja) * 2014-08-07 2016-03-22 株式会社東芝 X線診断装置
WO2019008737A1 (fr) * 2017-07-07 2019-01-10 オリンパス株式会社 Système de formation à l'endoscopie
JP2019510604A (ja) * 2016-03-24 2019-04-18 エヌビュー メディカル インク 画像再構築のためのシステム及び方法

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113017830A (zh) * 2021-02-23 2021-06-25 刘睿 一种基于视频识别的显微外科吻合操作评分系统
CN113223342A (zh) * 2021-05-11 2021-08-06 浙江大学医学院附属邵逸夫医院 一种基于虚拟现实技术的手术仪器操作训练系统及其设备
CN113223342B (zh) * 2021-05-11 2023-06-16 浙江大学医学院附属邵逸夫医院 一种基于虚拟现实技术的手术仪器操作训练系统及其设备
CN113971896A (zh) * 2021-11-17 2022-01-25 苏州大学 一种手术训练系统及训练方法
CN113971896B (zh) * 2021-11-17 2023-11-24 苏州大学 一种手术训练系统及训练方法
WO2023189462A1 (fr) * 2022-03-30 2023-10-05 ソニーグループ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Also Published As

Publication number Publication date
JPWO2020213484A1 (fr) 2020-10-22

Similar Documents

Publication Publication Date Title
WO2020213484A1 (fr) Système d'évaluation d'intervention chirurgicale
US20240115333A1 (en) Surgical system with training or assist functions
KR102523779B1 (ko) 수술 절차 아틀라스를 갖는 수술 시스템의 구성
KR101811888B1 (ko) 연상 능력 검사 및/또는 훈련
JP6169562B2 (ja) サンプルタスク軌跡を分析する、コンピュータで実現される方法、および、サンプルタスク軌跡を分析するシステム
Talasaz et al. Integration of force reflection with tactile sensing for minimally invasive robotics-assisted tumor localization
KR20150004726A (ko) 최소 침습 수술 기량의 평가 또는 개선을 위한 시스템 및 방법
US9332965B2 (en) Method and apparatus for managing and displaying ultrasound image according to an observation operation
AU2019354913A1 (en) Automatic endoscope video augmentation
US20200113636A1 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
US20210307861A1 (en) Communication of detected tissue characterization in a surgical robotic platform
WO2017098503A1 (fr) Gestion de base de données pour chirurgie laparoscopique
WO2017098506A1 (fr) Système autonome d'évaluation et de formation basé sur des objectifs destiné à la chirurgie laparoscopique
JP7323647B2 (ja) 内視鏡検査支援装置、内視鏡検査支援装置の作動方法及びプログラム
JPWO2019139931A5 (fr)
Bihlmaier et al. Learning dynamic spatial relations
EP3977406A1 (fr) Systèmes et procédés d'imagerie médicale composite
US11771508B2 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
Konstantinova et al. Evaluating manual palpation trajectory patterns in tele-manipulation for soft tissue examination
Nicolaou et al. A Study of saccade transition for attention segregation and task strategy in laparoscopic surgery
Konstantinova et al. Force-velocity modulation strategies for soft tissue examination
US11779412B2 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
Dori et al. Speckle tracking technology for quantifying lung sliding
CN113889224B (zh) 手术操作预估模型的训练及手术操作指示方法
Finocchiaro et al. A framework for the evaluation of Human Machine Interfaces of robot-assisted colonoscopy

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20792169

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021514899

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20792169

Country of ref document: EP

Kind code of ref document: A1