EP2401713A2 - Method and apparatus for surgical training - Google Patents

Method and apparatus for surgical training

Info

Publication number
EP2401713A2
Authority
EP
European Patent Office
Prior art keywords
further step
simulation
anatomical
previous
surgical
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP10709575A
Other languages
German (de)
French (fr)
Inventor
Paolo Fiorini
Debora Botturi
Davide Zerbato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Surgica Robotica SpA
Original Assignee
Surgica Robotica SpA
Application filed by Surgica Robotica SpA
Publication of EP2401713A2

Links

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine
    • G09B23/30: Anatomical models


Abstract

The method for surgical training comprises the steps of introducing in a processing unit (2) clinical data obtained by one or more images detected through suitable medical instrumentation; processing the clinical data so as to obtain through a segmentation process a tridimensional anatomical model; assigning to the tridimensional anatomical model respective biomechanical properties so as to obtain a physical virtual model; prearranging virtual surgical instruments and respective force feedback control means, suitable to simulate and render the tactile perception of a physical interaction between the virtual surgical instruments and the physical anatomical model; simulating through the operation of the control means exploration procedures and/or surgical procedures on the physical anatomical model by means of the virtual surgical instruments.

Description

Title of Invention: METHOD AND APPARATUS FOR SURGICAL TRAINING
Technical Field
[1] The present invention relates to a method and an apparatus for surgical training.
Background Art
[2] Methods for surgical training and related training apparatuses, intended to satisfy at least in part the need for surgical specialisation, have been known for some time. Generally, these methods and apparatuses allow training in specific surgical procedures, for example endoscopy, vascular surgery or hysterectomy, using graphical simulations of generic organs of a virtual patient.
[3] As an example, patent US 5,791,907 illustrates an interactive device for medical training, comprising a processing unit provided with a visualisation unit or display, wherein the processing unit is programmed for teaching medical procedures and training medical staff.
[4] Patent US 5,454,722 in turn illustrates an interactive, processor-based system for training in surgical procedures from remote locations. The system uses visual, sound and textual databases to train students in surgical procedures, running on a personal computer provided with graphical and multimedia resources.
[5] International application WO 2007/027101 describes a method for providing teaching and training relating to medical, veterinary and anatomical procedures. Such procedures are presented to the user through a multimedia interactive platform suitable to provide instruction and training.
[6] Nevertheless, such methods have considerable drawbacks, for example because they only provide generic virtual anatomies as a model. Furthermore, the known methods cannot be applied flexibly to surgical procedures of different types, but have a very restricted range of use, generally limited to minimally invasive procedures. In particular, the known training methods are not suitable for other types of surgical operations, primarily those involving robotic surgery.
[7] Simulator devices that can be used for didactic and training purposes, also for robotic surgery, are further known, namely devices suitable to provide the user with a richer perception of the surgical procedure involved. In practice, such devices set up a simulation environment able to reproduce realistically both the tactile and the visual reactions perceived while carrying out a particular procedure or using a particular surgical instrument. The user is therefore called upon to make decisions in response to specific stimuli that are reproduced each time by the simulator at both the tactile and the visual level.
[8] As an example, patent US 5,828,813 illustrates a force feedback input device with six degrees of freedom, comprising an articulated arm, a wrist and a base provided with encoders and motor members. The input device functions as a master manipulator of a microsurgical teleoperation robot system, comprising a slave manipulator coupled to an amplifier chassis, which is coupled to a control chassis connected to a workstation provided with a graphical interface. The amplifier chassis is coupled to the motor members of the master manipulator, while the control chassis is connected to the encoders of the master robot manipulator. A force feedback can be applied to the input device and can be generated by the slave robot in such a way as to allow the user to operate the slave robot through the input device without physically viewing the slave robot. Furthermore, the force feedback can be generated by the workstation to represent fictitious forces that constrain the control of the slave robot to remain within predetermined imaginary boundaries.
[9] Nevertheless, the known simulators have considerable drawbacks. For example, such devices are generally very bulky apparatuses, and therefore expensive and difficult to transport. Moreover, they can be used by only one user at a time, which further increases training costs. Finally, besides being limited to a reduced number of surgical procedures, with little possibility of adaptation to other uses, the known simulators do not provide a correct tactile perception of the anatomical tissues involved in the simulation.
[10] For the above reasons, hospitals often lack suitable training means for medical staff. To compensate for this lack, surgical training is carried out with the sole aid of the available literature. However, the study of the medical literature requires considerable time, without ensuring adequate preparation for surgical practice.
Disclosure of Invention
[11] The task of the present invention is to solve the aforementioned problems by devising a method and an apparatus for surgical training that allow efficient training of medical staff for any type of surgical procedure, in particular both for minimally invasive surgery and for computer-aided and robotic procedures.
[12] Another object of the present invention is to provide an apparatus of simple conception, reliable operation and versatile use, as well as relatively low cost.
[13] The above mentioned objects are attained, according to the present invention, by the method and the apparatus for surgical training according to claims 1 and 16.
Description Of Drawings
[14] Details of the invention shall be more apparent from the detailed description of preferred embodiments of the apparatus for surgical training according to the invention, illustrated for indicative purposes in the attached drawings, wherein:
[15] figure 1 shows a schematic perspective view of the training apparatus in hand;
[16] figure 2 shows a perspective view of a further embodiment of the apparatus according to the invention;
[17] figure 3 shows a flow diagram indicating the operative steps of the training method according to the invention.
Best Mode
[18] With particular reference to such figures, the apparatus for surgical training is indicated in its entirety with 1. The apparatus 1 is mainly constituted by a hardware component and a software component.
[19] The hardware component, which is schematically represented in fig. 1, comprises a processing unit 2, connected to a pair of force feedback control means 3, to at least one monitor 4 and to at least one alphanumeric keyboard 5.
[20] Control means 3 are preferably constituted by joysticks of the force feedback type, that is, suitable to enable the user to perceive the forces due to the contact of the instruments with the virtual anatomy of the patient.
[21] In the embodiment illustrated in fig. 1, joysticks 3 are of the type specifically illustrated by patent US 5,828,813, that is, constituted by respective robotized arms comprising a plurality of articulation points, for example at the wrist and the elbow. Alternatively, joysticks 3a of simpler construction (see fig. 2), with a single articulation but in any case of the force feedback type, can be provided.
[22] Joysticks 3, 3a preferably have seven degrees of freedom for the force feedback, six of which represent the forces and torques generated by the anatomical simulation, while the remaining degree of freedom represents the operation of the chosen virtual surgical instrument, as better described in the following.
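By way of a non-authoritative illustration (the class and field names below are assumptions, not taken from the patent), the seven feedback channels can be grouped in software as three force components, three torque components and one scalar channel for the actuation of the selected virtual instrument:
    from dataclasses import dataclass

    @dataclass
    class HapticFeedback:
        # Seven force-feedback channels, as described for joysticks 3, 3a (illustrative grouping).
        force: tuple       # (fx, fy, fz) forces generated by the anatomical simulation
        torque: tuple      # (tx, ty, tz) torques generated by the anatomical simulation
        instrument: float  # seventh degree of freedom: operation of the chosen virtual instrument

        def as_vector(self):
            # Flatten to the 7-element command sent to the joystick controller.
            return [*self.force, *self.torque, self.instrument]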
[23] The monitor 4 is suitable to provide the user with a preferably stereoscopic view of the virtual anatomy of the patient while carrying out the selected surgical intervention.
[24] Finally, the keyboard 5 enables the user to enter textual information relating to the procedural decisions under study, or simply to note observations.
[25] The apparatus 1 preferably comprises auricular and vocal interaction means, for example microphones, not represented in the figures for simplicity, suitable to enable the user to simulate the verbal interaction inside the operating room.
[26] According to a further embodiment illustrated in fig. 2, the training apparatus forms a training console 10 providing a seat 6 between the joysticks 3a for housing a portable processing unit of known type, not represented. The console 10 thus turns out to be easily transportable, as well as very versatile, since it only requires the availability of a portable PC.
[27] The software component, installable in the processing unit 2 of the apparatus 1 or implementable on a portable PC to be inserted in the seat 6 of the console 10, comprises an interface suitable to guide the user through a series of operative steps in preparation for the simulation. As an example, the interface enables access to the simulation, the choice of the type of surgical procedure to be simulated, the selection of the level of support required during the simulation, as well as the final feedback and evaluation level.
[28] The software component also comprises a database containing real anatomical data of a plurality of patients. In particular, the database can provide the model of the patient's anatomy, as well as films of specific procedures carried out on the model.
[29] The software component further comprises an anatomical viewer suitable to provide a graphical model of the anatomical area of interest, integrated with direct images of the involved organs.
[30] In addition to the graphical viewer, the apparatus comprises a video viewer suitable to show, as a film, the surgical procedure actually followed by the user, partially or entirely.
[31] The software component can further provide a database of textual information related to various procedures, preferably enriched with animations and pictures which the user can access to examine specific aspects of the procedure in more detail.
[32] The software component comprises a simulator module suitable to enable the user, in the training phase, to perform the selected intervention or surgical procedure on a physical model of the patient's anatomy. More precisely, the simulator module processes and integrates into the graphical simulation the real data of the patient obtainable from medical instrumentation, for example data obtained through CAT (Computer Axial Tomography) or MRI (Magnetic Resonance Imaging). The clinical imaging data are transformed through a segmentation process which returns a tridimensional picture, that is, one provided with volumes. Physical properties, in particular the mass, elasticity and viscosity of the organic tissues, are then assigned to the segmented picture, so as to obtain a biomechanical tridimensional model of the anatomical area of interest. The assignment of the physical properties makes it possible to simulate, as better described in the following, a realistic interaction of the virtual surgical instruments with the simulated anatomical area. More precisely, such an interaction simulates the interaction forces between the surgical instruments selected by the user and the simulated organs, with a rendering rate of the order of 1 kHz. The greater the rendering rate of the dynamic interaction, the less perceptible to the user is the delay between the virtual interaction and the return of the tactile sensations through the joysticks 3, 3a, as better described in the following (see fig. 3).
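As a hedged sketch of the pipeline just described, assuming hypothetical helper names, placeholder tissue values and a joystick object exposing active(), read_pose() and apply_force(): each segmented region receives mass, elasticity and viscosity, and the resulting model is queried by a haptic loop aiming at a period of about 1 ms (roughly 1 kHz):
    import time

    # Illustrative tissue parameters; the description names mass, elasticity and viscosity,
    # but the numeric values below are placeholders only.
    TISSUE_PROPERTIES = {
        "liver":  {"mass": 1.5, "elasticity": 6.0e3, "viscosity": 5.0},
        "vessel": {"mass": 0.1, "elasticity": 1.2e4, "viscosity": 2.0},
    }

    def build_biomechanical_model(segmented_regions):
        # segmented_regions: {tissue_label: voxel data} produced by the segmentation step.
        return {label: {"voxels": voxels, **TISSUE_PROPERTIES[label]}
                for label, voxels in segmented_regions.items()}

    def contact_force(model, instrument_pose):
        # Placeholder for the collision/deformation computation on the biomechanical model.
        return (0.0, 0.0, 0.0)

    def haptic_loop(model, joystick, period=0.001):
        # A period of 1 ms (~1 kHz) keeps the delay between virtual contact and the
        # returned tactile sensation imperceptible, as stated above.
        while joystick.active():
            pose = joystick.read_pose()
            joystick.apply_force(contact_force(model, pose))
            time.sleep(period)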
[33] Moreover, a viewer module able to give access to one or more virtual cameras is provided. The virtual cameras enable the user to view the anatomical area of interest from the desired angle. Furthermore, such a module allows the user to view, according to his choice, the visible parts or the inaccessible areas, through the activation of further artificial cameras.
[34] A logging and playback module is also provided, in order to record in chronological sequence the actions performed by the user, together with the positions and forces applied by the tools in use, and to play back, immediately or at the end of the procedure, the actions performed by the user.
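One possible organisation of such a chronological log, sketched here with entirely hypothetical names: each entry stores a timestamp, the action, and the tool position and force, so that the sequence can be replayed immediately or at the end of the procedure:
    import time
    from dataclasses import dataclass, field

    @dataclass
    class LogEntry:
        timestamp: float
        action: str        # e.g. "move", "keypress", "grasp" (labels are illustrative)
        position: tuple    # instrument tip position
        force: tuple       # force applied by the tool in use

    @dataclass
    class SessionLog:
        entries: list = field(default_factory=list)

        def record(self, action, position, force):
            # Register each user action in chronological sequence.
            self.entries.append(LogEntry(time.time(), action, position, force))

        def playback(self, render):
            # Replay the recorded actions, in order, through a caller-supplied renderer.
            for entry in sorted(self.entries, key=lambda e: e.timestamp):
                render(entry)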
[35] In order to extend the practical learning possibilities, the software component is further provided with a random event generation module, suitable to simulate the onset of unexpected events in the course of the intervention.
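A minimal sketch of such a random event generator, with a purely illustrative event list and probability (the patent only speaks of unexpected events in general):
    import random

    UNEXPECTED_EVENTS = ["bleeding", "instrument slip", "patient movement"]  # illustrative only

    def maybe_trigger_event(probability_per_step=0.001, rng=None):
        # Return an unexpected event for the current simulation step, or None.
        rng = rng or random.Random()
        if rng.random() < probability_per_step:
            return rng.choice(UNEXPECTED_EVENTS)
        return None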
[36] The software component can further comprise a warning module, suitable to alert the user in the case of potentially dangerous situations or to suggest alternative intervention approaches.
[37] One or more mirror modules are prearranged to enable one or more teachers to supervise the procedure carried out by the student, to intervene through corrections and warnings, and in the end to evaluate the level of preparation and training.
[38] Finally, a specific evaluation module can be provided in order to evaluate the quality of the procedural choices performed by the student during the simulated intervention.
[39] The training method realised by the described apparatus turns out to be easy to understand from the preceding description.
[40] In a step of preparation of the surgical simulation, the user accesses the training program and selects his own knowledge level from a set of proposed levels.
[41] He then selects a surgical procedure or a specific pathology and finally chooses whether to operate in a study mode, reviewing a pre-chosen case already performed, or in a practical execution mode.
[42] In the study mode, the processing unit 2 provides textual and graphical information related to the procedure, shows a movie of the procedure actually performed, as well as a simulation of the procedure in the simulation environment, enabling the user to browse forward or backward to the operative step of interest and to view the intervention from different viewpoints.
[43] If, instead, the user chooses to perform the intervention, the processing unit 2 shows all the available textual data relating to the patient, in a suitable medical format. The same unit 2 further shows the pre-operative data, so as to enable the user to formulate a diagnosis and to decide on an intervention execution plan. Subsequently, the user can acquire initial instructions for the correct setup of the robot peripheral or of the laparoscopic instruments suitable for the execution of the intervention.
[44] At the end of a subsequent processing step, the software component shows the simulation of an anatomical environment, according to the selected modalities and the data entered by the user, such as the positioning of the cameras and the type, number and access points of the instruments.
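The setup data entered by the user in this step could be collected, for example, in a small structure like the following; the field names are assumptions and are not taken from the patent:
    from dataclasses import dataclass, field

    @dataclass
    class InstrumentSetup:
        typology: str          # e.g. "laparoscopic grasper"
        access_point: tuple    # insertion point in the coordinates of the anatomical model

    @dataclass
    class SimulationSetup:
        camera_positions: list = field(default_factory=list)   # positioning of the cameras
        instruments: list = field(default_factory=list)        # type, number and access point
        stereo_view: bool = True                                # stereo viewing, if selected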
[45] If a suitable viewing instrument has been selected, in the next step the software component processes the stereo view of the area involved in the intervention.
[46] In the next step, the force feedback joysticks 3, 3a can be selectively connected to respective surgical instruments shown by the processing unit 2, suitable to interact virtually with the simulated anatomical environment.
[47] Then the processing unit 2 activates the physical simulation of the anatomical environment, in which the selected surgical instruments can be operated through the manipulation of the joysticks 3, 3a.
[48] Meanwhile, the processing unit 2 records in chronological sequence every action, movement, keystroke on the keyboard 5 and interaction with the simulation environment, so as to be able to play back any step of the intervention afterwards.
[49] In the course of the simulation, the software component processes and produces, through the hardware component, suitable sound, visual or tactile signals to guide the user during the training, bringing to his attention possible errors and divergences with respect to an optimal procedure. The enabling of such signals, as well as the choice between the available warning types, can be performed as a function of the learning grade of the user, selected in the preparation step previously described.
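For instance, the enabling of the guidance signals as a function of the learning grade chosen in the preparation step might be expressed as a simple mapping; the grades and signal sets below are assumptions, not part of the patent:
    # Hypothetical mapping from the selected learning grade to the enabled warning types.
    WARNINGS_BY_GRADE = {
        "beginner":     {"sound", "visual", "tactile"},
        "intermediate": {"visual", "tactile"},
        "expert":       set(),   # no guidance signals for an expert session
    }

    def enabled_warnings(learning_grade):
        # Fall back to visual warnings if the grade is unknown.
        return WARNINGS_BY_GRADE.get(learning_grade, {"visual"})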
[50] The method and the apparatus for surgical training therefore attain the object of efficiently training medical staff for any type of surgical procedure, in particular both for minimally invasive surgical procedures and for computer-assisted and robotic procedures.
[51] A characteristic of the proposed method consists in integrating textual, multimedia and tactile information, so as to build a physical anatomical model on which the user can simulate various surgical procedures of his choice. In particular, thanks to the transformation of the data obtainable from the medical diagnostic instrumentation, mainly CAT and MRI, the user operates on a realistic model provided with equally realistic physical properties and therefore perceives, during the simulation of the intervention, the same tactile and dynamic sensations he would perceive during the real operation. This characteristic ensures effective learning of the surgical procedure by the user.
[52] A further advantageous aspect of the proposed method is that it enables the user to be guided by the processing unit, so as to learn the best way of carrying out the intervention and, at the same time, to know the risks of ill-advised choices. The warning module, in fact, is able to compute the best path for the selected surgical procedure, alerting the user in time to make suitable decisions.
[53] An advantage of the described training method is that it provides the possibility of evaluating the skill level of the user, based on his diagnostic capacities, the actions undertaken and the actual carrying out of the surgical simulation.
[54] The apparatus for surgical training according to the invention can be connected to a master station managed by the teacher, who can intervene in the student's simulation whenever he considers it necessary, simply by using respective joysticks 3, 3a.
[55] Furthermore, the apparatus makes it possible to realise a kind of virtual surgery laboratory, wherein a plurality of processing units 2 are connected to the same master processing unit, so as to allow the simultaneous connection of an entire class of students, coordinated and supervised by one or more teachers.
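A rough, non-authoritative sketch of this arrangement, with invented names for the master station and the connected student stations:
    from dataclasses import dataclass, field

    @dataclass
    class StudentStation:
        station_id: str
        processing_unit: object   # the student's processing unit 2 (placeholder)

    @dataclass
    class MasterStation:
        # One master processing unit coordinating an entire class of students.
        students: dict = field(default_factory=dict)

        def connect(self, station):
            self.students[station.station_id] = station

        def broadcast_warning(self, message):
            # A supervising teacher can send corrections and warnings to every connected station.
            for station in self.students.values():
                print(f"[to {station.station_id}] {message}")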
[56] An advantage of the apparatus at hand is the possibility of continuously keeping the simulated anatomical model up to date with data reflecting the real anatomy of the patient, simply by repeating the process of transformation of the clinical data previously described (fig. 3). In this way the method is able to generate ever new learning sessions on the same surgical procedure.
[57] An important aspect of the described apparatus is that it can be built from processing units and peripherals that are easy and economical to obtain. In fact, according to the embodiment illustrated in figure 2, the training apparatus can be constituted simply by connecting the console 10 comprising the joysticks 3a to a common portable computer on which the chosen modules of the previously illustrated software component have been installed. The learning method can thus be carried out practically anywhere, simply by transporting the console 10 to the desired place, with extreme ease of use.
[58] The evaluation module that can be implemented in the apparatus turns out to be very effective: it evaluates the user based on the instructions sent by means of the control joysticks, as well as on the response times to the stimuli generated from time to time by the random event generation module.
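A sketch of how the response times to the randomly generated stimuli could contribute to the evaluation; the scoring rule and threshold are illustrative assumptions:
    def evaluate_responses(event_times, response_times, max_delay=5.0):
        # Score each handled event between 0 and 1 according to how promptly it was addressed;
        # event_times and response_times are matching lists of timestamps in seconds.
        delays = [r - e for e, r in zip(event_times, response_times)]
        scores = [max(0.0, 1.0 - d / max_delay) for d in delays]
        return sum(scores) / len(scores) if scores else 0.0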
[59] Finally, it should be noted that the apparatus according to the invention is easy to handle and transport. The apparatus itself is of simple construction and therefore also of reduced cost. Its potential diffusion in surgical learning environments is also very wide.
[60] In practice, the embodiment of the invention, the materials used, as well as the shape and dimensions, may vary depending on the requirements.
[61] Should the technical characteristics mentioned in each claim be followed by reference signs, such reference signs were included strictly with the aim of enhancing the understanding of the claims, and hence they shall not be deemed restrictive in any manner whatsoever on the scope of each element identified for exemplifying purposes by such reference signs.

Claims

[Claim 1] Method for surgical training comprising the following operative steps:
a. introducing in a processing unit (2) clinical data detected by means of suitable medical instrumentation;
b. processing said clinical data in a way as to obtain, through a segmentation process, a tridimensional anatomical model of an anatomical environment, that can be viewed through a graphical interface;
c. assigning to said tridimensional anatomical model respective biomechanical properties in a way as to obtain a physical anatomical model of said anatomical environment;
d. prearranging virtual surgical instruments and respective force feedback control means, suitable to simulate through said graphical interface a physical interaction between said virtual surgical instruments and said physical anatomical model, as well as to render the tactile perception of said interaction;
e. simulating through the operation of said control means exploration procedures and/or surgical procedures on said physical anatomical model by means of said virtual surgical instruments.
[Claim 2] Method according to claim 1, characterized in that it comprises the further step of selecting a desired viewpoint for visualizing said anatomical environment comprising said physical anatomical model, through the use of virtual cameras.
[Claim 3] Method according to claim 2, characterized in that it comprises the further step of selecting the visualization through said graphical interface of viewable portions of said anatomical environment or of inaccessible portions.
[Claim 4] Method according to claim 2, characterized in that it comprises the further step of rendering the tactile perception of the interaction forces between said virtual surgical instruments and said anatomical environment with a rate of the order of 1 kHz.
[Claim 5] Method according to claim 2, characterized in that it comprises the further step of selecting a suitable learning level and/or a suitable support level between a plurality of levels that can be selected, to simulate a said surgical procedure.
[Claim 6] Method according to claim 2, characterized in that it comprises the further step of prearranging a database comprising anatomical clinical data to be integrated in said step of processing of said anatomical model for simulating a said surgical procedure.
[Claim 7] Method according to one of the previous claims, characterized in that it comprises the further step of recording the sequence of graphical processing generated by said graphical interface during said simulation of said surgical procedure, to enable the playback thereof in film.
[Claim 8] Method according to one of the previous claims, characterized in that it comprises the further step of prearranging a database of textual information suitable to be viewed as animations or pictures during the course of said simulation.
[Claim 9] Method according to one of the previous claims, characterized in that said clinical data are obtained through CAT and/or MRI.
[Claim 10] Method according to one of the previous claims, characterized in that said biomechanical properties assigned to said tridimensional anatomical model comprise at least the mass, the elasticity and the viscosity of the organic tissues simulated in said anatomical environment.
[Claim 11] Method according to one of the previous claims, characterized in that it comprises the further step of prearranging through said graphical interface a random event generation, suitable to simulate the onset of unexpected events during said simulation.
[Claim 12] Method according to one of the previous claims, characterized in that it comprises the further step of providing signals suitable to advise about the onset of dangers and/or to suggest preferred solutions for performing said simulation.
[Claim 13] Method according to one of the previous claims characterized in that it comprises the further step of allowing the access of a supervisor suitable to supervise, intervene, signal and correct possible incorrect choices in the course of the execution of said simulation.
[Claim 14] Method according to one of the previous claims, characterized in that it comprises the further step of enabling to evaluate the degree of preparation of the executor of said simulation.
[Claim 15] Method according to claim 1, characterized in that it comprises the further step of selecting between a study modality of a said simulation already performed and an execution modality of a new simulation.
[Claim 16] Apparatus for surgical training comprising force feedback control means (3, 3a), suitable to be connected to a processing unit (2) for operating the method according to one of the previous claims, as well as to at least a monitor (4) for the graphical visualization and to at least an alphanumerical keyboard (5) for introducing textual or graphical data.
[Claim 17] Apparatus according to claim 16, characterized in that said control means (3, 3a) comprise a couple of joysticks (3, 3a) for virtually controlling respective virtual surgical instruments.
[Claim 18] Apparatus according to claim 17, characterized in that between said joysticks (3a) is provided a seat (6) for housing a said portable processing unit for operating said training method.
[Claim 19] Apparatus according to one of the claims from 16 to 18, characterized in that it comprises auricular and/or vocal interaction means suitable to simulate the proper verbal interaction of an operating room.
EP10709575A 2009-02-26 2010-02-25 Method and apparatus for surgical training Ceased EP2401713A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ITBO2009A000111A IT1392871B1 (en) 2009-02-26 2009-02-26 METHOD AND SURGICAL TRAINING APPARATUS
PCT/IB2010/050824 WO2010097771A2 (en) 2009-02-26 2010-02-25 Method and apparatus for surgical training

Publications (1)

Publication Number Publication Date
EP2401713A2 true EP2401713A2 (en) 2012-01-04

Family

ID=42104533

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10709575A Ceased EP2401713A2 (en) 2009-02-26 2010-02-25 Method and apparatus for surgical training

Country Status (3)

Country Link
EP (1) EP2401713A2 (en)
IT (1) IT1392871B1 (en)
WO (1) WO2010097771A2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6133881B2 (en) 2011-11-08 2017-05-24 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Interaction with 3D object datasets
WO2017098506A1 (en) * 2015-12-07 2017-06-15 M.S.T. Medical Surgery Technologies Ltd. Autonomic goals-based training and assessment system for laparoscopic surgery
CN107767719A (en) * 2016-08-17 2018-03-06 天津博诺智创机器人技术有限公司 A kind of industrial robot simulated training machine
US10810907B2 (en) 2016-12-19 2020-10-20 National Board Of Medical Examiners Medical training and performance assessment instruments, methods, and systems
CN109509555A (en) * 2018-11-26 2019-03-22 刘伟民 A kind of surgical operation preview appraisal procedure and system based on 3-dimensional image
CN117130489A (en) * 2023-10-25 2023-11-28 苏州阿基米德网络科技有限公司 VR-based medical equipment training method, electronic equipment and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5255211A (en) * 1990-02-22 1993-10-19 Redmond Productions, Inc. Methods and apparatus for generating and processing synthetic and absolute real time environments
US5454722A (en) 1993-11-12 1995-10-03 Project Orbis International, Inc. Interactive multimedia eye surgery training apparatus and method
US5828813A (en) 1995-09-07 1998-10-27 California Institute Of Technology Six axis force feedback input device
US5791907A (en) 1996-03-08 1998-08-11 Ramshaw; Bruce J. Interactive medical training system
WO1999059106A1 (en) * 1998-05-13 1999-11-18 Acuscape International, Inc. Method and apparatus for generating 3d models from medical images
AU2001275308A1 (en) * 2000-06-06 2001-12-17 Frauenhofer Institut Fuer Graphische Datenverarbeitung The extended virtual table: an optical extension for table-like projection systems
IES20030352A2 (en) * 2002-05-10 2003-10-15 Haptica Ltd A surgical training simulator
US7376903B2 (en) * 2004-06-29 2008-05-20 Ge Medical Systems Information Technologies 3D display system and method
WO2007027101A1 (en) 2005-08-29 2007-03-08 Go Virtual Medical Limited Medical instruction system
EP2163092A4 (en) * 2007-05-18 2011-10-19 Uab Research Foundation Virtual interactive presence systems and methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2010097771A2 *

Also Published As

Publication number Publication date
WO2010097771A3 (en) 2010-11-25
WO2010097771A2 (en) 2010-09-02
IT1392871B1 (en) 2012-04-02
ITBO20090111A1 (en) 2010-08-27


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110926

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20121023

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20151127