WO2023171437A1 - Simulation system, information processing device, and information processing method - Google Patents
Simulation system, information processing device, and information processing method
- Publication number: WO2023171437A1 (application PCT/JP2023/006943)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- processing device
- simulation
- simulation system
- surgical
- Prior art date
Classifications
- A — HUMAN NECESSITIES
- A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B 34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B 34/10 — Computer-aided planning, simulation or modelling of surgical operations
- A61B 34/30 — Surgical robots
- A61B 34/37 — Master-slave robots
- G — PHYSICS
- G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B — EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B 9/00 — Simulators for teaching or training purposes
Definitions
- the present disclosure relates to a simulation system, an information processing device, and an information processing method.
- One aspect of the present disclosure improves simulation performance.
- a simulation system includes a first information processing device and a second information processing device that cooperate with each other to provide a surgical simulation, and the first information processing device and the second information processing device transmit and receive simulation data to and from each other via an all-optical communication network.
- An information processing device is an information processing device that provides a surgical simulation in cooperation with another information processing device, and includes a processing unit that calculates a simulation model including soft tissue, and an optical transmission control unit that controls transmission and reception of simulation data to and from the other information processing device via an all-optical communication network.
- An information processing method is an information processing method in which a first information processing device and a second information processing device cooperate with each other to provide a surgical simulation, the method including the first information processing device and the second information processing device transmitting and receiving simulation data to and from each other via an all-optical communication network.
- FIG. 1 is a diagram schematically showing an overview of a simulation system according to an embodiment.
- FIG. 2 is a diagram illustrating an example of functional blocks of the simulation system.
- FIG. 3 is a block diagram showing an example of a schematic configuration of a haptic device.
- FIG. 4 is a diagram showing an example of a video.
- FIG. 5 is a diagram showing an example of a simulation loop.
- FIG. 6 is a diagram schematically showing remote DMA.
- FIG. 7 is a flowchart illustrating an example of processing (information processing method) executed in the simulation system.
- FIG. 8 is a diagram showing a schematic configuration of a modification.
- FIG. 9 is a diagram schematically showing the flow of the simulation.
- FIG. 10 is a diagram showing a schematic configuration of another modification.
- FIG. 11 is a diagram showing an example of the hardware configuration of a device.
- At least one of the above issues is addressed by the disclosed technology.
- Using an all-optical communication network achieves low latency, and cloud computing ensures sufficient computational resources.
- By using remote DMA (Remote Direct Memory Access), distributed processing across networks can easily be performed by synchronizing data in the memories of distributed information processing devices.
- (Embodiment) FIG. 1 is a diagram schematically showing an overview of the simulation system according to the embodiment.
- the simulation system 100 can be used as a haptic surgery simulator capable of large-scale calculations. For example, it can be used for training and preoperative planning for surgeries that deal with soft tissue (soft organs, etc.), such as laparoscopic surgery, thoracic surgery, and brain surgery. By training while feeling soft tissue deformation and realistic reaction forces, it is possible to gain more accurate preliminary experience, which is expected to be effective in improving skills and surgical results.
- the simulation system 100 can also be used as a training system for a surgical robot.
- Many conventional surgical support robots do not have force feedback control. In such cases, doctors perform treatment using only images, while imagining the forces being applied to organs and tissues. By calculating and presenting (e.g., displaying) accurate force sensations, it is possible to understand the force applied to organs and the appropriate force for suturing, making more effective training possible.
- the simulation system 100 includes an information processing device 1 and an information processing device 2.
- the information processing device 1 and the information processing device 2 are a first information processing device and a second information processing device that cooperate with each other to provide a surgical simulation.
- the information processing device 1 and the information processing device 2 are connected via an all-optical communication network N. Although details will be described later, the information processing device 1 and the information processing device 2 transmit and receive simulation data to and from each other via the all-optical communication network N.
- connecting the information processing device 1 and the information processing device 2 by the all-optical communication network N means that at least all communication between the endpoint routers is established by optical communication. Note that communication beyond the endpoint routers may be electrical communication after photoelectric conversion.
- the information processing device 1 and the information processing device 2 are connected with low delay through communication via the broadband all-optical communication network N, which is especially useful for real-time simulation.
- the information processing device 1 is placed in the edge area EF, and together with the haptic device 6 and the monitor 7 constitutes an edge simulator terminal.
- the haptic device 6 and the monitor 7 are also components of the information processing device 1, but they do not need to be components of the information processing device 1.
- a user U of the simulation system 100 is located within the edge area EF. In FIG. 1, only the hand of the user U operating the haptic device 6 is schematically shown.
- the surgical environment OE is simulated by the cooperation of the information processing device 1 and the information processing device 2.
- a surgical environment OE is schematically shown inside the information processing device 1 .
- the surgical environment OE includes virtual elements related to surgery. As elements included in the surgical environment OE, a soft tissue O, a surgical tool T, a robot arm R, and a camera C are illustrated in FIG. 1.
- An example of soft tissue O is an organ that has softness.
- An example of the surgical tool T is a surgical tool such as forceps. Unlike the soft tissue O, the surgical tool T may have rigidity.
- Robot arm R is a surgical robot arm that supports camera C.
- the camera C is a surgical camera, and photographs the surgical field including the soft tissue O and the surgical tool T.
- the field of view of the camera C is illustrated as field of view F.
- the haptic device 6 is a device that the user U touches (with his/her hand) to operate the surgical tool T.
- the user U operates the surgical tool T by moving the haptic device 6.
- the surgical simulation proceeds by pinching or cutting the soft tissue O using the surgical instrument T.
- an external force is applied to the soft tissue O, and the soft tissue O is simulated to be deformed.
- a contact force (which may correspond to a sense of force, for example) generated by the interaction between the soft tissue O and the surgical tool T is reflected in the operation of the surgical tool T by the user U via the haptic device 6, and is fed back to the user U.
- the monitor 7 displays an image within the field of view F of the camera C described above, that is, an image including (at least a portion of) the soft tissue O and the surgical instrument T.
- User U operates haptic device 6 while viewing the video displayed on monitor 7 .
- the position, angle, etc. of the camera C may also be operated by the user U.
- the operation section of the camera C may be incorporated into the haptic device 6 or may be provided separately from the haptic device 6.
- the operation of camera C may be performed automatically. For example, the function of a control unit that automatically operates the camera C may be incorporated into the surgical environment OE.
- the information processing device 2 is placed in the cloud area CF.
- the cloud area CF is an area located on the opposite side of the edge area EF across the all-optical communication network N, and is, for example, an area away from the edge area EF.
- FIG. 2 is a diagram showing an example of functional blocks of the simulation system.
- FIG. 2 shows an example of functional blocks of the information processing device 1 and the information processing device 2.
- the information processing device 1 includes a processing section 11, a storage section 12, an IF section 13, a communication control section 14, an optical transmission control section 15, and an optical transmission device 16.
- the processing unit 11 functions as an overall control unit (main control unit) that controls each element of the information processing device 1.
- the processing unit 11 also executes processing related to simulation. Details will be described later.
- the storage unit 12 stores information used by the information processing device 1. Simulation data is exemplified as the information stored in the storage unit 12.
- the simulation data includes data used in simulation calculations, data obtained by simulation, and the like.
- the IF section 13 provides an interface between the processing section 11, the haptic device 6, and the monitor 7.
- Position/force information (motion) is input from the haptic device 6 to the processing unit 11 via the IF unit 13 .
- the motion may also include information such as posture.
- the processing unit 11 causes the motion of the robot arm R in the surgical environment OE to correspond to the motion input via the haptic device 6.
- the user U operates the surgical tool T in the surgical environment OE via the haptic device 6 .
- the contact force generated by the contact between the surgical tool T and the soft tissue O is calculated.
- the calculated contact force is fed back to the user U via the haptic device 6.
- a command value for reflecting the contact force on the haptic device 6 is sent from the processing section 11 to the haptic device 6 via the IF section 13.
- the surgical environment OE includes the camera C, and its image is displayed on the monitor 7 (FIG. 1).
- the image of the camera C is sent from the processing section 11 to the monitor 7 via the IF section 13 and displayed.
- the communication control unit 14 controls communication between the information processing device 1 and external devices.
- An example of the external device is the information processing device 2, in which case an optical transmission control section 15 and an optical transmission device 16, which will be described next, are used.
- the optical transmission control unit 15 controls optical transmission by the optical transmission device 16.
- the optical transmission device 16 is a device for connecting the information processing device 1 to the all-optical communication network N.
- the optical transmission device 16 is mounted on, for example, PCIe (Peripheral Component Interconnect Express) and incorporated into the information processing device 1 .
- the information processing device 2 includes a processing section 21, a storage section 22, a communication control section 24, an optical transmission control section 25, and an optical transmission device 26.
- the processing unit 21 functions as an overall control unit that controls each element of the information processing device 2.
- the processing unit 21 also executes processing related to simulation. Details will be described later.
- the storage unit 22 stores information used by the information processing device 2. As the information stored in the storage unit 22, simulation data is exemplified.
- the communication control unit 24 controls communication between the information processing device 2 and external devices.
- An example of the external device is the information processing device 1, in which case the optical transmission control section 25 and the optical transmission device 26 are used.
- the optical transmission control unit 25 controls optical transmission by the optical transmission device 26.
- the optical transmission device 26 is a device for connecting the information processing device 2 to the all-optical communication network N.
- FIG. 3 is a block diagram showing an example of a schematic configuration of a haptic device.
- the haptic device 6 includes a plurality (n) of joints 61 and a haptics control section 62. To distinguish the individual joints 61, they are denoted joint 61-1 through joint 61-n in the drawing.
- the joint 61 includes a motor 611, an encoder 612, a sensor 613, a motor control section 614, and a driver 615.
- the motor 611 is rotated according to the current driven by the driver 615.
- Encoder 612 detects the rotational position of joint 61.
- Examples of the sensor 613 include a force sensor, a torque sensor, and an acceleration sensor, which detect the force, torque, acceleration, and the like applied to the joint 61.
- the sensor may be a six-axis sensor, or a plurality of sensors may be used in combination, enabling more precise force control.
- the motor control unit 614 performs feedback control of the motor 611 based on the detected value of the encoder 612 and the detected value of the sensor 613. For example, the position, force, torque, acceleration, etc. of the joint 61 are controlled.
- the feedback control may include calculations for force control and acceleration control, including estimation of a disturbance observer.
- Motor 611 is controlled via driver 615.
- Motor control section 614 gives a current command to driver 615.
- the driver 615 drives the current of the motor 611 based on the given current command.
- the motor control unit 614 sends sensor information, such as the detection value of the encoder 612 and the detection value of the sensor 613, to the haptics control unit 62.
- the haptics control unit 62 sends position/force information to the processing unit 11 (FIG. 2) based on sensor information from the motor control unit 614 of each joint 61.
- the haptics control unit 62 sends a control command to the motor control unit 614 of each joint 61 based on the command value from the processing unit 11 (FIG. 2).
- the motor control unit 614 of each joint 61 controls the motor 611 based on a control command from the haptics control unit 62.
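The per-joint loop described above (encoder and sensor values into the motor control unit, a current command out to the driver) can be sketched as follows. This is a hedged illustration: the controller structure, gains, and signal names are assumptions for the sketch, not taken from the disclosure.

```python
class JointController:
    """Conceptual per-joint feedback loop: position feedback from the
    encoder plus force feedback from the force/torque sensor, producing
    a current command for the motor driver. Gains are illustrative."""

    def __init__(self, kp=50.0, kd=0.5, kf=1.0, dt=0.001):
        self.kp, self.kd, self.kf = kp, kd, kf  # PD and force gains (assumed)
        self.dt = dt                            # e.g., a 1 kHz control cycle
        self.prev_error = 0.0

    def current_command(self, target_pos, encoder_pos, target_force, sensed_force):
        # Position feedback (encoder 612 -> motor control unit 614)
        pos_error = target_pos - encoder_pos
        d_error = (pos_error - self.prev_error) / self.dt
        self.prev_error = pos_error
        # Force feedback (sensor 613 -> motor control unit 614)
        force_error = target_force - sensed_force
        # Current command handed to the driver (driver 615)
        return self.kp * pos_error + self.kd * d_error + self.kf * force_error
```

A disturbance-observer term, as mentioned above, could be added to this loop; it is omitted here to keep the sketch minimal.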
- the simulation system 100 will be further described.
- the processing required for simulation is shared between the processing unit 11 of the information processing device 1 and the processing unit 21 of the information processing device 2.
- the processing unit 21 of the information processing device 2 placed in the cloud area CF has a higher calculation capacity than the processing unit 11 of the information processing device 1 placed in the edge area EF.
- Among the processes necessary for the simulation, those that require a particularly large calculation load are executed by the processing unit 21 of the information processing device 2. An example of this division of processing will be explained.
- the processing unit 11 of the information processing device 1 simulates the motion of the surgical tool T.
- Information on position and force is input from the haptic device 6 to the processing unit 11, and since the surgical instrument T has rigidity and does not deform like the soft tissue O, the calculation burden is not so large.
- the processing unit 21 of the information processing device 2 calculates a simulation model including the soft tissue O.
- the calculation of the simulation model includes modeling of the soft tissue O, calculation of deformation of the soft tissue O, and the like. For example, FEM, material point method (MPM), etc. may be used. Although the calculation load is large, since the processing unit 21 of the information processing device 2 has high calculation capacity, it is possible to calculate a highly accurate simulation model at high speed.
- For example, XPBD (Extended Position Based Dynamics) may be used.
- FEM-based constraints may be solved within the XPBD framework.
- the processing unit 21 of the information processing device 2 calculates the contact force caused by the interaction between the soft tissue O and the surgical tool T.
- the contact force is calculated, for example, by simulating the contact between the deformed soft tissue O and the surgical tool T. It is possible to calculate the contact force with high accuracy based on the result of calculating the highly accurate deformation of the soft tissue O.
- pressure applied to the opening/closing/gripping portion of the tip of the surgical tool T may also be calculated. To the extent that there is no contradiction, such pressure may also be understood to be included in the contact force.
- the processing unit 21 calculates the deformation of the model, the presence or absence of collision, the contact force, gravity, and the like, in a calculation cycle that is shorter than the calculation cycle that would result if the information processing device 1 performed these calculations, and that is the same as or longer than the control cycle of the haptic device 6.
- the processing unit 21 calculates the simulation model at a predetermined cycle that is faster than conventional drawing processing and comparable to the motion control cycle of a surgical robot.
- An example of such a cycle is a cycle that is greater than or equal to 60 Hz and comparable to (e.g., substantially equal to) 1 kHz.
- the processing unit 11 of the information processing device 1 controls the haptic device 6 so that the contact force calculated by the processing unit 21 of the information processing device 2 is fed back to the user U (haptic control processing).
- the control period of the haptic device 6 may be the same as the calculation period of the simulation model by the processing unit 21 of the information processing device 2 .
- In this way, the simulation model calculation and the haptic control are performed at a fast cycle (short cycle), while the drawing process is performed at a slow cycle (long cycle).
- haptic feedback takes into account the sagging of organs due to gravity.
- the processing unit 11 of the information processing device 1 generates an image including the soft tissue O simulated by the processing unit 21 of the information processing device 2 (drawing process).
- An example of the cycle of the drawing process is 60 Hz.
- the drawing process may include CG rendering; ray tracing may also be used, making it possible to generate photorealistic images.
- the processing unit 11 may generate the video so that information on the contact force calculated by the processing unit 21 of the information processing device 2 is included in the video. This will be explained with reference to FIG. 4.
- FIG. 4 is a diagram showing an example of a video.
- the soft tissue O and the surgical tool T are displayed.
- Three surgical tools T are illustrated.
- Letters A, B, and C for distinguishing the surgical tools T are displayed in association with the respective tools.
- Information on the contact force corresponding to each surgical tool T, specifically the value of the contact force (in N), is displayed numerically and as a bar.
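One way to picture such an overlay is to compose, for each tool, the numeric value next to a proportional bar. The following is a hypothetical text-based sketch, not the actual on-screen rendering; the scale and formatting are assumptions.

```python
def force_bar(label, force_n, max_n=5.0, width=20):
    """Render one surgical tool's contact force (in N) as a number plus a bar.
    `max_n` (full-scale force) and `width` (bar length) are assumed values."""
    filled = min(width, max(0, int(round(width * force_n / max_n))))
    return f"{label}: {force_n:4.1f} N [{'#' * filled}{'-' * (width - filled)}]"
```

For example, `force_bar("A", 2.5)` produces `A:  2.5 N [##########----------]`, pairing the numeric readout with a half-filled bar.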
- FIG. 5 is a diagram showing an example of a simulation loop. Processing executed during one frame is schematically shown.
- the above-mentioned XPBD method is used to simulate the soft tissue O.
- In step S1, the simulation model is updated.
- For example, when the soft tissue O is incised, the mesh topology changes, and the model of the soft tissue O and the like is updated accordingly.
- In step S2, a simulation is performed using the soft tissue O and the like updated in step S1.
- the process in step S2 is subdivided into a fixed number of substeps. In the example shown in FIG. 5, the processes of sub-steps S21 to S24 are repeatedly executed.
- In sub-step S21, the position of each meshed part is predicted.
- In sub-step S22, collisions are detected according to the predicted positions.
- In sub-step S23, a so-called constraint projection is performed: the FEM-based constraints are solved based on XPBD, and new positions are calculated in parallel (sub-step S23a). In this example, iterative processing using the Jacobi method is performed. The calculation result is obtained as a collision response (sub-step S23b).
- In sub-step S24, the velocity is calculated and updated based on the collision response, i.e., the new position.
- Then, the process of sub-step S21 is executed again. This sub-step loop is repeated within the time step of step S2.
- In step S3, the simulation results of the preceding step S2 are obtained.
- the simulation results include the soft tissue O after deformation.
- In step S4, the visual model is updated and reflected in the video display.
- the above process is repeatedly executed at a fast cycle of 1 kHz or more.
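The sub-step loop above (predict positions, project constraints with Jacobi-style iterations, update velocities) can be sketched as a minimal XPBD step over a particle mesh. This sketch assumes simple distance constraints and omits collision detection (sub-step S22); the data layout, iteration count, and constraint form are illustrative assumptions, not the disclosure's actual implementation.

```python
import numpy as np

def xpbd_substep(x, v, inv_mass, constraints, dt, iterations=10,
                 gravity=np.array([0.0, -9.81, 0.0])):
    """One XPBD sub-step over particle positions x (n,3) and velocities v (n,3).
    constraints: list of (i, j, rest_length, compliance)."""
    # Sub-step S21: predict positions from velocities and external acceleration
    x_pred = x + dt * v
    x_pred[inv_mass > 0] += dt * dt * gravity
    # (Sub-step S22, collision detection, is omitted in this sketch.)
    # Sub-step S23: constraint projection, Jacobi-style -- all corrections are
    # computed against the same positions, then applied at once in parallel
    for _ in range(iterations):
        dx = np.zeros_like(x_pred)
        for i, j, rest, compliance in constraints:
            d = x_pred[i] - x_pred[j]
            length = np.linalg.norm(d)
            if length < 1e-9:
                continue
            c = length - rest                         # constraint violation
            n = d / length
            w = inv_mass[i] + inv_mass[j]
            if w == 0.0:
                continue                              # both particles fixed
            dlam = -c / (w + compliance / (dt * dt))  # XPBD compliance term
            dx[i] += inv_mass[i] * dlam * n
            dx[j] -= inv_mass[j] * dlam * n
        x_pred += dx
    # Sub-step S24: derive new velocities from the position change
    v_new = (x_pred - x) / dt
    return x_pred, v_new
```

With `compliance = 0` this reduces to rigid PBD constraints; a nonzero compliance models soft tissue, which is why the XPBD family suits deformable organs.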
- necessary simulation data is transmitted and received between the information processing device 1 and the information processing device 2.
- the simulation data is sent and received via the all-optical communication network N.
- the delay in sending and receiving simulation data can be suppressed (made lower) compared to when the all-optical communication network N is not used.
- the simulation system 100 by utilizing the high computing power of the processing unit 21 of the information processing device 2, it becomes possible to calculate a large-scale simulation model that is difficult to realize with the processing unit 11 of the information processing device 1 alone. For example, it is also possible to calculate a model of the entire abdominal cavity. Furthermore, deformation of the soft tissue O, etc. can be calculated accurately. Since the computational power is high and a fine mesh model can be used, even small deformations of soft tissue O can be accurately calculated. It is possible to provide a simulation of a realistic surgical environment OE.
- the information processing device 1 and the information processing device 2 may send and receive simulation data to and from each other using remote DMA.
- the updated/processed data is transmitted and received between the storage unit 12 of the information processing device 1 and the storage unit 22 of the information processing device 2 by remote DMA, without the intervention of a CPU, and is thereby synchronized.
- By using remote DMA, the delay can be reduced further. This will be explained with reference to FIG. 6.
- FIG. 6 is a diagram schematically showing remote DMA.
- the functions of the processing section 21 and the storage section 22 are realized on the virtual layer 27.
- (at least a portion of) the simulation data stored in the storage unit 12 of the information processing device 1 and (at least a portion of) the simulation data stored in the storage unit 22 of the information processing device 2 can be synchronized.
- the motion data of the surgical tool T processed by the processing unit 11 of the information processing device 1 and the data of the simulation model processed by the processing unit 21 of the information processing device 2 are shared with low delay, further enhancing the effect of speeding up the simulation cycle.
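The mirrored-memory pattern can be illustrated conceptually as follows. This is only a stand-in: real remote DMA uses RDMA-capable hardware (e.g., InfiniBand or RoCE verbs) to copy between the two devices' memories without CPU involvement, whereas here plain Python dictionaries merely mimic the synchronized regions; all names are hypothetical.

```python
class MirroredRegion:
    """Toy model of a simulation-data region kept synchronized between the
    edge device (storage unit 12) and the cloud device (storage unit 22)."""

    def __init__(self):
        self.edge = {}    # stands in for storage unit 12
        self.cloud = {}   # stands in for storage unit 22

    def edge_write(self, key, value):
        # Edge side updates surgical-tool motion data; a remote-DMA write
        # would place it directly into the cloud device's memory.
        self.edge[key] = value
        self.cloud[key] = value

    def cloud_write(self, key, value):
        # Cloud side updates the soft-tissue model and contact force;
        # mirrored back into the edge device's memory the same way.
        self.cloud[key] = value
        self.edge[key] = value
```

The point of the pattern is that each side reads its local copy at full speed while writes propagate asynchronously, which is what keeps the 1 kHz haptic loop free of round-trip stalls.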
- FIG. 7 is a flowchart illustrating an example of processing (information processing method) executed in the simulation system.
- the processes of step S31, step S32, and steps S37 to S40 are executed in the edge area EF.
- the processes of step S34 and step S35 are executed in the cloud area CF.
- the processing in step S33 and step S36 is remote DMA.
- In step S31, surgical tool operation data is acquired.
- User U operates haptic device 6 .
- the haptic device 6 acquires surgical tool operation data according to user operations.
- Corresponding position/force information is sent from the haptic device 6 to the processing unit 11 of the information processing device 1 .
- In step S32, the surgical tool motion information is updated.
- the processing unit 11 of the information processing device 1 updates the motion of the surgical instrument T in the surgical environment OE.
- In step S33, remote DMA is performed.
- the simulation data in the storage unit 12 of the information processing device 1 is transferred to the storage unit 22 of the information processing device 2.
- In step S34, the simulation model is calculated.
- the processing unit 21 of the information processing device 2 calculates the deformation of the soft tissue O and the contact force.
- In step S35, the contact force and model shape are updated.
- the processing unit 21 of the information processing device 2 updates the simulation model of the contact force, the shape of the soft tissue O, etc. based on the calculation results of the simulation model.
- In step S36, remote DMA is performed.
- the simulation data in the storage unit 22 of the information processing device 2 is transferred to the storage unit 12 of the information processing device 1.
- After step S36, the process proceeds to step S37 and step S39, respectively.
- In step S37, the haptic device 6 is controlled.
- the processing unit 11 of the information processing device 1 controls the haptic device 6 so that the updated contact force is fed back to the user U.
- In step S38, the contact force is fed back to the user U via the haptic device 6 (a force sensation is presented).
- In step S39, a video is generated.
- the processing unit 11 of the information processing device 1 generates an image including the updated soft tissue O and the like.
- In step S40, the video is displayed.
- the monitor 7 displays images.
- When the processing in step S38 and step S40 is completed, the processing returns to step S31. This series of processes is repeatedly executed to advance the simulation.
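Steps S31 through S40 form one pass of the loop, which could be organized as a single per-frame function; the function and dictionary key names below are illustrative assumptions, and the remote DMA transfers (S33, S36) are represented only by the shared `state` dict.

```python
def simulation_frame(state, read_haptic, solve_model, apply_force, display):
    """One pass through steps S31-S40 of the flowchart (names are assumed).
    `state` stands in for the DMA-synchronized simulation data."""
    # S31/S32: acquire surgical-tool operation data, update tool motion
    state["tool_motion"] = read_haptic()
    # S33: remote DMA, edge storage -> cloud storage (implicit via `state`)
    # S34/S35: cloud side computes soft-tissue deformation and contact force
    state["contact_force"], state["tissue_shape"] = solve_model(state["tool_motion"])
    # S36: remote DMA, cloud storage -> edge storage (implicit via `state`)
    # S37/S38: feed the contact force back through the haptic device
    apply_force(state["contact_force"])
    # S39/S40: generate and display the video including the deformed tissue
    display(state["tissue_shape"])
    return state
```

In the actual system the S34/S35 stage would run at the fast simulation cycle on the cloud side while drawing runs at the slower 60 Hz cycle on the edge side; this sketch collapses both into one pass for clarity.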
- the simulation system 100 may include a plurality of information processing devices 1, each of which is used by a different user U. Multi-point connection via the all-optical communication network N makes this possible. This will be explained with reference to FIG. 8.
- FIG. 8 is a diagram showing a schematic configuration of a modified example.
- Examples of the plurality of information processing devices 1 include an information processing device 1-1 and an information processing device 1-2.
- the information processing device 1-1 and the information processing device 1-2 may be placed apart from each other or may be placed close to each other.
- a user U who operates (the haptic device 6 of) the information processing apparatus 1-1 is shown as a user U-1.
- a user U who operates (the haptic device 6 of) the information processing apparatus 1-2 is shown as a user U-2.
- user U-1 is a surgeon and user U-2 is an assistant.
- the operations of both user U-1 and user U-2 are reflected in the simulation.
- user U-1 and user U-2 operate their respective surgical tools to treat the same soft tissue O. This will be explained with reference to FIG. 9.
- FIG. 9 is a diagram schematically showing the flow of the simulation.
- the surgical tool T operated by the user U-1 is referred to as surgical tool T-1 in the drawing.
- the surgical tool T operated by the user U-2 is referred to as surgical tool T-2 in the drawing.
- In step S51, the simulation model is calculated.
- the processing unit 21 of the information processing device 2 calculates a simulation model based on the motion of the surgical tool T-1 and the motion of the surgical tool T-2.
- In step S52, the simulation model is updated. The updated data is transferred to the information processing device 1-1 and the information processing device 1-2.
- In step S53, drawing/haptics control is performed by the information processing device 1-1.
- the processing unit 11 of the information processing device 1-1 generates an image based on the updated simulation model, and also controls the haptic device 6.
- in step S54, drawing/haptics control is performed by the information processing device 1-2.
- the processing unit 11 of the information processing device 1-2 generates an image based on the updated simulation model, and also controls the haptic device 6.
- in step S55, the user U-1 performs an operation.
- the processing unit 11 of the information processing device 1-1 simulates the motion of the surgical tool T-1 in response to the operation of the haptic device 6 by the user U-1. As illustrated by the arrow extending from the broken line, the motion of the surgical tool T-1 changes.
- the motion data of the surgical tool T-1 is transferred to the information processing device 2.
- in step S56, the user U-2 performs an operation.
- the processing unit 11 of the information processing device 1-2 simulates the motion of the surgical tool T-2 in response to the operation of the haptic device 6 by the user U-2. As illustrated by the arrow extending from the broken line, the motion of the surgical tool T-2 changes.
- the motion data of the surgical tool T-2 is transferred to the information processing device 2.
- in step S57, the motion data of the surgical tool T on the information processing device 2 side is updated.
- each process is repeatedly executed, with the processing from step S51 to step S57 as one cycle.
- the processing from step S58 to step S61 shown in FIG. 9 is similar to the processing from step S52 to step S54, so the description will not be repeated.
- a plurality of users U can treat the same soft tissue O using different surgical tools T. Surgical training by a plurality of users U becomes possible.
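The cycle of steps S51 to S57 can be sketched in code. The following Python sketch is illustrative only (the class names, the dictionary-based model state, and the pose tuples are assumptions, not part of the disclosure): a central model server corresponding to the information processing device 2 merges the tool motions reported by the clients, advances the shared model, and hands the update to every client.

```python
# Illustrative sketch of one cycle of steps S51-S57 (names are assumptions).
class ModelServer:
    """Stands in for the information processing device 2."""

    def __init__(self):
        self.tool_poses = {}   # latest pose reported for each tool
        self.model_version = 0

    def receive_motion(self, tool_id, pose):
        # Steps S55/S56: a user's operation updates that tool's motion data.
        self.tool_poses[tool_id] = pose

    def step(self):
        # Step S51: compute the simulation model from all tool motions.
        self.model_version += 1
        # Step S52: the updated model is handed to every client.
        return {"version": self.model_version, "tools": dict(self.tool_poses)}


class Client:
    """Stands in for an information processing device 1-1 or 1-2."""

    def __init__(self, name):
        self.name = name
        self.state = None

    def apply_update(self, state):
        # Steps S53/S54: draw and drive the haptic device from the new state
        # (rendering and haptics control are omitted in this sketch).
        self.state = state


server = ModelServer()
clients = [Client("1-1"), Client("1-2")]
server.receive_motion("T-1", (0.0, 0.0, 0.1))  # user U-1 operates
server.receive_motion("T-2", (0.2, 0.0, 0.0))  # user U-2 operates
update = server.step()
for client in clients:
    client.apply_update(update)
```

In a real system the motion data and model updates would travel over the all-optical communication network N rather than through in-process calls.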
- the simulation described so far may be applied to a surgical robot simulation.
- the user U operates a robot arm that supports the surgical tool T via the haptic device 6. Since the contact force is fed back to the user U, the user U can evaluate whether the robot arm, and even the organs, can be manipulated with an appropriate operating force and whether excessive force is being applied, and can thereby learn safe robot operation.
- simulation may be incorporated into a master-slave system. This will be explained with reference to FIG.
- FIG. 10 is a diagram showing a schematic configuration of a modified example.
- the simulation system 100 includes a leader device 3, a follower device 4, and an information processing device 2.
- the basic components of the leader device 3 are the same as those of the information processing device 1 described above.
- the communication control unit 14 of the leader device 3 also controls communication between the leader device 3 and the follower device 4.
- the follower device 4 includes a camera 8, a robot arm 9, a processing unit 41, an IF unit 43, and a communication control unit 44.
- a camera 8 photographs the surgical field, and a robot arm 9 supports a surgical tool T.
- the processing unit 41 functions as an overall control unit that controls each element of the follower device 4.
- the processing unit 41 observes the slave environment including the patient's condition based on the image of the camera 8.
- the slave environment may also include non-visual information such as the patient's blood pressure and heart rate.
- the processing unit 41 controls the robot arm 9 and switches the position of the camera 8 in accordance with the operation of the haptic device 6 in the leader device 3.
- the IF unit 43 provides an interface between the processing unit 41, the camera 8, and the robot arm 9.
- the communication control unit 44 controls communication between the follower device 4 and the leader device 3. It may also control communication between the follower device 4 and the information processing device 2.
- the leader device 3 and follower device 4 function as a master device and slave device that can be controlled bilaterally.
- the surgery proceeds by controlling the robot arm 9 that supports the surgical tool T in a master-slave manner.
- the image captured by the camera 8 of the follower device 4 is transmitted from the follower device 4 to the leader device 3 and displayed on the monitor 7 of the leader device 3.
- Information on other slave environments is also transmitted from the follower device 4 to the leader device 3 and presented.
- Position/force information from the haptic device 6 of the leader device 3 is transmitted from the leader device 3 to the follower device 4 and reflected in the control of the robot arm 9.
- Position/force information from the robot arm 9 is transmitted from the follower device 4 to the leader device 3 and fed back to the user via the haptic device 6 of the leader device 3.
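The bilateral exchange above (position down to the follower, measured force back to the leader) can be illustrated with a minimal one-dimensional sketch. The spring constant, the idealized position tracking, and the function names are hypothetical simplifications, not the disclosed control law:

```python
# Illustrative 1-D bilateral exchange (constants and names are assumptions).
STIFFNESS = 500.0  # N/m, stand-in for the tissue the arm is touching

def follower_step(commanded_pos, wall_pos=0.05):
    """Track the commanded position and return the measured reaction force."""
    arm_pos = commanded_pos                     # idealized position tracking
    penetration = max(0.0, arm_pos - wall_pos)  # indentation past the surface
    reaction_force = -STIFFNESS * penetration   # spring-like contact force
    return arm_pos, reaction_force

def leader_step(haptic_pos):
    """One cycle: send the position to the follower, feed the force back."""
    arm_pos, force = follower_step(haptic_pos)
    return arm_pos, force  # the force would drive the haptic device's motors

arm_pos, force = leader_step(0.07)  # the user pushes 2 cm past the surface
```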
- a highly accurate simulation model as described above is calculated and updated.
- the simulation results are reflected in the control of the robot arm 9. For example, when position/force information from the haptic device 6 of the leader device 3 is transmitted to the follower device 4, control of the robot arm 9 is restricted to ensure safety.
- the processing unit 21 of the information processing device 2 executes the safety determination process based on the calculation results of the simulation model. For example, the processing unit 21 determines whether or not operating the robot arm 9 of the follower device 4 via the leader device 3 is dangerous. If it is determined that it is dangerous, the processing unit 21 intervenes in the operation of the follower device 4 by the leader device 3. For example, the processing unit 21 modifies the position/force information transmitted from the leader device 3 to the follower device 4 so that the amount of movement of the robot arm 9 is limited or the operating force is limited. The corrected position/force information is transmitted from the leader device 3 to the follower device 4. Further, the determination result and control result by the processing unit 21 of the information processing device 2 are notified to the leader device 3 and follower device 4.
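The intervention described here amounts to clamping the position/force information before it reaches the follower. A minimal hypothetical sketch follows; the limits and function names are illustrative, not values from the disclosure:

```python
# Illustrative safety filter on leader-to-follower commands (limits assumed).
MAX_STEP_M = 0.005   # maximum allowed movement per cycle (assumed limit)
MAX_FORCE_N = 2.0    # maximum allowed operating force (assumed limit)

def clamp(value, limit):
    return max(-limit, min(limit, value))

def safety_filter(prev_pos, commanded_pos, commanded_force):
    """Return the corrected command and whether an intervention occurred."""
    step = commanded_pos - prev_pos
    safe_step = clamp(step, MAX_STEP_M)
    safe_force = clamp(commanded_force, MAX_FORCE_N)
    intervened = (safe_step != step) or (safe_force != commanded_force)
    return prev_pos + safe_step, safe_force, intervened

# A sudden 3 cm jump with 5 N of force is limited before reaching the arm.
pos, force, flagged = safety_filter(0.0, 0.03, 5.0)
```

The `flagged` result corresponds to notifying the leader and follower devices that an intervention took place.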
- a state slightly ahead of the current time may be simulated by predicting future movement from the motion data of the haptic device 6.
- with the high computing power of the information processing device 2, such a simulation can also be performed in real time, further improving safety.
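Predicting a state slightly ahead of the current time can be as simple as extrapolating the haptic device's recent motion data; a real system would use a more elaborate predictor. A hypothetical one-dimensional sketch (function name and sampling values are assumptions):

```python
# Illustrative look-ahead: linearly extrapolate the device's next position.
def predict(positions, dt, lookahead):
    """Extrapolate from the last two samples (a real predictor would do more)."""
    if len(positions) < 2:
        return positions[-1]
    velocity = (positions[-1] - positions[-2]) / dt
    return positions[-1] + velocity * lookahead

samples = [0.00, 0.01, 0.02]  # device moving 1 cm per 10 ms sample
future = predict(samples, dt=0.010, lookahead=0.020)  # state 20 ms ahead
```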
- FIG. 11 is a diagram showing an example of the hardware configuration of the device.
- the information processing device 1, information processing device 2, etc. described so far are realized by, for example, a computer 1000 shown in FIG. 11.
- the computer 1000 has a CPU/GPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. The parts of the computer 1000 are connected by a bus 1050.
- the CPU/GPU 1100 operates based on a program stored in the ROM 1300 or HDD 1400 and controls each part. For example, the CPU/GPU 1100 loads programs stored in the ROM 1300 or HDD 1400 into the RAM 1200, and executes processes corresponding to various programs.
- the ROM 1300 stores boot programs such as BIOS (Basic Input Output System) that are executed by the CPU/GPU 1100 when the computer 1000 is started, programs that depend on the hardware of the computer 1000, and the like.
- the HDD 1400 is a computer-readable recording medium that non-temporarily records programs executed by the CPU/GPU 1100 and data used by the programs.
- HDD 1400 is a recording medium that records a program for an information processing method according to the present disclosure, which is an example of program data 1450.
- the communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet).
- the CPU/GPU 1100 receives data from other devices or transmits data generated by the CPU/GPU 1100 to other devices via the communication interface 1500.
- the input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000.
- the CPU/GPU 1100 receives data from an input device such as a keyboard or mouse via the input/output interface 1600. Further, the CPU/GPU 1100 transmits data to an output device such as a display, speaker, or printer via an input/output interface 1600.
- the CPU/GPU 1100 receives motion data from the haptic device 6 and sends the simulated contact force to the haptic device 6 via the input/output interface 1600.
- the input/output interface 1600 may function as a media interface that reads programs and the like recorded on a predetermined computer-readable recording medium.
- the media is, for example, an optical recording medium such as a DVD (Digital Versatile Disc), a PD (Phase Change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical Disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
- the CPU/GPU 1100 of the computer 1000 realizes the functions of the processing unit 11 or the processing unit 21 by executing a program loaded into the RAM 1200. The program may also be stored in the HDD 1400. Note that while the CPU/GPU 1100 reads the program data 1450 from the HDD 1400 and executes it, the program may, as another example, be acquired from another device via the external network 1550.
- Each of the above components may be constructed using general-purpose members, or may be constructed using hardware specialized for the function of each component. Such a configuration may be changed as appropriate depending on the level of technology at the time of implementation.
- the simulation system 100 includes an information processing device 1 (first information processing device) and an information processing device 2 (second information processing device) that cooperate with each other to provide a surgical simulation.
- the information processing device 1 and the information processing device 2 transmit and receive simulation data to and from each other via the all-optical communication network N.
- a simulation is provided by the cooperation of the information processing device 1 and the information processing device 2. Further, using the all-optical communication network N reduces the communication delay between the information processing device 1 and the information processing device 2. Therefore, simulation performance can be improved compared with, for example, a case where the simulation is provided by the information processing device 1 alone. This is particularly useful for simulations requiring real-time performance, such as haptic presentation, and enables, for example, higher-precision simulation of the deformation of flexible objects and minute tissues.
- the information processing device 1 and the information processing device 2 may send and receive simulation data to and from each other using remote DMA. This makes it possible to further reduce delay.
- the information processing device 2 may calculate a simulation model that includes soft tissue O (for example, an organ). Calculation of the simulation model may include calculation of deformation of the soft tissue O. Further, the information processing device 2 may calculate the contact force caused by the interaction between the soft tissue O and the surgical tool T. By having the information processing device 2 execute such processing, which requires a large calculation load, simulation performance can be improved. It also becomes possible to run the simulation at a cycle faster than that used in general video processing and suited to haptic calculation, for example, 1 kHz or higher.
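As a rough illustration of the kind of per-cycle contact-force computation that benefits from being offloaded, the following sketch uses a one-dimensional penalty (spring-damper) contact model. The constants and the model itself are illustrative assumptions, not the disclosed method:

```python
# Illustrative 1-D penalty contact model (constants are assumptions).
K_TISSUE = 800.0  # N/m, assumed tissue stiffness
D_TISSUE = 5.0    # N*s/m, assumed tissue damping

def contact_force(tool_depth, tool_velocity):
    """Force on the tool while it indents the tissue; zero when not touching."""
    if tool_depth <= 0.0:
        return 0.0                  # tool not in contact
    force = K_TISSUE * tool_depth + D_TISSUE * tool_velocity
    return max(force, 0.0)          # tissue can push back, never pull

# Tool indenting 5 mm while still moving inward at 1 cm/s:
f = contact_force(tool_depth=0.005, tool_velocity=0.01)
```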
- the information processing device 1 may control the haptic device 6 operated by the user U so that the contact force calculated by the information processing device 2 is fed back to the user U.
- the information processing device 2 calculates the contact force at a predetermined period (along with model deformation, collision detection, gravity, and the like), and the information processing device 1 controls the haptic device 6 so that the contact force calculated by the information processing device 2 is fed back to the user U at that period. The predetermined period is shorter than the calculation period that would result if the information processing device 1 itself performed the calculation of the simulation model including the calculation of the contact force, and is equal to or longer than the control period of the haptic device 6.
- the predetermined period may correspond to a rate of 60 Hz or higher, up to on the order of 1 kHz. This enables precise haptic feedback.
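The relationship between the fast haptic period and the slower video period can be illustrated by counting updates over one simulated second; the rates and loop structure below are illustrative assumptions, not the disclosed implementation:

```python
# Illustrative two-rate loop: a fast haptic update alongside a slower video rate.
HAPTIC_HZ = 1000  # assumed haptic period (1 kHz)
VIDEO_HZ = 60     # assumed video period

haptic_updates = 0
video_updates = 0
for tick in range(HAPTIC_HZ):                 # one simulated second, 1 ms ticks
    haptic_updates += 1                       # contact force computed/fed back
    if tick % (HAPTIC_HZ // VIDEO_HZ) == 0:   # every 16th tick: draw a frame
        video_updates += 1                    # roughly 60 frames per second
```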
- the haptic device 6 may include a joint 61 in which at least one of acceleration and force is controlled.
- the haptic device 6 may include a joint 61 provided with at least one of a six-axis force sensor and a torque sensor. By using such a haptic device 6, more precise haptic feedback is possible.
- the information processing device 1 may be a plurality of information processing devices 1 (for example, the information processing device 1-1 and the information processing device 1-2), each used by a different user U (for example, the user U-1 and the user U-2). This makes it possible to simulate surgery performed by a plurality of users U.
- the surgery may be a surgery in which the robot arm 9 supporting the surgical tool T is controlled in a master-slave manner, and the results of the simulation may be reflected in the control of the robot arm 9.
- simulations can be incorporated and used in master-slave systems.
- the information processing device 2 described with reference to FIGS. 1, 2, etc. is also one of the disclosed technologies.
- the information processing device 2 provides a surgical simulation in cooperation with the information processing device 1 (another information processing device).
- the information processing device 2 includes a processing unit 21 that calculates a simulation model including the soft tissue O, and an optical transmission control unit 25 that controls transmission and reception of simulation data to and from the information processing device 1 via the all-optical communication network N. Even with such an information processing device 2, simulation performance can be improved as described above.
- the information processing method described with reference to FIG. 7 and the like is also one of the disclosed technologies.
- the information processing method includes the information processing device 1 (first information processing device) and the information processing device 2 (second information processing device) transmitting and receiving simulation data to and from each other via the all-optical communication network N (step S33, step S36).
- simulation performance can also be improved by such an information processing method.
- the present technology can also have the following configuration.
- (1) A simulation system comprising a first information processing device and a second information processing device that cooperate with each other to provide a surgical simulation, wherein the first information processing device and the second information processing device transmit and receive simulation data to and from each other via an all-optical communication network.
- (2) The simulation system according to (1), wherein the first information processing device and the second information processing device transmit and receive the simulation data to and from each other by remote DMA (Direct Memory Access).
- (3) The simulation system according to (1) or (2), wherein the second information processing device calculates a simulation model including soft tissue.
- (4) The simulation system according to (3), wherein the calculation of the simulation model includes calculation of deformation of the soft tissue.
- (5) The simulation system according to (4), wherein the soft tissue includes an organ.
- (6) The simulation system according to (4) or (5), wherein the second information processing device calculates a contact force caused by interaction between the soft tissue and a surgical tool.
- (7) The simulation system according to (6), wherein the first information processing device controls a haptic device operated by a user so that the contact force calculated by the second information processing device is fed back to the user.
- (8) The simulation system according to (7), wherein the second information processing device calculates the contact force at a predetermined period, the first information processing device controls the haptic device so that the contact force calculated by the second information processing device is fed back to the user at the period, and the predetermined period is shorter than the calculation period that would result if the first information processing device performed the calculation of the simulation model including the calculation of the contact force, and is equal to or longer than the control period of the haptic device.
- (9) The simulation system according to (8), wherein the predetermined period corresponds to a rate of 60 Hz or higher and on the order of 1 kHz.
- (10) The simulation system according to (8) or (9), wherein the haptic device includes a joint in which at least one of acceleration and force is controlled.
- (11) The simulation system according to any one of (8) to (10), wherein the haptic device includes a joint provided with at least one of a six-axis force sensor and a torque sensor.
- (12) The simulation system according to any one of (1) to (11), wherein the first information processing device is a plurality of first information processing devices, each used by a different user.
- (13) The simulation system according to any one of (1) to (6), wherein the surgery is a surgery in which a robot arm supporting a surgical tool is controlled in a master-slave manner, and a result of the simulation is reflected in control of the robot arm.
- (14) An information processing device that provides a surgical simulation in cooperation with another information processing device, the information processing device comprising: a processing unit that calculates a simulation model including soft tissue; and an optical transmission control unit that controls transmission and reception of simulation data to and from the other information processing device via an all-optical communication network.
- (15) An information processing method in which a first information processing device and a second information processing device cooperate with each other to provide a surgical simulation, the method comprising the first information processing device and the second information processing device transmitting and receiving simulation data to and from each other via an all-optical communication network.
Abstract
Description
0. Introduction
1. Embodiment
2. Modifications
3. Example hardware configuration
4. Example effects
Surgical operations require a high level of skill from physicians, and training is therefore important. Conventional surgical simulators can display images imitating organs and allow the procedure itself to be learned, but they do not accurately reproduce the deformation and force sensation of actual flexible organs, so their use is limited to initial training. In addition, because handling a large-scale human body model is difficult, they are limited to model simulations of partial organs.
FIG. 1 is a diagram schematically showing an overview of the simulation system according to the embodiment. The simulation system 100 can be used as a haptics surgical simulator capable of large-scale computation. For example, it can be used for training and preoperative planning of surgery that handles soft tissue (flexible organs and the like), such as laparoscopic surgery, thoracic surgery, and neurosurgery. Training while feeling the deformation of soft tissue and realistic reaction forces provides a more accurate advance experience and is expected to improve skills and surgical outcomes.
The disclosed technology is not limited to the embodiment described above. Several modifications will be described. For example, the simulation system 100 may include a plurality of information processing devices 1, each used by a different user U. Multi-point connection via the all-optical communication network N is possible. This will be explained with reference to FIG. 8.
FIG. 11 is a diagram showing an example of the hardware configuration of the devices. The information processing device 1, the information processing device 2, and the like described so far are realized by, for example, the computer 1000 shown in FIG. 11.
The technology described above can be specified, for example, as follows. One of the disclosed technologies is the simulation system 100. As described with reference to FIGS. 1 and 2, the simulation system 100 includes an information processing device 1 (first information processing device) and an information processing device 2 (second information processing device) that cooperate with each other to provide a surgical simulation. The information processing device 1 and the information processing device 2 transmit and receive simulation data to and from each other via the all-optical communication network N.
11 Processing unit
12 Storage unit
13 IF unit
14 Communication control unit
15 Optical transmission control unit
16 Optical transmission device
2 Information processing device
21 Processing unit
22 Storage unit
24 Communication control unit
25 Optical transmission control unit
26 Optical transmission device
27 Virtual layer
3 Leader device (master device)
4 Follower device (slave device)
41 Processing unit
43 IF unit
44 Communication control unit
6 Haptic device
61 Joint
611 Motor
612 Encoder
613 Sensor
614 Motor control unit
615 Driver
62 Haptics control unit
7 Monitor
8 Camera
9 Robot arm
100 Simulation system
1000 Computer
1050 Bus
1100 CPU/GPU
1200 RAM
1300 ROM
1400 HDD
1450 Program data
1500 Communication interface
1600 Input/output interface
1650 Input/output device
C Camera
F Field of view
N All-optical communication network
O Soft tissue
R Robot arm
T Surgical tool
U User
Claims (15)
- 1. A simulation system comprising a first information processing device and a second information processing device that cooperate with each other to provide a surgical simulation, wherein the first information processing device and the second information processing device transmit and receive simulation data to and from each other via an all-optical communication network.
- 2. The simulation system according to claim 1, wherein the first information processing device and the second information processing device transmit and receive the simulation data to and from each other by remote DMA (Direct Memory Access).
- 3. The simulation system according to claim 1, wherein the second information processing device calculates a simulation model including soft tissue.
- 4. The simulation system according to claim 3, wherein the calculation of the simulation model includes calculation of deformation of the soft tissue.
- 5. The simulation system according to claim 4, wherein the soft tissue includes an organ.
- 6. The simulation system according to claim 4, wherein the second information processing device calculates a contact force caused by interaction between the soft tissue and a surgical tool.
- 7. The simulation system according to claim 6, wherein the first information processing device controls a haptic device operated by a user so that the contact force calculated by the second information processing device is fed back to the user.
- 8. The simulation system according to claim 7, wherein the second information processing device calculates the contact force at a predetermined period, the first information processing device controls the haptic device so that the contact force calculated by the second information processing device is fed back to the user at the period, and the predetermined period is shorter than the calculation period that would result if the first information processing device performed the calculation of the simulation model including the calculation of the contact force, and is equal to or longer than the control period of the haptic device.
- 9. The simulation system according to claim 8, wherein the predetermined period corresponds to a rate of 60 Hz or higher and on the order of 1 kHz.
- 10. The simulation system according to claim 7, wherein the haptic device includes a joint in which at least one of acceleration and force is controlled.
- 11. The simulation system according to claim 7, wherein the haptic device includes a joint provided with at least one of a six-axis force sensor and a torque sensor.
- 12. The simulation system according to claim 1, wherein the first information processing device is a plurality of first information processing devices, each used by a different user.
- 13. The simulation system according to claim 1, wherein the surgery is a surgery in which a robot arm supporting a surgical tool is controlled in a master-slave manner, and a result of the simulation is reflected in control of the robot arm.
- 14. An information processing device that provides a surgical simulation in cooperation with another information processing device, the information processing device comprising: a processing unit that calculates a simulation model including soft tissue; and an optical transmission control unit that controls transmission and reception of simulation data to and from the other information processing device via an all-optical communication network.
- 15. An information processing method in which a first information processing device and a second information processing device cooperate with each other to provide a surgical simulation, the method comprising the first information processing device and the second information processing device transmitting and receiving simulation data to and from each other via an all-optical communication network.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2023230574A AU2023230574A1 (en) | 2022-03-08 | 2023-02-27 | Simulation system, information processing device, and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022035336 | 2022-03-08 | ||
JP2022-035336 | 2022-03-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023171437A1 true WO2023171437A1 (ja) | 2023-09-14 |
Family
ID=87935147
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/006943 WO2023171437A1 (ja) | 2022-03-08 | 2023-02-27 | シミュレーションシステム、情報処理装置及び情報処理方法 |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU2023230574A1 (ja) |
WO (1) | WO2023171437A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0591060A (ja) * | 1991-02-20 | 1993-04-09 | Centre Natl Etud Telecommun (Ptt) | 光通信ネツトワーク |
JP2001339418A (ja) * | 2000-05-25 | 2001-12-07 | Fuji Xerox Co Ltd | 光通信ネットワーク装置 |
JP2003144453A (ja) * | 2001-11-09 | 2003-05-20 | Sony Corp | 情報処理システムおよび情報処理方法、プログラムおよび記録媒体、情報処理装置、並びに制御装置および制御方法 |
JP2008080021A (ja) * | 2006-09-28 | 2008-04-10 | Univ Waseda | シミュレーション装置、制御装置及びこれらを用いた手術用ロボットの制御システム、並びにシミュレーション装置用のプログラム |
JP2013152320A (ja) * | 2012-01-25 | 2013-08-08 | Mitsubishi Precision Co Ltd | 手術シミュレーション用モデル作成方法及びその装置並びに手術シミュレーション方法及びその装置 |
JP2017510826A (ja) * | 2013-12-20 | 2017-04-13 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | 医療処置トレーニングのためのシミュレータシステム |
JP2020062494A (ja) | 2012-09-17 | 2020-04-23 | デピュイ・シンセス・プロダクツ・インコーポレイテッド | 外科およびインターベンションの計画、支援、術後経過観察、ならびに機能回復追跡のためのシステムおよび方法 |
- 2023-02-27: AU AU2023230574A patent/AU2023230574A1/en active Pending
- 2023-02-27: WO PCT/JP2023/006943 patent/WO2023171437A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
AU2023230574A1 (en) | 2024-08-08 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23766615; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2024506076; Country of ref document: JP; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 2023230574; Country of ref document: AU; Date of ref document: 20230227; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2023766615; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2023766615; Country of ref document: EP; Effective date: 20241008 |