WO2023171263A1 - Control device and medical robot - Google Patents

Control device and medical robot

Info

Publication number
WO2023171263A1
Authority
WO
WIPO (PCT)
Prior art keywords
physical interface
operator
robot arm
control device
robot
Prior art date
Application number
PCT/JP2023/005129
Other languages
English (en)
Japanese (ja)
Inventor
淳 新井
隆弘 柘植
桐郎 増井
岳夫 稲垣
景 戸松
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2023171263A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots

Definitions

  • the present disclosure relates to a control device and a medical robot.
  • Medical robots that autonomously perform operations using surgical instruments are known. For example, in endoscopic surgery, an endoscope that operates autonomously is used to photograph the inside of a patient's abdominal cavity, and the photographed images are displayed on a display. By performing surgery while looking at the captured images displayed on the display, the operator (surgeon) can concentrate both hands on operating the surgical instruments. A hands-free operation means is required as an operation means for such a medical robot.
  • Patent Document 1 discloses a technique in which a physical interface for operating a medical robot is attached to a surgical instrument.
  • An object of the present disclosure is to provide a control device and a medical robot that can control the operation of a medical robot in a hands-free manner and manage risks in hands-free operation.
  • a control device includes a physical interface that can be operated by a surgeon with a body part other than his or her own hand, and controls the operation of a medical robot in accordance with the surgeon's operation of the physical interface.
  • FIG. 1A is a schematic diagram showing an example of the arrangement of a medical robot according to existing technology.
  • FIG. 1B is a schematic diagram showing an example of the arrangement of a medical robot according to existing technology.
  • FIG. 1C is a schematic diagram showing an example of the arrangement of a medical robot according to existing technology.
  • 1 is a diagram schematically showing an example of the configuration of an endoscopic surgery system according to existing technology.
  • FIG. 3 is a schematic diagram showing an example of a physical interface according to an embodiment.
  • FIG. 4 is a schematic diagram for explaining the relationship between a robot arm and a surgical bed.
  • FIG. 5 is a schematic diagram for explaining the illumination function of the physical interface according to the embodiment.
  • FIG. 6 is a diagram schematically showing an example of the configuration of an endoscopic surgery system according to the embodiment.
  • FIG. 7 is a functional block diagram of an example for explaining the functions of the robot arm system according to the embodiment.
  • FIG. 8 is a schematic diagram for explaining an example of effective arrangement of physical interfaces according to the embodiment.
  • FIG. 9 is a schematic diagram for explaining an example of effective arrangement of physical interfaces according to the embodiment.
  • A schematic diagram showing an example of operation mode transition of the robot arm according to a first application example of the embodiment.
  • A schematic diagram showing an example of operation mode transition of the robot arm according to a second application example of the embodiment.
  • A schematic diagram showing an example of an emergency stop button provided on a robot arm.
  • A schematic diagram showing an example of the arrangement of physical interfaces according to a first modification of the embodiment.
  • A schematic diagram showing an example of a physical interface according to a first example of the first modification of the embodiment.
  • A schematic diagram showing an example of a physical interface according to a second example of the first modification of the embodiment.
  • A schematic diagram showing an example of a physical interface according to a third example of the first modification of the embodiment.
  • A schematic diagram showing an example of the arrangement of physical interfaces according to a second modification of the embodiment.
  • A schematic diagram showing an example of a physical interface according to a first example of the second modification of the embodiment.
  • A schematic diagram showing an example of a physical interface according to a second example of the second modification of the embodiment.
  • a control device for controlling the operation of a medical robot, particularly a robot arm that assists a surgeon in surgery.
  • a control device includes a physical interface configured to be operable by a surgeon with a body part other than his/her own hand.
  • a physical interface is an interface that has a physical substance and is used to convert an operation by an operator (in this case, a surgeon) into an electrical signal.
  • FIGS. 1A to 1C are schematic diagrams showing examples of the arrangement of medical robots according to existing technology.
  • the medical robot is a robot arm that assists a surgeon during surgery.
  • FIG. 1A is a top view of a patient 101 lying on a surgical bed 100 and an operator 102 standing next to the surgical bed 100 for a surgical procedure.
  • FIG. 1B is an overhead view of the state shown in FIG. 1A from diagonally above and behind the operator 102.
  • FIG. 1C is an overhead view of the state shown in FIG. 1A from the opposite side of FIG. 1B.
  • the surgical bed 100 is provided with bed rails 110 on the side.
  • bed rails 110 are provided on both sides of the surgical bed 100 for each movable region.
  • the surgical bed 100 is held at a predetermined height from the floor by a pedestal 140.
  • the robot arm 120 includes a plurality of joints and an arm that connects the joints. By driving a plurality of joints in a predetermined manner, the robot arm 120 can freely change its posture within the movable range of each joint.
  • the robot arm 120 is installed on a trolley 130 and is used as a floor-standing device.
  • the surgical bed 100 is capable of changing the inclination for each movable region
  • the robot arm 120 operates independently of the inclination of each movable region of the surgical bed 100.
  • FIG. 2 is a diagram schematically showing an example of the configuration of an endoscopic surgery system 5000 according to existing technology.
  • FIG. 2 shows a surgeon 102 performing surgery on a patient 101 on a surgical bed 100 using an endoscopic surgery system 5000. Further, in FIG. 2, the surgical bed 100 is shown as seen from the feet or head side of the patient 101, and bed rails 110 are shown on both sides thereof. For example, instruments used in surgery are attached to the bed rail 110 using clamps or the like.
  • the endoscopic surgery system 5000 includes an endoscope 5001, other surgical instruments 5017, a robot arm 120, which is a medical robot that supports the endoscope 5001, and a rack 5037 containing various devices for endoscopic surgery.
  • the rack 5037 is mounted on the trolley 130.
  • trocars 5025a to 5025d are punctured into the abdominal wall. Then, the lens barrel 5003 of the endoscope 5001 and other surgical instruments 5017 are inserted into the body cavity of the patient 101 from the trocars 5025a to 5025d.
  • a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 101.
  • the energy treatment tool 5021 is a treatment tool that performs incision and exfoliation of tissue, sealing of blood vessels, etc. using high frequency current or ultrasonic vibration.
  • the surgical tools 5017 shown in FIG. 2 are merely examples; various surgical tools commonly used in endoscopic surgery, such as a lever or a retractor, may be used as the surgical tools 5017.
  • An image of the surgical site inside the body cavity of the patient 101 taken by the endoscope 5001 is displayed on the display device 5041.
  • the surgeon 102 uses the energy treatment instrument 5021 and forceps 5023 to perform a treatment such as cutting off the affected area while viewing the image of the surgical site displayed on the display device 5041 in real time.
  • the pneumoperitoneum tube 5019, the energy treatment instrument 5021, and the forceps 5023 are supported by the operator 102, an assistant, or the like during the surgery.
  • Robot arm 120 includes an arm portion 5031 extending from base portion 5029.
  • the arm portion 5031 includes joint portions 5033a, 5033b, 5033c, and links 5035a, 5035b, and is driven by control from an arm control device 5045.
  • Endoscope 5001 is supported by arm portion 5031, and its position and/or posture is controlled. Thereby, the endoscope 5001 can be stably fixed in position.
  • the arm control device 5045 is capable of autonomously controlling the operation of the robot arm 120, for example, based on a model learned by machine learning.
  • the position of the endoscope indicates the position of the endoscope in space, and can be expressed as three-dimensional coordinates such as coordinates (x, y, z), for example.
  • the posture of the endoscope indicates the direction in which the endoscope faces, and can be expressed as a three-dimensional vector, for example.
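As an illustration of the representation just described, the endoscope's position (three-dimensional coordinates) and posture (a three-dimensional direction vector) could be held in a small structure like the following; the class and field names are hypothetical, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class EndoscopePose:
    """Position as 3D coordinates (x, y, z); posture as the 3D
    direction vector the endoscope faces."""
    x: float
    y: float
    z: float
    direction: tuple  # (dx, dy, dz)

    def normalized_direction(self):
        """Return the posture as a unit vector."""
        dx, dy, dz = self.direction
        norm = (dx * dx + dy * dy + dz * dz) ** 0.5
        return (dx / norm, dy / norm, dz / norm)
```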
  • the endoscope 5001 will be briefly described.
  • the endoscope 5001 includes a lens barrel 5003 whose distal end is inserted into the body cavity of the patient 101 over a predetermined length, and a camera head 5005 connected to the proximal end of the lens barrel 5003.
  • an endoscope 5001 configured as a so-called rigid scope having a rigid lens barrel 5003 is shown, but the endoscope 5001 may instead be configured as a so-called flexible scope having a flexible lens barrel 5003.
  • An opening into which an objective lens is fitted is provided at the tip of the lens barrel 5003.
  • a light source device 5043 mounted on the rack 5037 is connected to the endoscope 5001; light generated by the light source device 5043 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 5003, and is irradiated toward the observation target in the body cavity of the patient 101 via the objective lens.
  • the endoscope 5001 may be a forward-viewing scope, an oblique-viewing scope, or a side-viewing scope.
  • An optical system and an image sensor are provided inside the camera head 5005, and reflected light (observation light) from an observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to a camera control unit (CCU) 5039.
  • the camera head 5005 is equipped with a function of adjusting magnification and focal length by appropriately driving its optical system.
  • the camera head 5005 may be provided with a plurality of image sensors, for example, in order to support stereoscopic viewing (3D display).
  • a plurality of relay optical systems are provided inside the lens barrel 5003 in order to guide observation light to each of the plurality of image sensors.
  • the rack 5037 is equipped with a CCU 5039, a light source device 5043, an arm control device 5045, an input device 5047, a treatment tool control device 5049, an insufflation device 5051, a recorder 5053, and a printer 5055.
  • the CCU 5039 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 performs various image processing, such as development processing (demosaic processing), on the image signal received from the camera head 5005 in order to display an image based on the image signal. The CCU 5039 provides the image signal subjected to the image processing to the display device 5041. Further, the CCU 5039 transmits a control signal to the camera head 5005 to control its driving.
  • the control signal may include information regarding imaging conditions such as magnification and focal length.
  • the display device 5041 displays an image based on the image signal processed by the CCU 5039, under control from the CCU 5039. If the endoscope 5001 is compatible with high-resolution imaging such as 4K (3840 horizontal pixels x 2160 vertical pixels) or 8K (7680 horizontal pixels x 4320 vertical pixels), and/or with 3D display, a display device 5041 capable of the corresponding high-resolution display and/or 3D display may be used. For high-resolution imaging such as 4K or 8K, a display device 5041 with a size of 55 inches or more gives a more immersive feeling. Furthermore, a plurality of display devices 5041 having different resolutions and sizes may be provided depending on the purpose.
  • the light source device 5043 includes a light emitting element such as an LED (light emitting diode) and a drive circuit for driving the light emitting element, and supplies irradiation light to the endoscope 5001 when photographing the surgical site.
  • the arm control device 5045 includes a processor such as a CPU, and operates according to a predetermined program to control the drive of the arm portion 5031 of the robot arm 120 according to a predetermined control method.
  • the input device 5047 is an input interface for the endoscopic surgery system 5000.
  • the user can input various information and instructions to the endoscopic surgery system 5000 via the input device 5047.
  • the user inputs various information regarding the surgery, such as patient's physical information and information about the surgical technique, via the input device 5047.
  • the user may, via the input device 5047, input an instruction to drive the arm portion 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, etc.), an instruction to drive the energy treatment instrument 5021, and the like.
  • the type of the input device 5047 is not limited, and the input device 5047 may be any of various known input devices.
  • input devices such as a mouse, keyboard, touch panel, switch, lever, joystick, etc. can be applied.
  • as the input device 5047, it is also possible to use a combination of multiple types of input devices.
  • a foot switch 5057 that is placed at the feet of an operator (for example, the operator 102) and is operated by the feet of the operator can also be applied as the input device 5047.
  • the touch panel may be provided on the display surface of the display device 5041.
  • the input device 5047 is not limited to the above example.
  • the input device 5047 can be a device worn by the user, such as a glasses-type wearable device or a head mounted display (HMD).
  • the input device 5047 can perform various inputs according to the user's gestures and line of sight detected by the devices worn by these users.
  • the input device 5047 can include a camera that can detect the user's movements. In this case, the input device 5047 can perform various inputs according to the user's gestures and line of sight detected from the video captured by the camera. Furthermore, the input device 5047 can include a microphone that can pick up the user's voice. In this case, the input device 5047 can perform voice recognition based on the voice picked up by the microphone, analyze the voice of the speaker (for example, the surgeon 102), and input various operations using voice.
  • since the input device 5047 is configured to allow various information to be input without contact, a user who belongs to the clean area (for example, the operator 102) can operate equipment belonging to the unclean area without contact. Further, since the user can operate the device without taking his or her hands off the surgical tool, the user's convenience is improved.
  • the treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for cauterizing tissue, incising, sealing blood vessels, etc.
  • the pneumoperitoneum device 5051 injects gas into the body cavity of the patient 101 through the pneumoperitoneum tube 5019 in order to secure a field of view for the endoscope 5001 and a working space for the operator 102.
  • the recorder 5053 is a device that can record various information regarding surgery.
  • the printer 5055 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
  • one conceivable approach is to add, alongside the existing foot switch 5057, another foot switch or foot pedal operated by the operator 102 as a means of switching the operation mode of the robot arm 120. However, such an addition may reduce the operability of the robot arm 120 and of medical equipment other than the robot arm 120.
  • a control device for controlling the robot arm 120 according to the present disclosure includes a physical interface that can be operated by a surgeon with a body part other than his or her own hand.
  • the operator can control the operation of the medical robot hands-free at any time, and risk management during hands-free operation of the medical robot becomes possible. Further, by using the control device according to the present disclosure, it is possible to eliminate the need for physical interface replacement work that accompanies replacement of surgical tools during surgery.
  • FIG. 3 is a schematic diagram showing an example of a physical interface according to the embodiment.
  • the physical interface 10 according to the embodiment is removably attached to the bed rail 110 of the surgical bed 100 by, for example, a clamp (not shown), and is configured as a switch that is activated by being pressed.
  • the output of the physical interface 10 is transmitted to the arm control device 5045 via the cable 11, for example.
  • the arm control device 5045 can control the operation of the robot arm 120 in response to signals transmitted from the physical interface 10.
  • the physical interface 10 is configured to be attachable at any position on the bed rail 110 of the surgical bed 100.
  • the physical interface 10 is removably attached to the bed rail 110 using a clamp or the like.
  • the physical interface 10 is preferably attached to the bed rail 110 of the surgical bed 100 at a position where it does not interfere with other equipment.
  • a plurality of physical interfaces 10 may be installed for one robot arm 120.
  • each of the plurality of physical interfaces 10 may instruct one robot arm 120 to perform the same operation.
  • in this way, control by the physical interface 10 can be easily realized.
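The behavior described above, in which any one of several physical interfaces 10 triggers the same operation on a single robot arm 120, amounts to a logical OR over the switch states. A minimal sketch (the function name is hypothetical, not part of the disclosure):

```python
def combined_switch_state(switch_states):
    """Several physical interfaces instruct the same operation on one
    robot arm: the arm reacts if any one of them is pressed (logical OR)."""
    return any(switch_states)

# e.g. three interfaces mounted at different positions on the bed rails
pressed = combined_switch_state([False, True, False])
```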
  • the physical interface 10 is attached to the surgical bed 100 (bed rail 110) at a position where the surgeon 102 can operate it with a body part other than his or her own hand, more specifically, at a position below the waist 1023 of the surgeon 102.
  • Section (b) of FIG. 3 shows an example of the mounting position of the physical interface 10 according to the embodiment. In this example, the physical interface 10 is attached at a position where it can be operated with the part of the body from the knee 1021 to the thigh 1022.
  • the surgeon 102 can operate the physical interface 10, in a manner different from the foot switch 5057, without using the hands, that is, while handling surgical instruments with both hands. Therefore, the surgeon 102 can smoothly operate the robot arm 120 even when both hands are occupied with operating a surgical instrument. That is, the surgeon 102 can smoothly operate the robot arm 120 hands-free.
  • the operability for the operator 102 is improved.
  • by attaching the physical interface 10 to the bed rail 110, there is no need to attach or detach the physical interface 10 when replacing a surgical instrument, as would be the case if the physical interface 10 were attached to a surgical instrument or to a hand. Furthermore, since the physical interface 10 is attached to the bed rail 110, it can be kept clean at low cost by placing it inside an existing covering cloth or covering it with a simple drape, which is economical.
  • FIG. 4 is a schematic diagram for explaining the relationship between the robot arm 120 and the surgical bed 100.
  • the robot arm 120 has a floor-standing configuration installed on a trolley 130, and operates independently of the surgical bed 100.
  • the angle of a part of the surgical bed 100 may be changed depending on, for example, the progress of the surgery. Therefore, if the inclination of the surgical bed 100 is changed during surgery while the lens barrel 5003 of the endoscope 5001 is inserted into the body cavity of the patient 101 by the robot arm 120, the lens barrel 5003 may move undesirably within the body cavity.
  • the physical interface 10 may include a tilt sensor that detects the inclination of the physical interface 10. Tilt information indicating the tilt detected by the tilt sensor is transmitted to the arm control device 5045 via the cable 11.
  • the physical interface 10 since the physical interface 10 according to the embodiment is attached to the bed rail 110 of the surgical bed 100, it is possible to detect the inclination of the surgical bed 100 at the attached position.
  • the physical interface 10 is attached to the surgical bed 100 at a position corresponding to the affected area to be operated on, and by controlling the operation of the robot arm 120 according to the output of the tilt sensor included in the physical interface 10, the robot arm 120 can be made to follow the inclination of the surgical bed 100.
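The tilt-following control described above can be sketched as a rotation of the tool's target position about the bed's tilt pivot, using the angle reported by the tilt sensor. This is a simplified 2D illustration under assumed geometry (function and parameter names hypothetical), not the actual control law of the disclosure:

```python
import math

def follow_bed_tilt(tool_position, tilt_deg, pivot):
    """Rotate the tool's target position about the bed's pivot so the
    robot arm follows the reported inclination of the surgical bed.
    2D sketch in the vertical plane; tilt_deg comes from the tilt sensor."""
    a = math.radians(tilt_deg)
    px, pz = pivot
    x, z = tool_position
    dx, dz = x - px, z - pz
    # standard 2D rotation of the offset vector about the pivot
    return (px + dx * math.cos(a) - dz * math.sin(a),
            pz + dx * math.sin(a) + dz * math.cos(a))
```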
  • a light emitting section is provided for the physical interface 10 to have a lighting function.
  • FIG. 5 is a schematic diagram for explaining the illumination function of the physical interface 10 according to the embodiment.
  • a patient 101 is covered with a covering cloth 20 except for the affected area, and a surgeon 102 is shown handling surgical tools with his left hand 102hL and right hand 102hR while performing surgery.
  • the bed rail 110 is covered with a covering cloth 20.
  • the physical interface 10 attached to the bed rail 110 (not shown) is located inside the covering cloth 20.
  • the physical interface 10 is covered with a transparent drape to keep it clean, so it may be difficult for the operator 102 to visually recognize the physical interface 10 directly. Therefore, the physical interface 10 according to the embodiment is provided with a light emitting section 12 to give it an illumination function, and the light emitting section 12 is made to emit light at all times during surgery, for example.
  • the light emitting unit 12 is provided at a position where the light emitted from the physical interface 10 can be easily recognized by the operator 102 when the physical interface 10 is attached to the bed rail 110, for example. Further, it is preferable that the light emitting unit 12 emits light with such intensity that the emitted light passes through the covering cloth 20 or the drape to some extent and is easily recognized by the operator 102. Note that the intensity of the light emitted by the light emitting unit 12 is preferably such that the operator 102 does not feel glare when passing through the covering cloth 20 or drape.
  • FIG. 6 is a diagram schematically showing an example of the configuration of an endoscopic surgery system 5000a according to the embodiment.
  • the endoscopic surgery system 5000a has a physical interface 10 added to the endoscopic surgery system 5000 according to the existing technology described using FIG. 2.
  • the physical interface 10 is removably attached to the bed rail 110, as described above.
  • the physical interface 10 is wired via the cable 11 to an arm control device 5045a corresponding to the arm control device 5045 in FIG. 2.
  • the physical interface 10 can also be connected to the arm control device 5045a by wireless communication without using the cable 11. However, considering the response to the operation of the physical interface 10 and risks such as communication errors, it is preferable that the physical interface 10 is connected to the arm control device 5045a by wire.
  • FIG. 7 is an example functional block diagram for explaining the functions of the robot arm system according to the embodiment.
  • the robot arm system 50 includes a robot arm control section 500 and a robot arm 120.
  • the robot arm control section 500 is included in the arm control device 5045a shown in FIG. 6.
  • the physical interface (I/F) 10 includes a switch (SW) section 13, a tilt sensor 14, and a light emitting section 12.
  • the switch unit 13 may have a configuration including, for example, a microswitch and an output circuit that outputs a signal according to an operation on the microswitch.
  • a signal output from the output circuit (referred to as a switch signal) is transmitted to the arm control device 5045a via the cable 11 and received by the robot arm control section 500.
  • the switch signal may be, for example, a signal that simply indicates on and off. Further, for example, the switch signal may be a signal that combines a synchronization pattern and a pattern indicating on and off.
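The second signal format mentioned above, a synchronization pattern combined with a bit indicating on and off, can be sketched as follows; the concrete pattern and frame layout here are assumptions for illustration only:

```python
SYNC = [1, 0, 1, 0]  # hypothetical synchronization pattern

def encode_switch_signal(pressed):
    """Build a frame combining the sync pattern with one on/off bit."""
    return SYNC + [1 if pressed else 0]

def decode_switch_signal(frame):
    """Return the on/off state, or None if the sync pattern is missing
    (e.g. a transmission error on the cable 11)."""
    if frame[:len(SYNC)] != SYNC:
        return None
    return bool(frame[len(SYNC)])
```

Checking the sync pattern lets the receiving side distinguish a valid operation from a corrupted transmission, which matters for the risk management the disclosure emphasizes.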
  • the physical interface 10 is not limited to this, and may simply be a switch that opens and closes a circuit in response to an operation. In this case, for example, the robot arm control unit 500 needs to constantly supply a signal to the physical interface 10.
  • the tilt sensor 14 detects the tilt of the physical interface 10 and outputs tilt information indicating the detected tilt. For example, the tilt sensor 14 detects the angle of the physical interface 10 with respect to the direction of gravity, and outputs the detected angle as tilt information. As the tilt sensor 14, for example, a gyro sensor that detects tilt based on angular velocity about three axes can be used.
  • the tilt information output from the tilt sensor 14 is transmitted to the arm control device 5045a via the cable 11 and received by the robot arm control unit 500.
  • the light emitting unit 12 can use an LED (Light Emitting Diode) as a light emitting element.
  • the light emitting unit 12 may be turned on and off by, for example, a switch provided on the physical interface 10.
  • the present invention is not limited to this, and the robot arm control unit 500 may control on/off of the light emission of the light emitting unit 12.
  • the light emitting element applied to the light emitting unit 12 is not limited to an LED as long as it can achieve the purpose of being easily recognized by the operator 102 through the covering cloth 20 or drape.
  • the light emitting section 12 may have a dimming function.
  • the instruction recognizer 60 includes, for example, a voice recognizer, analyzes a voice signal based on the voice picked up by the microphone 61, and obtains instruction information indicating a voice instruction.
  • the instruction recognizer 60 can recognize, for example, an utterance by the surgeon 102 to control the operation of the robot arm 120, and can acquire instruction information for instructing the operation of the robot arm 120.
  • the instruction recognizer 60 may include a line-of-sight recognizer, recognize the line of sight (for example, the direction of the eyeball) based on the image captured by the camera 62, and acquire instruction information indicating an instruction based on the line of sight.
  • the instruction recognizer 60 can recognize, for example, the line of sight of the operator 102 to control the operation of the robot arm 120, and can acquire instruction information for controlling the robot arm 120.
  • the instruction recognizer 60 may have both a voice recognizer and a line of sight recognizer, or may have either one of them.
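A sketch of how the instruction recognizer 60 might turn either voice or line-of-sight input into instruction information for the robot arm 120; the command keywords, gaze thresholds, and return values below are hypothetical, chosen only to illustrate the dispatch:

```python
def recognize_instruction(voice_text=None, gaze_direction=None):
    """Sketch of the instruction recognizer 60: it may hold a voice
    recognizer, a line-of-sight recognizer, or both, and converts either
    input into instruction information. All keywords and thresholds
    here are assumptions, not part of the disclosure."""
    if voice_text is not None:
        commands = {"move up": "ARM_UP", "move down": "ARM_DOWN", "stop": "ARM_STOP"}
        for phrase, command in commands.items():
            if phrase in voice_text.lower():
                return command
    if gaze_direction is not None:
        # e.g. a markedly upward gaze nudges the endoscope view upward
        dx, dy = gaze_direction
        if dy > 0.5:
            return "ARM_UP"
        if dy < -0.5:
            return "ARM_DOWN"
    return None
```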
  • the robot arm 120 includes a joint section 121 and a drive control section 122.
  • the joint section 121 includes a joint information detection section 1210 and a joint section drive section 1211.
  • Drive control section 122 generates a drive control signal for driving joint section 121 based on command value information supplied from command value generation section 513, which will be described later.
  • the joint drive unit 1211 drives the joint 121 according to a drive control signal generated by the drive control unit 122.
  • the joint information detection section 1210 detects the state of the joint section 121 using a sensor or the like and acquires joint information. Joint information detection section 1210 passes the acquired joint information to state acquisition section 510 and command value generation section 513, which will be described later.
  • the robot arm 120 is shown to include one joint 121 for the sake of explanation, but in reality, the robot arm 120 includes a plurality of joints 121.
  • the robot arm control section 500 includes a state acquisition section 510, a calculation condition determination section 511, a force calculation section 512, and a command value generation section 513.
  • the state acquisition unit 510 acquires the switch signal output from the switch unit 13 of the physical interface 10 and the tilt information output from the tilt sensor 14. Further, the state acquisition section 510 acquires each piece of joint information output from the joint information detection section 1210 of each joint section 121. The state acquisition unit 510 passes the acquired switch signal, tilt information, and joint information to the calculation condition determination unit 511.
  • the calculation condition determination unit 511 acquires the switch signal, tilt information, and joint information passed from the state acquisition unit 510, and also acquires the instruction information output from the instruction recognizer 60. The calculation condition determination unit 511 determines how the robot arm 120 should behave based on the acquired information and signals, and passes information indicating the determined behavior of the robot arm 120 to the force calculation unit 512.
  • the force calculation unit 512 has a model regarding the movement of the robot arm 120, which has been learned, for example, by machine learning.
  • the force calculation unit 512 applies information indicating the behavior of the robot arm 120 passed from the calculation condition determination unit 511 to the model, and predicts the movement of the robot arm 120.
  • the force calculation section 512 passes robot motion information indicating the predicted motion of the robot arm 120 to the command value generation section 513.
  • the command value generation unit 513 is further provided with joint information from each joint 121 in the robot arm 120.
  • The command value generation unit 513 generates command values for instructing the driving of each joint 121 of the robot arm 120, based on the joint information passed from each joint 121 and the robot motion information passed from the force calculation unit 512.
  • the command value generation unit 513 passes each generated command value to the drive control unit 122.
  • the drive control section 122 generates each drive control signal for driving each joint section 121 according to each command value passed from the command value generation section 513.
  • the robot arm 120 executes a predetermined operation by driving each joint 121 according to each drive control signal generated by the drive control unit 122.
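The processing flow above can be sketched as a simple dataflow from state acquisition through command value generation. The following Python sketch is illustrative only: all function names, data shapes, and the placeholder motion model are assumptions, since the publication does not disclose concrete data formats or algorithms.

```python
# Hypothetical sketch of the control pipeline: 510 -> 511 -> 512 -> 513.

def state_acquisition(switch_signal, tilt_info, joint_info):
    """Section 510: bundles the raw inputs for the downstream stages."""
    return {"switch": switch_signal, "tilt": tilt_info, "joints": joint_info}

def determine_behavior(state, instruction):
    """Section 511: decides how the arm should behave from state + instruction."""
    if not state["switch"]:              # physical interface not operated
        return {"mode": "direct", "target": None}
    return {"mode": "autonomous", "target": instruction}

def predict_motion(behavior):
    """Section 512: stands in for the learned motion model (placeholder)."""
    if behavior["mode"] == "direct":
        return []                        # no autonomous motion commanded
    return [("joint_%d" % i, 0.1) for i in range(3)]  # assumed joint deltas

def generate_command_values(motion, joint_info):
    """Section 513: command value per joint = current angle + predicted delta."""
    return {name: joint_info.get(name, 0.0) + delta for name, delta in motion}

# One pass through the pipeline:
state = state_acquisition(True, 0.0, {"joint_0": 0.5})
behavior = determine_behavior(state, "move_to_target")
commands = generate_command_values(predict_motion(behavior), state["joints"])
```

Each returned dictionary of command values would then be handed to the drive control section 122, which converts it into per-joint drive control signals.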
  • As described above, the embodiment includes the physical interface 10 and the robot arm control unit 500, which controls the operation of the robot arm 120 according to the output of the physical interface 10. In other words, a control device is configured that controls the operation of the medical robot according to the operation on the physical interface 10.
  • FIG. 8 is a schematic diagram showing an example of arrangement when one physical interface 10 according to the embodiment is arranged.
  • Section (a) of FIG. 8 is a top view of a patient 101 lying on a surgical bed 100 and an operator 102 standing next to the surgical bed 100 for a surgical procedure.
  • section (b) in the same figure is an overhead view of the state of section (a) in the same figure from diagonally above and behind the operator 102.
  • When one physical interface 10 is arranged, the physical interface 10 is placed on the bed rail 110 of the surgical bed 100 at a position corresponding to the standing position of the surgeon 102.
  • When the operator 102 performs surgery on the patient 101 lying on the surgical bed 100, the operator 102 operates the physical interface 10 with the waist or a body part below the waist. The physical interface 10 is therefore placed at a position corresponding to the body part used for the operation. In this example, the physical interface 10 is placed at a position corresponding to the thigh or knee of the left leg of the operator 102. Since the physical interface 10 is detachably attached to the bed rail 110, its position can be easily adjusted.
  • FIG. 9 is a schematic diagram showing an arrangement example when a plurality of physical interfaces 10 are arranged.
  • In this example, two physical interfaces 10a and 10b are used: the physical interface 10a is operated by the surgeon 102, and the physical interface 10b is operated by the assistant 103, who assists the surgeon 102 in the surgery.
  • the robot arm system 50 performs the same control on each output of these physical interfaces 10a and 10b.
  • Section (a) of FIG. 9 is a top view of a patient 101 lying on a surgical bed 100 and an operator 102 standing next to the surgical bed 100 for a surgical procedure.
  • the assistant 103 stands in a position facing the surgeon 102 with the surgical bed 100 in between.
  • Section (b) in the same figure is an overhead view of the state of section (a) in the same figure from diagonally above and behind the assistant 103.
  • The arrangement position of the physical interface 10a operated by the surgeon 102 is the same as the position described with reference to FIG. 8. The physical interface 10b operated by the assistant 103 is placed in basically the same positional relationship to the assistant 103 as the physical interface 10a is to the surgeon 102. Specifically, the physical interface 10b may be placed on the bed rail 110 of the surgical bed 100 at a position corresponding to the standing position of the assistant 103.
  • The first application example is an example in which control of the robot arm 120 according to an instruction recognized by the instruction recognizer 60 is enabled or disabled by operating the physical interface 10 according to the embodiment.
  • the instruction recognizer 60 acquires instruction information corresponding to the sound picked up by the microphone 61.
  • FIG. 10 is a schematic diagram showing an example of the operation mode transition of the robot arm 120 according to the first application example of the embodiment.
  • In FIG. 10, the voice recognition state 200 includes an autonomous control mode 211, which instructs autonomous control of the robot arm 120, and a UI operation mode 212, in which the robot arm 120 is operated manually using, for example, a user interface (UI) provided by the input device 5047.
  • In response to the physical interface (IF) operation 220, the calculation condition determination unit 511 determines, based on the switch signal output from the physical interface 10 and passed from the state acquisition unit 510, whether control of the robot arm 120 according to the voice recognition function of the instruction recognizer 60 is enabled or disabled.
  • If the calculation condition determining unit 511 determines that the control is enabled, it sets the voice recognition state 200 to the voice reception start state 210, in which reception of voice input is started. In the voice reception start state 210, the calculation condition determining unit 511 switches the operation mode of the robot arm 120 between the autonomous control mode 211 and the UI operation mode 212 according to the voice recognized by the instruction recognizer 60.
  • If the calculation condition determination unit 511 determines that the control is disabled, it stops the control and transitions the operation mode of the robot arm 120 to the arm direct operation mode 230 in accordance with an operation on the arm direct operation button 231. The arm direct operation button 231 is, for example, an operator control provided on the robot arm 120 for operating the robot arm 120 without control by the arm control device 5045a.
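The mode transitions of FIG. 10 can be summarized as a small state machine. The Python sketch below uses hypothetical state names derived from the reference numerals; the transition conditions and command strings are simplified assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical state machine for the first application example (FIG. 10).

class ArmModeController:
    def __init__(self):
        self.voice_enabled = False       # voice-recognition control disabled
        self.mode = "arm_direct_230"     # arm direct operation mode 230

    def physical_if_operation(self, enable):
        """Operation 220: enable/disable control by the voice recognition
        function according to the switch signal of the physical interface."""
        self.voice_enabled = enable
        if enable:
            self.mode = "autonomous_211"  # voice reception start state 210
        else:
            self.mode = "arm_direct_230"

    def voice_command(self, text):
        """Switch between autonomous control 211 and UI operation 212
        according to the recognized voice (commands are placeholders)."""
        if not self.voice_enabled:
            return self.mode             # recognized voice is ignored
        if text == "autonomous":
            self.mode = "autonomous_211"
        elif text == "ui":
            self.mode = "ui_operation_212"
        return self.mode
```

A voice command only takes effect while the physical interface has enabled voice-recognition control, which mirrors the enabling/disabling behavior described above.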
  • Conventionally, enabling and disabling of control of the robot arm 120 according to the voice recognition function is switched by a predetermined activation word uttered by the operator 102 or the like (for example, the utterance "start voice"). In the first application example, the utterance of the activation word is replaced by an operation on the physical interface 10, so the activation word no longer needs to be spoken and operability can be improved. In other words, according to the first application example of the embodiment, the success rate of activation of voice control of the robot arm 120 can be made 100%.
  • The calculation condition determining unit 511 may treat the operation on the physical interface 10 as a momentary type, in which the state is on only while the operation (for example, pressing) is being performed, or as an alternate type, in which the state toggles between on and off each time the operation is performed.
  • In the momentary case, the calculation condition determining unit 511 enables control according to the voice recognition function of the instruction recognizer 60 while the operator 102 is pressing the switch unit 13 of the physical interface 10, and controls the voice recognition state 200 to the voice reception start state 210. When the switch unit 13 of the physical interface 10 is not pressed, the calculation condition determining unit 511 disables and stops the control according to the voice recognition function, and transitions the operation mode of the robot arm 120 to the arm direct operation mode 230.
  • In the alternate case, the calculation condition determination unit 511 switches between enabling and disabling the control according to the voice recognition function each time the surgeon 102 presses the switch unit 13 of the physical interface 10.
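The distinction between momentary and alternate operation can be expressed compactly. The following sketch is hypothetical (the publication does not specify a signal format); it derives the enabled/disabled state after each press/release event of the switch unit.

```python
def interpret_switch(events, mode):
    """Return the enabled/disabled state after each event.

    events: sequence of "press" / "release" strings
    mode:   "momentary" -> on only while pressed
            "alternate" -> toggles on each press
    """
    enabled, states = False, []
    for ev in events:
        if mode == "momentary":
            enabled = (ev == "press")          # on only during the press
        elif mode == "alternate" and ev == "press":
            enabled = not enabled              # toggle on every press
        states.append(enabled)
    return states
```

For the same press/release sequence, the momentary interpretation drops back to disabled on release, while the alternate interpretation holds its state until the next press.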
  • The second application example is an example in which, by operating the physical interface 10 according to the embodiment, an instruction recognized by the instruction recognizer 60 is ignored and the robot arm 120 can be operated manually. As in the first application example described above, the instruction recognizer 60 is described as acquiring instruction information according to the sound picked up by the microphone 61.
  • FIG. 11 is a schematic diagram showing an example of the operation mode transition of the robot arm 120 according to the second application example of the embodiment.
  • In the second application example, once the calculation condition determining unit 511 puts the voice recognition state 200 shown in FIG. 11 into the voice reception start state 210, it maintains the voice reception start state 210 at all times, regardless of whether the physical interface 10 is operated. The calculation condition determining unit 511 cancels the voice reception start state 210 in response to a predetermined end word 240b (for example, the utterance "end voice"), stops the control according to the voice recognition function, and transitions the operation mode of the robot arm 120 to the arm direct operation mode 230.
  • a stop button 250 is provided as a hardware (HW) button.
  • the stop button 250 is an operator for stopping the operation of the robot arm 120 and transitioning the operation mode to the arm direct operation mode 230 in accordance with the operation.
  • This stop button 250 functions as risk management for the operation of the robot arm 120 in the case where a problem occurs in voice recognition.
  • In the second application example, the physical interface 10 is made to function as this stop button 250. For example, suppose the instruction recognizer 60 fails to recognize the operator 102's voice instruction to "stop the robot," so that the robot arm 120 does not stop in response to the voice. Even in this case, the operator 102 can stop the operation of the robot arm 120 by operating the physical interface 10. Therefore, according to the second application example of the embodiment, the voice recognition function can be kept enabled at all times with risk management in place, and operability can be improved.
  • Furthermore, when the operator 102 wants to stop the operation of the robot arm 120, the delay of stop control based on voice recognition can be a concern: depending on the situation, the instruction recognizer 60 may require a longer time than usual for the voice recognition processing. In contrast, the switch signal output from the physical interface 10 is a signal that simply indicates on and off. Therefore, according to the second application example of the embodiment, the process of stopping the robot arm 120 can be executed faster than when the voice recognition function is used.
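The latency argument can be illustrated schematically: a plain on/off switch signal needs only a single check, whereas a voice command must first pass through recognition. In the sketch below the step counts are arbitrary placeholders, not measured values, and the channel names are assumptions.

```python
def stop_latency(channel, recognition_steps=5):
    """Return the number of processing steps before a stop takes effect.

    "physical_if": the on/off switch signal is evaluated immediately.
    "voice":       recognition must complete before the stop is issued.
    The step counts are illustrative placeholders only.
    """
    if channel == "physical_if":
        return 1                       # one check of the switch signal
    if channel == "voice":
        return 1 + recognition_steps   # recognition runs before the stop
    raise ValueError("unknown channel: %s" % channel)
```

Whatever the actual recognition time, the physical-interface path never waits on it, which is the point made above.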
  • the third application example is an example in which the physical interface 10 according to the embodiment is used as an emergency stop button to emergency stop the robot arm 120.
  • the robot arm 120 is generally provided with an emergency stop button on its main body to completely stop its operation.
  • FIG. 12 is a schematic diagram showing an example of the emergency stop button 123 provided on the robot arm 120.
  • The emergency stop button 123 is provided directly on the robot arm 120. For example, when the operator 102 detects an abnormality in the operation of the robot arm 120, the operator 102 can stop the operation of the robot arm 120 with an extremely small delay by operating the emergency stop button 123.
  • Section (b) of FIG. 12 schematically shows an example in which the emergency stop button 123 is placed at a position far from the operator 102.
  • the robot arm 120 is covered with a vinyl cover 21, and the emergency stop button 123 is provided at the middle portion of the robot arm 120.
  • In order to operate the emergency stop button 123, the operator 102 needs to extend the left hand 102hL over the patient 101 from the operator's position. In this case, the operator 102 must interrupt the operation of the surgical instrument to operate the emergency stop button 123, and the vinyl cover 21 may get in the way. Therefore, a delay may occur between the time it becomes necessary to stop the operation of the robot arm 120 and the time the emergency stop button 123 is actually operated. Further, in some cases, an assistant or staff member other than the surgeon 102 may be forced to operate the emergency stop button 123.
  • In contrast, by using the physical interface 10 as an emergency stop button, the operator 102 can stop the operation of the robot arm 120 without interrupting the operation of the surgical instrument, even when both hands are occupied with that operation. That is, in the third application example of the embodiment, the surgeon 102 can stop the operation of the robot arm 120 hands-free.
  • FIG. 13 is a schematic diagram showing an example of the arrangement of physical interfaces according to the first modification of the embodiment.
  • a physical interface 10c according to a first modification of the embodiment is placed, for example, at the feet of a surgeon 102. It is preferable that the physical interface 10c be placed in a position where it does not interfere with the foot switch 5057 or the like.
  • The physical interface 10c has a configuration that detects an operation using light. For example, whether the foot or the like of the operator 102 has been inserted into the physical interface 10c may be determined based on distance measurement using the reflection of light, or based on a detection result indicating whether the light is blocked.
  • By using the physical interface 10c, which detects operations using light, the physical interface 10c can be operated without contact and with an operation method different from that of the foot switch 5057. Furthermore, by installing the physical interface 10c at the feet of the surgeon 102 or the like, the physical interface 10c can be operated even when both hands of the surgeon 102 are occupied with operating the surgical instrument. That is, in the first modification of the embodiment, the surgeon 102 can operate the physical interface 10c hands-free to control the operation of the robot arm 120.
  • FIG. 14 is a schematic diagram showing an example of a physical interface 10c-1 according to a first example of a first modification of the embodiment.
  • The physical interface 10c-1 has, for example, a rectangular parallelepiped structure in which one of the surfaces is an opening. A distance measuring device 16 is provided on a surface 15, which is, for example, the top surface at the opening of the rectangular parallelepiped.
  • the distance measuring device 16 includes, for example, a light emitting section and a light receiving section, and the light emitting section irradiates the inside of the opening with light.
  • The distance measuring device 16 measures the distance from the device to an object (not shown) based on the timing at which the emitted light 160 leaves the light emitting section and the timing at which the reflected light 161 from the object is received by the light receiving section. Based on this distance measurement result, the distance measuring device 16 determines whether an operation has been performed on the physical interface 10c-1. The determination result is transmitted to the arm control device 5045a via the cable 11.
  • For example, the distance measured when nothing is inserted into the opening of the physical interface 10c-1 is set as the initial value. If distance measurement is performed with, for example, the foot (toe) of the operator 102 inserted into the opening, the emitted light 160 is reflected by the inserted foot, so a distance measurement result shorter than the initial value is obtained.
  • the state acquisition unit 510 can detect an operation on the physical interface 10c-1 based on the distance measurement result.
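The time-of-flight measurement and threshold comparison described above can be sketched as follows. The speed-of-light constant is standard; the margin value and the example timings are assumptions chosen for illustration, not disclosed parameters.

```python
SPEED_OF_LIGHT = 3.0e8  # m/s (approximate)

def tof_distance(t_emit, t_receive):
    """Round-trip time of flight converted to a one-way distance in metres."""
    return (t_receive - t_emit) * SPEED_OF_LIGHT / 2.0

def operated(measured_distance, initial_distance, margin=0.02):
    """An operation is detected when the reflection returns from noticeably
    closer than the empty-opening baseline (margin value is an assumption)."""
    return measured_distance < initial_distance - margin

# Example timings (assumed): empty opening vs. an inserted foot.
baseline = tof_distance(0.0, 2.0e-9)   # about 0.30 m with nothing inserted
foot = tof_distance(0.0, 0.67e-9)      # about 0.10 m with a foot inserted
```

Setting the baseline once with the opening empty, as the text describes, lets a single comparison decide whether the interface has been operated.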
  • In the above description, the physical interface 10c-1 is shown as having a rectangular parallelepiped shape, but it is not limited to this shape.
  • Also, in the above description, the operation on the physical interface 10c-1 is detected by distance measurement using light, but the detection method is not limited to this example.
  • a light emitting section is provided on the surface 15, and a light receiving section that receives light emitted from the light emitting section is provided on a surface opposite to the surface 15 at a position corresponding to the light emitting section.
  • the status acquisition unit 510 may detect an operation on the physical interface 10c-1 based on whether the light emitted from the light emitting unit is received by the light receiving unit.
  • FIG. 15 is a schematic diagram showing an example of a physical interface 10c-2 according to a second example of the first modification of the embodiment.
  • The physical interface 10c-2 has an opening that is wider in the horizontal direction than that of the physical interface 10c-1 shown in FIG. 14, and has a structure in which the opening side of the side walls laterally adjacent to the opening is notched. Further, in the example of FIG. 15, the physical interface 10c-2 is provided, on its top surface 17, with a plurality of distance measuring devices 16a to 16c, each including a light emitting section and a light receiving section.
  • Each of the distance measuring devices 16a to 16c determines whether an operation has been performed on the physical interface 10c-2 based on its distance measurement result, similarly to the first example of the first modification of the embodiment described above.
  • The determination results from the distance measuring devices 16a to 16c are combined by a logical OR (logical sum), for example, and transmitted to the arm control device 5045a via the cable 11.
  • the surgeon 102 can cause the physical interface 10c-2 to detect the operation by sliding the foot (tip of the foot) in the lateral direction.
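The OR-combination of the determinations of the plural distance measuring devices 16a to 16c might look like the following sketch, with the same assumed threshold logic as in the single-sensor case:

```python
def combined_determination(distances, baselines, margin=0.02):
    """OR-combine per-sensor determinations (threshold logic assumed):
    an operation anywhere along the widened opening is reported if any
    sensor measures noticeably less than its empty-opening baseline."""
    return any(d < b - margin for d, b in zip(distances, baselines))
```

Because the result is an OR, sliding the toe laterally past any one of the sensors is enough for the operation to be detected.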
  • FIG. 16 is a schematic diagram showing an example of a physical interface 10c-3 according to a third example of the first modification of the embodiment.
  • the physical interface 10c-3 includes a light emitting device 17-1 that emits a plurality of light beams 162, and a light receiving device 17-2 that receives each light beam 162 emitted from the light emitting device 17-1. It has a light curtain structure.
  • the light emitting device 17-1 and the light receiving device 17-2 are installed on the floor surface corresponding to the side surface of the surgical bed 100. At this time, it is preferable to make the distance between the light emitting device 17-1 and the light receiving device 17-2 somewhat wide.
  • When any of the light beams 162 is blocked, the light receiving device 17-2 may determine that a light-blocking object (for example, the tip of a foot of the operator 102) is present.
  • the determination result of the light receiving device 17-2 is transmitted to the arm control device 5045a via the cable 11.
  • the state acquisition unit 510 can detect an operation on the physical interface 10c-3 based on this determination result.
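The light-curtain determination reduces to checking whether any beam fails to reach the light receiving device 17-2. A minimal sketch, with the per-beam reception states assumed to be available as booleans:

```python
def curtain_blocked(received):
    """received[i] is True when beam i reaches the light receiving device
    17-2; any missing beam indicates a light-blocking object (e.g. a toe)."""
    return not all(received)

def blocked_beams(received):
    """Indices of the interrupted beams, e.g. to estimate the foot position."""
    return [i for i, ok in enumerate(received) if not ok]
```

The second helper is an extra illustration: the publication only requires the presence/absence determination, but the beam indices would localize where the curtain was broken.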
  • FIG. 17 is a schematic diagram showing an example of the arrangement of physical interfaces according to the second modification of the embodiment.
  • a physical interface 10d according to a second modification of the embodiment is placed, for example, on the floor at the feet of the surgeon 102. It is preferable that the physical interface 10d be placed in a position where it does not interfere with the foot switch 5057 or the like.
  • the state acquisition unit 510 can detect an operation on the physical interface 10d, for example, based on a detection result that the surgeon 102 has put his weight on the physical interface 10d.
  • By installing the physical interface 10d, which detects operations using a pressure-sensitive sensor, at the feet of the surgeon 102, the physical interface 10d can be operated even when both hands of the surgeon 102 are occupied with operating the surgical instrument. That is, by applying the second modification of the embodiment, the surgeon 102 can operate the physical interface 10d hands-free to control the operation of the robot arm 120.
  • the physical interface 10d can be configured to have a certain area, making it easy to operate.
  • FIG. 18 is a schematic diagram showing an example of the physical interface 10d-1 according to the first example of the second modification of the embodiment.
  • the physical interface 10d-1 shown in FIG. 18 is sized so that its pressure sensitive range is limited to one person's operation.
  • the physical interface 10d-1 is preferably placed, for example, at the feet of the surgeon 102 when performing surgery.
  • the physical interface 10d-1 detects, for example, that the surgeon 102 has applied his/her weight.
  • the detection result of the physical interface 10d-1 is transmitted to the status acquisition unit 510 via the cable 11.
  • the status acquisition unit 510 can detect an operation on the physical interface 10d-1 based on the detection result sent from the physical interface 10d-1.
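Detection based on a change in the sensed pressure, as described above, can be sketched as a simple rising-edge check over consecutive samples. The threshold value and sample units are assumptions; the publication does not specify them.

```python
def detect_operation(samples, delta_threshold=150.0):
    """Report an operation when the pressure rises by more than
    delta_threshold between consecutive samples (threshold assumed;
    units could be raw sensor counts or newtons)."""
    events = []
    for prev, cur in zip(samples, samples[1:]):
        events.append(cur - prev > delta_threshold)
    return events
```

Comparing consecutive samples rather than absolute values makes the detection insensitive to a constant offset, such as equipment resting on the mat.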
  • FIG. 19 is a schematic diagram showing an example of a physical interface 10d-2 according to a second example of a second modification of the embodiment.
  • the physical interface 10d-2 shown in FIG. 19 has a larger pressure sensitive range than the example shown in FIG. 18, and is sized to be operable by multiple people. It is preferable that the physical interface 10d-2 be placed, for example, over the entire area on one or both sides of the surgical bed 100 while avoiding interference with other equipment.
  • Note that the present technology can also have the following configurations.
  • (1) A control device including a physical interface configured so that an operator can operate it with a body part other than his or her own hands, the control device controlling the operation of a medical robot according to the operator's operation on the physical interface.
  • (2) The control device according to (1) above, wherein the physical interface is configured such that the operator can operate it with his or her own waist or a body part below the waist.
  • (3) The control device according to (1) or (2) above, wherein the physical interface is an operation interface for operating a robot arm, as the medical robot, that assists the operator in surgery.
  • (4) The control device according to any one of (1) to (3) above, wherein the physical interface includes at least one operation unit that can be operated by the operator.
  • (5) The control device according to any one of (1) to (4) above, wherein the physical interface is configured to be attachable to a rail of a surgical bed.
  • (6) The physical interface includes a tilt detection unit that detects a tilt of the surgical bed.
  • (7) The physical interface includes a lighting unit that emits light to the outside.
  • (8) The control device according to any one of (1) to (7) above, wherein a plurality of the physical interfaces are installed for one medical robot.
  • (9) The control device according to (8) above, wherein each of the plurality of physical interfaces controls the same operation with respect to the one medical robot according to the operation.
  • (10) The physical interface is arranged at a position corresponding to the operator's feet, and detects the operation by the operator's feet using light.
  • (11) The control device according to (1) above, wherein the physical interface includes a pressure-sensitive sensor that detects pressure and is arranged at a position corresponding to the operator's feet, and the physical interface detects the operation by the operator's feet based on a change in the pressure detected by the pressure-sensitive sensor.
  • (12) A medical robot including: a physical interface configured such that an operator can operate it with a body part other than his or her hands; a robot arm that assists the operator in surgery; and a control unit that controls the operation of the robot arm according to the operator's operation on the physical interface.
  • (13) The medical robot according to (12) above, wherein the control unit controls the operation of the robot arm according to a result of voice recognition, and switches between enabling and disabling the control of the operation of the robot arm according to the result of the voice recognition, in accordance with the operator's operation on the physical interface.
  • (14) The medical robot according to (13) above, wherein the control unit switches between enabling and disabling the voice recognition by the voice recognition unit according to the operator's operation on the physical interface.
  • (15) The medical robot according to (13) above, wherein the control unit stops the operation of the robot arm, while the voice recognition by the voice recognition unit remains enabled, in response to the operator's operation on the physical interface.
  • (16) The medical robot according to any one of (12) to (15) above, wherein the control unit makes an emergency stop of the operation of the robot arm in response to the operator's operation on the physical interface.
  • 50 Robot arm system; 60 Instruction recognizer; 61 Microphone; 100 Surgical bed; 101 Patient; 102 Operator; 103 Assistant; 110 Bed rail; 120 Robot arm; 121 Joint part; 122 Drive control part; 123 Emergency stop button; 160 Emitted light; 161 Reflected light; 162 Light beam; 200 Voice recognition state; 210 Voice reception start state; 211 Autonomous control mode; 212 UI operation mode; 220 Physical interface operation; 230 Arm direct operation mode; 231 Arm direct operation button; 250 Stop button; 500 Robot arm control unit; 510 Status acquisition unit; 511 Calculation condition determination unit; 512 Force calculation unit; 513 Command value generation unit; 1210 Joint information detection unit; 1211 Joint drive unit; 5003 Lens barrel; 5045, 5045a Arm control device; 5057 Foot switch

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

The control device according to the present invention is provided with a physical interface (10) configured to be operable by a practitioner with a body part other than the practitioner's hand, and the operation of a medical robot is controlled according to the practitioner's operation of the physical interface. The medical robot according to the present invention includes: a physical interface configured to be operable by a practitioner with a body part other than the practitioner's hand; a robot arm (120) that assists surgery performed by the practitioner; and a control unit (500) that controls the operation of the robot arm according to the practitioner's operation of the physical interface.
PCT/JP2023/005129 2022-03-10 2023-02-15 Dispositif de commande et robot médical WO2023171263A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022037472 2022-03-10
JP2022-037472 2022-03-10

Publications (1)

Publication Number Publication Date
WO2023171263A1 true WO2023171263A1 (fr) 2023-09-14

Family

ID=87936774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/005129 WO2023171263A1 (fr) 2022-03-10 2023-02-15 Dispositif de commande et robot médical

Country Status (1)

Country Link
WO (1) WO2023171263A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018159155A1 (fr) * 2017-02-28 2018-09-07 ソニー株式会社 Système d'observation médicale, dispositif de commande et procédé de commande
JP2019134914A (ja) * 2018-02-05 2019-08-15 ミーレ カンパニー インク. 手術用ロボットのマスターコンソール
JP2021502195A (ja) * 2017-11-09 2021-01-28 クアンタム サージカル 軟組織に対する低侵襲医療介入のためのロボット機器
JP2021019949A (ja) * 2019-07-29 2021-02-18 株式会社メディカロイド 手術システム


Similar Documents

Publication Publication Date Title
US9517109B2 (en) Medical system
JP7414770B2 (ja) 医療用アーム装置、医療用アーム装置の作動方法、及び情報処理装置
US11033338B2 (en) Medical information processing apparatus, information processing method, and medical information processing system
US20190022857A1 (en) Control apparatus and control method
KR101772958B1 (ko) 최소 침습 원격조종 수술 기구를 위한 환자측 의사 인터페이스
JP7015256B2 (ja) コンピュータ支援遠隔操作システムにおける補助器具制御
KR20220028139A (ko) 입체 뷰어를 위한 눈 시선 추적을 통합하는 의료 디바이스, 시스템, 및 방법
EP2988696A1 (fr) Champ de visualisation d'entrée de commande d'équipement chirurgical
KR20140112207A (ko) 증강현실 영상 표시 시스템 및 이를 포함하는 수술 로봇 시스템
US20160175057A1 (en) Assistance device for imaging support of a surgeon during a surgical operation
KR20140139840A (ko) 디스플레이 장치 및 그 제어방법
JP4027876B2 (ja) 体腔内観察システム
JP2020048706A (ja) 手術システムおよび表示方法
US20200015655A1 (en) Medical observation apparatus and observation visual field correction method
JP2018075218A (ja) 医療用支持アーム及び医療用システム
US20210251717A1 (en) Extended reality headset opacity filter for navigated surgery
US11348684B2 (en) Surgical support system, information processing method, and information processing apparatus
WO2020262262A1 (fr) Système d'observation médicale, dispositif de commande et procédé de commande
JP6902639B2 (ja) 手術システム
WO2023171263A1 (fr) Dispositif de commande et robot médical
JP3499946B2 (ja) 画像診断装置
JP2021062216A (ja) 手術システムおよび表示方法
JP2001238205A (ja) 内視鏡システム
WO2023176133A1 (fr) Dispositif de support d'endoscope, système de chirurgie endoscopique et procédé de commande
JP7128326B2 (ja) 手術システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23766443

Country of ref document: EP

Kind code of ref document: A1