WO2019080317A1 - Surgical navigation positioning robot and control method therefor - Google Patents
Surgical navigation positioning robot and control method therefor
- Publication number
- WO2019080317A1 (PCT application PCT/CN2017/116668)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- organ
- robot
- display
- surgical navigation
- target tissue
- Prior art date
- 2017-10-28
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION (common hierarchy for all entries below)
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B34/30—Surgical robots
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
Definitions
- The present invention relates to the technical field of medical devices, and in particular to a surgical navigation positioning robot and a control method therefor.
- Surgery refers to the process of using medical devices to cut, incise, or otherwise manipulate a patient's skin, mucous membranes, or other tissue in order to treat a pathological condition.
- Surgical procedures such as laparotomy, in which the skin is cut open and internal organs are treated, reconstructed, or removed, may involve problems of blood loss, side effects, pain, and scarring, so the use of surgical robots is currently regarded as a popular alternative.
- At present, surgical robots can neither automatically identify the lesion area of a tissue or organ nor automatically guide the doctor to the lesion area, and therefore cannot ensure the accuracy and safety of the operation.
- When a surgical robot is used, the doctor can operate inside the patient's body while viewing the surgical image shown on a display.
- However, because the position of the display is generally fixed, it cannot be moved to suit the doctor's posture and position, which makes it inconvenient for the doctor to view the surgical image on the display during the operation.
- The main object of the present invention is to provide a surgical navigation positioning robot and a control method therefor, which aim to solve the problems that existing surgical robots cannot automatically recognize the lesion area of a tissue or organ and that the surgical image acquired by the surgical robot cannot move freely according to the doctor's posture and position.
- To achieve the above object, the present invention provides a surgical navigation positioning robot including a first mechanical arm, a second mechanical arm, and a robot body, wherein an ultrasonic probe is provided on the first mechanical arm and a 3D display is provided on the second mechanical arm.
- An infrared positioner is provided on the outer surface of the ultrasonic probe, an operation handle is provided on the outer surface of the robot body, and the robot body houses a microcontroller adapted to execute program instructions and a memory adapted to store those instructions. The program instructions are loaded by the microcontroller and perform the following steps: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery; comparing the ultrasound image of the target tissue and organ with a reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ; establishing a spatial coordinate system that takes the operating table on which the patient lies flat as the horizontal plane and uses the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion area based on that coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion area and controlling the infrared positioner in the ultrasonic probe to generate an infrared guide light point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area; when an operation instruction input from the operation handle is received, generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and displaying the ultrasound image of the target tissue and organ on the 3D display.
- The step of comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area comprises the steps of: reading a reference image of the normal tissue and organ from the memory; comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to determine the texture distribution difference between the two; and locating the lesion area of the target tissue and organ according to that difference.
- The step of generating, according to the operation instruction, a drive signal that drives the 3D display on the second robot arm to move in front of the doctor's eyes comprises the steps of: calculating, according to the operation instruction, the final position to which the 3D display is to be moved; calculating, from that final position, the angle through which the second robot arm is to be driven; and generating and transmitting a drive signal that drives the second robot arm through the calculated angle.
- The ultrasonic probe, the infrared positioner, the operation handle, the 3D display, and the memory are all electrically connected to the microcontroller.
- The robot body further includes a power supply device comprising a rechargeable lithium battery and a charging base; the lithium battery is electrically connected to the microcontroller, and the charging base is electrically connected to the lithium battery.
- The present invention also provides a control method for a surgical navigation positioning robot that includes a first mechanical arm, a second mechanical arm, and a robot body, wherein an ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared positioner is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body. The control method includes the steps of: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery; comparing the ultrasound image of the target tissue and organ with a reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ; establishing a spatial coordinate system that takes the operating table on which the patient lies flat as the horizontal plane and uses the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion area based on that coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion area and controlling the infrared positioner in the ultrasonic probe to generate an infrared guide light point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area; when an operation instruction input from the operation handle is received, generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and displaying the ultrasound image of the target tissue and organ on the 3D display.
- The step of comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area comprises the steps of: reading a reference image of the normal tissue and organ from the memory; comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to determine the texture distribution difference between the two; and locating the lesion area of the target tissue and organ according to the texture distribution difference.
- The texture distribution difference includes the differences in tissue structure, size, and outline contour that arise when a human tissue or organ becomes diseased.
- The step of generating, according to the operation instruction, a drive signal that drives the 3D display on the second robot arm to move in front of the doctor's eyes comprises the steps of: calculating, according to the operation instruction, the final position to which the 3D display is to be moved; calculating, from that final position, the angle through which the second robot arm is to be driven; and generating and transmitting a drive signal that drives the second robot arm through the calculated angle.
- The surgical navigation instruction includes distance and direction information between the first mechanical arm and the lesion area of the target tissue and organ.
- The infrared guide light point is a visible infrared dot that guides the doctor to the location of the tissue or organ lesion during the patient's surgery.
- With the above technical solution, the surgical navigation positioning robot and its control method achieve the following technical effects: the ultrasonic probe 11 provided on the first mechanical arm acquires ultrasound images of the target tissue and organ during surgery, and the infrared positioner tracks and locates the lesion area in real time, so that the position of the lesion area is clearly visible and the robot automatically navigates to it to guide the doctor's operation, which facilitates the operation and improves surgical efficiency.
- In addition, the 3D display mounted on the second robot arm can be moved freely to the position the doctor requires, and the ultrasound image is displayed on the 3D display as a surgical reference for the doctor during the operation, improving the accuracy and safety of the surgery.
- FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the surgical navigation positioning robot of the present invention;
- FIG. 2 is a schematic diagram of the internal circuit connections of the surgical navigation positioning robot of the present invention.
- FIG. 3 is a flowchart of a preferred embodiment of the control method of the surgical navigation positioning robot of the present invention.
- Referring to FIG. 1, FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the surgical navigation positioning robot of the present invention.
- The surgical navigation positioning robot 01 can be placed in an operating room, in which an operating table 02 is also placed for the patient to lie flat on during surgery.
- The medical robot 01 includes, but is not limited to, a first robot arm 1, a second robot arm 2, and a robot body 3.
- The robot body 3 houses a microcontroller 30, a memory 31, and a charging device 32.
- The outer surface of the robot body 3 is further provided with an operation handle 33.
- The first robot arm 1 is provided with an ultrasonic probe 11 electrically connected to the microcontroller 30 for capturing, in real time, ultrasound images of the patient's tissues and organs during surgery; the ultrasound image is a 3D image of the tissue or organ.
- The outer surface of the ultrasonic probe 11 is provided with an infrared positioner 12 for generating an infrared guide light point that guides the doctor to the location of the tissue or organ lesion during the patient's surgery.
- The second robot arm 2 is provided with a 3D display 21, which is electrically connected to the microcontroller 30.
- In the embodiment of the present invention, the 3D display 21 is mounted on the second robot arm 2 so that it can move freely with the arm.
- The 3D display 21 is a 3D display system capable of giving the doctor a stereoscopic virtual sensation; it presents stereoscopic 3D images to the doctor and thereby provides a 3D display effect.
- The 3D display 21 can be implemented as a small, lightweight 3D display module.
- The 3D display 21 can be coupled to the second robot arm 2, which has a certain freedom of movement, so that the doctor can move the 3D display 21 as desired.
- For example, the 3D display 21 can be moved to suit the doctor's posture as needed, whether the doctor sits or stands during robotic surgery.
- The second robot arm 2 can move the 3D display 21 in front of the doctor's eyes, providing the most convenient position for the doctor to view the images on the 3D display 21.
- The microcontroller 30 of this embodiment can drive the first robot arm 1 and the second robot arm 2 to move freely at certain angles and in certain directions.
- The microcontroller 30 may be a microprocessor, a microcontroller unit (MCU), or the like embedded in the robot body 3.
- The microcontroller 30 can receive manual input commands from the doctor to determine the final position to which the 3D display 21 is to be moved. Once the final position is determined, the microcontroller 30 can calculate the angle through which the second robot arm 2 is to be driven, and then generate and transmit a drive signal that drives the second robot arm 2 through the calculated angle.
- Because the 3D display 21 is coupled to the second robot arm 2, the 3D display 21 can be moved to the position the doctor requires, so the doctor can adopt any posture during surgery and view the 3D ultrasound images and lesion-area images needed during the operation.
- Therefore, during robot-assisted surgery, not only the doctor but also an assistant or nurse can freely adjust the position of the 3D display 21 to view the 3D images and help the doctor complete the surgery accurately.
- The microcontroller 30 of this embodiment can transmit the drive signal input through the operation handle 33 to the second robot arm 2, thereby moving the 3D display 21 in front of the doctor's eyes.
- When the doctor operates the operation handle 33 to bring the 3D display 21 to eye level, the microcontroller 30 can drive the second robot arm 2 according to the input from the operation handle 33 and place the 3D display 21 in front of the doctor's eyes.
- FIG. 2 is a schematic diagram showing the internal circuit connection of the surgical navigation positioning robot of the present invention.
- The ultrasonic probe 11, the infrared positioner 12, the 3D display 21, the memory 31, and the power supply device 32 are all electrically connected to the microcontroller 30.
- Electrical connection in this embodiment means that each electrical component is connected to the microcontroller 30 through one or more of a conductive line, a signal line, and a control line, so that the microcontroller 30 can control each of the above components to perform its corresponding function.
- The microcontroller 30 can be a central processing unit (CPU), a microprocessor, a microcontroller unit chip (MCU), a data processing chip, or another control unit with data processing capability.
- The memory 31 can be a read-only memory (ROM), an electrically erasable programmable memory (EEPROM), or a flash memory (FLASH).
- The memory 31 stores reference images of normal human tissues and organs as well as pre-programmed computer program instructions.
- The microcontroller 30 can read and load the computer program instructions from the memory 31 so that the surgical navigation positioning robot 01 can provide surgical guidance during the patient's operation.
- The power supply device 32 includes a rechargeable lithium battery 321 and a charging base 322; the lithium battery 321 is electrically connected to the microcontroller 30 and provides operating power for the robot 01.
- The charging base 322 is electrically connected to the lithium battery 321 and accepts an external power source to charge the lithium battery 321.
- The present invention also provides a control method, based on a 3D display function, for a surgical navigation positioning robot, applied to the surgical navigation positioning robot 01.
- FIG. 3 is a flowchart of a preferred embodiment of the control method of the surgical navigation positioning robot of the present invention.
- The method steps of the control method are implemented by a computer software program in the form of computer program instructions stored in a computer-readable storage medium (e.g., the memory 31).
- The storage medium may include a read-only memory, a random access memory, a magnetic disk, an optical disk, or the like, and the computer program instructions can be loaded by a processor to perform the following steps S31 to S40.
- Step S31: control the ultrasonic probe of the first robot arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery. Specifically, the microcontroller 30 controls the first robot arm 1 to move near the patient's operating table 02 and activates the ultrasonic probe 11 to capture, in real time, ultrasound images of the target tissue and organ of the patient on the operating table 02 during surgery.
- The ultrasonic probe 11 may be a three-dimensional ultrasonic probe that acquires three-dimensional ultrasound images of the target tissue and organ in real time by emitting a pyramid-shaped volumetric ultrasound beam, and sends the three-dimensional ultrasound images to the microcontroller 30.
- Step S32: compare the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ. Specifically, the microcontroller 30 reads the reference image of the normal tissue and organ from the memory 31 and compares the ultrasound image of the target tissue and organ with it to locate the lesion area.
- The microcontroller 30 determines the texture distribution difference between the ultrasound image of the target tissue and organ and the reference image of the normal tissue and organ, and locates the lesion area according to that difference; the texture distribution difference includes the differences in tissue structure, size, and outline contour that arise when a human tissue or organ becomes diseased.
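The patent does not specify the texture metric or require registered images, so the following is only a minimal illustrative sketch in Python. It assumes co-registered 2-D grayscale NumPy arrays and uses local variance as a stand-in texture measure; `texture_map`, `locate_lesion`, and the threshold rule are hypothetical choices, not the patented method.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def texture_map(img, win=15):
    """Local variance over a sliding window, used as a crude texture measure."""
    img = img.astype(np.float64)
    mean = uniform_filter(img, win)
    sq_mean = uniform_filter(img ** 2, win)
    return sq_mean - mean ** 2

def locate_lesion(target_img, reference_img, k=4.0):
    """Flag pixels whose texture deviates strongly from the reference image
    (step S32) and return the mask plus the centroid of the flagged region."""
    diff = np.abs(texture_map(target_img) - texture_map(reference_img))
    mask = diff > diff.mean() + k * diff.std()
    ys, xs = np.nonzero(mask)
    centroid = (float(ys.mean()), float(xs.mean())) if ys.size else None
    return mask, centroid
```

A real system would first register the live ultrasound image to the reference image; without that alignment, a pixel-wise texture comparison is not meaningful.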
- Step S33: take the operating table on which the patient lies flat as the horizontal plane and establish a spatial coordinate system from the position and direction of the ultrasonic probe. Specifically, the microcontroller 30 takes the operating table 02 on which the patient lies flat as the horizontal plane and establishes the spatial coordinate system from the position and direction of the ultrasonic probe 11.
- The position coordinates of the lesion area include the position and direction of the lesion area relative to the ultrasonic probe 11.
- The pathological positioning module 102 establishes the spatial coordinate system XYZ, taking the operating table 02 on which the patient lies as the horizontal plane and using the position and direction of the ultrasonic probe 11.
- Step S34: calculate the position coordinates of the lesion area based on the spatial coordinate system. Specifically, the microcontroller 30 uses the position and direction of the ultrasonic probe 11 in the spatial coordinate system XYZ to calculate the position coordinates of any point of the ultrasound image in that system, from which the position and direction of the target tissue relative to the ultrasonic probe 11 can be determined.
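In standard robotics terms, step S34 applies the probe's pose (a rotation plus a translation relative to the table frame) to points found in the ultrasound image. A brief sketch follows; the pose values are illustrative numbers, not parameters from the patent.

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical probe pose in the table frame XYZ: tilted 30 degrees about X,
# held 0.2 m above the table surface.
theta = np.deg2rad(30)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])
T_table_probe = pose_matrix(R, np.array([0.0, 0.0, 0.2]))

# A lesion point expressed in the probe's image frame (metres, homogeneous).
p_probe = np.array([0.01, 0.03, 0.08, 1.0])
p_table = T_table_probe @ p_probe  # the same point in table coordinates
```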
- Step S35: generate a surgical navigation instruction according to the position coordinates of the lesion area. Specifically, the microcontroller 30 generates the surgical navigation instruction from the position coordinates of the lesion area.
- The surgical navigation instruction includes distance and direction information between the first robot arm 1 and the lesion area of the target tissue and organ.
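One plausible encoding of such an instruction is a distance plus a unit direction vector, both in the table frame. The function and field names below are hypothetical, since the patent does not define a message format.

```python
import numpy as np

def navigation_command(arm_pos, lesion_pos):
    """Distance and unit direction from the first arm to the lesion (step S35)."""
    delta = np.asarray(lesion_pos, dtype=float) - np.asarray(arm_pos, dtype=float)
    distance = float(np.linalg.norm(delta))
    direction = delta / distance if distance > 0 else np.zeros(3)
    return {"distance_m": distance, "direction": direction}

cmd = navigation_command(arm_pos=[0.50, 0.20, 0.40], lesion_pos=[0.42, 0.35, 0.15])
# distance_m is about 0.30; direction is roughly [-0.26, 0.50, -0.83]
```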
- Step S36: control the infrared positioner in the ultrasonic probe to generate an infrared guide light point. Specifically, the microcontroller 30 controls the infrared positioner 12 in the ultrasonic probe 11 to generate the infrared guide light point.
- The infrared guide light point is a visible infrared dot used to guide the doctor to the location of the tissue or organ lesion during the patient's surgery.
- Step S37: drive the moving direction of the first robot arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area. Specifically, the microcontroller 30 controls the moving direction of the first mechanical arm 1 according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area of the target tissue and organ, which helps the doctor find the lesion location quickly and accurately, facilitating the operation and improving surgical efficiency.
- Step S38: when an operation command input from the operation handle 33 is received, generate, according to the operation command, a drive signal that drives the 3D display 21 on the second robot arm 2 to move in front of the doctor's eyes.
- When the microcontroller 30 receives an operation command manually input by the doctor through the operation handle 33, it generates, according to that command, a drive signal that drives the second robot arm 2 to move in front of the doctor's eyes.
- The operation commands include moving left, moving right, moving up, and moving down. That is, the doctor or nurse can manually toggle the operation handle 33 to the left, right, forward, or backward, so that the second robot arm 2 moves the 3D display 21 left, right, up, or down to the required final position.
- Specifically, the microcontroller 30 calculates, according to the operation command, the final position to which the 3D display 21 is to be moved, calculates from that final position the angle through which the second robot arm 2 is to be driven, and then generates and transmits a drive signal that drives the second robot arm 2 through the calculated angle.
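Computing "the angle to be driven" from a target position is an inverse-kinematics problem. As a minimal sketch, assume the display arm behaves like a planar two-link arm with hypothetical link lengths; the patent does not disclose the arm's actual kinematics.

```python
import numpy as np

def two_link_ik(x, y, l1=0.4, l2=0.3):
    """Joint angles (radians) placing the tip of a planar two-link arm at (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target position is out of reach")
    q2 = np.arccos(c2)  # elbow angle (one of the two possible solutions)
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2

# Drive the display to a hypothetical final position 0.5 m forward, 0.3 m up.
q1, q2 = two_link_ik(0.5, 0.3)
```

A production controller would add joint limits, singularity handling, and smooth trajectory generation between the current and final poses.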
- Step S39: drive the second robot arm 2 according to the drive signal so that the 3D display 21 moves in front of the doctor's eyes.
- The microcontroller 30 drives the second robot arm 2 according to the drive signal, thereby moving the 3D display 21 mounted on the second robot arm 2 in front of the doctor's eyes and providing the doctor with the most convenient position for viewing the ultrasound image on the 3D display 21.
- Step S40: display the ultrasound image of the target tissue and organ on the 3D display 21. Because the 3D display 21 can be moved to the position the doctor requires, the doctor can adopt any posture during surgery and view the ultrasound images needed during the operation as a surgical reference, improving the accuracy and safety of the surgery. Therefore, when the surgical navigation positioning robot 01 assists an operation, not only the doctor but also an assistant or nurse can freely adjust the position of the 3D display 21 to view the 3D ultrasound images and help the doctor complete the surgery accurately.
- The surgical navigation positioning robot of the present invention acquires ultrasound images of the target tissue and organ during surgery through the ultrasonic probe 11 provided on the first robot arm 1, and tracks and locates the lesion area in real time through the infrared positioner 12, so that the position of the lesion area is clearly visible and the robot automatically navigates to the lesion area to guide the doctor's operation, facilitating the operation and improving surgical efficiency.
- In addition, the 3D display 21 provided on the second robot arm 2 can move freely to the position the doctor requires, and the ultrasound image is displayed on the 3D display 21 as a surgical reference for the doctor during the operation, improving the accuracy and safety of the surgery.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Robotics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Gynecology & Obstetrics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
- Surgical Instruments (AREA)
Abstract
A surgical navigation positioning robot (01) and a control method therefor. The surgical navigation positioning robot (01) comprises a first mechanical arm (1), a second mechanical arm (2), and a robot body (3). An ultrasonic probe (11) is provided on the first mechanical arm (1), a 3D display (21) is provided on the second mechanical arm (2), an infrared positioner (12) is provided on the outer surface of the ultrasonic probe (11), an operation handle (33) is provided on the outer surface of the robot body (3), and a microcontroller (30) and a memory (31) are provided inside the robot body (3). The surgical navigation positioning robot (01) acquires ultrasound images of the target tissue and organ during surgery by means of the ultrasonic probe (11), and tracks and locates the lesion area of the target tissue and organ in real time by means of the infrared positioner (12), which facilitates the doctor's operation and improves surgical efficiency. In addition, the 3D display (21) provided on the second mechanical arm (2) can move freely to the position required by the doctor, and the ultrasound image is displayed on the 3D display (21) as a surgical reference for the doctor, improving the accuracy and safety of the surgery.
Description
The present invention relates to the technical field of medical devices, and in particular to a surgical navigation positioning robot and a control method therefor.
Surgery refers to the process of using medical devices to cut, incise, or otherwise manipulate a patient's skin, mucous membranes, or other tissue in order to treat a pathological condition. Surgical procedures such as laparotomy, in which the skin is cut open and internal organs are treated, reconstructed, or removed, may involve problems of blood loss, side effects, pain, and scarring, so the use of surgical robots is currently regarded as a popular alternative.
At present, surgical robots can neither automatically identify the lesion area of a tissue or organ nor automatically guide the doctor to the lesion area, and therefore cannot ensure the accuracy and safety of the operation. Moreover, when a surgical robot is used, the doctor operates inside the patient's body while watching the surgical image shown on a display. However, because the position of the display is generally fixed, it cannot be moved to suit the doctor's posture and position while the doctor sits or stands during robotic surgery, which makes it inconvenient for the doctor to view the surgical image on the display during the operation.
The main object of the present invention is to provide a surgical navigation positioning robot and a control method therefor, which aim to solve the problems that existing surgical robots cannot automatically recognize the lesion area of a tissue or organ and that the surgical image acquired by the surgical robot cannot move freely according to the doctor's posture and position.
To achieve the above object, the present invention provides a surgical navigation positioning robot comprising a first mechanical arm, a second mechanical arm, and a robot body. An ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared positioner is provided on the outer surface of the ultrasonic probe, an operation handle is provided on the outer surface of the robot body, and a microcontroller adapted to execute program instructions and a memory adapted to store the program instructions are provided inside the robot body. The program instructions are loaded by the microcontroller and perform the following steps: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery; comparing the ultrasound image of the target tissue and organ with a reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ; establishing a spatial coordinate system taking the operating table on which the patient lies flat as the horizontal plane and using the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion area based on the spatial coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion area, and controlling the infrared positioner in the ultrasonic probe to generate an infrared guide light point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area; when an operation instruction input from the operation handle is received, generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and displaying the ultrasound image of the target tissue and organ on the 3D display.
Preferably, the step of comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ comprises the steps of: reading the reference image of the normal tissue and organ from the memory; and comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to determine the texture distribution difference between the two, and locating the lesion area of the target tissue and organ according to that difference.
Preferably, the step of generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes comprises the steps of: calculating, according to the operation instruction, the final position to which the 3D display is to be moved; calculating, from the final position of the 3D display, the angle through which the second mechanical arm is to be driven; and generating and transmitting a drive signal that drives the second mechanical arm through the calculated angle.
Preferably, the ultrasonic probe, the infrared positioner, the operation handle, the 3D display, and the memory are all electrically connected to the microcontroller.
Preferably, the robot body is further provided with a power supply device that includes a rechargeable lithium battery and a charging base; the lithium battery is electrically connected to the microcontroller, and the charging base is electrically connected to the lithium battery.
The present invention also provides a control method for a surgical navigation positioning robot comprising a first mechanical arm, a second mechanical arm, and a robot body, wherein an ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared positioner is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body. The control method comprises the steps of: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery; comparing the ultrasound image of the target tissue and organ with a reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ; establishing a spatial coordinate system taking the operating table on which the patient lies flat as the horizontal plane and using the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion area based on the spatial coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion area, and controlling the infrared positioner in the ultrasonic probe to generate an infrared guide light point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area; when an operation instruction input from the operation handle is received, generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and displaying the ultrasound image of the target tissue and organ on the 3D display.
Preferably, the step of comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ comprises the steps of: reading the reference image of the normal tissue and organ from the memory; comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to determine the texture distribution difference between the two; and locating the lesion area of the target tissue and organ according to the texture distribution difference.
Preferably, the texture distribution difference includes the differences in tissue structure, size, and outline contour that arise when a human tissue or organ becomes diseased.
Preferably, the step of generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes comprises the steps of: calculating, according to the operation instruction, the final position to which the 3D display is to be moved; calculating, from the final position of the 3D display, the angle through which the second mechanical arm is to be driven; and generating and transmitting a drive signal that drives the second mechanical arm through the calculated angle.
Preferably, the surgical navigation instruction includes distance and direction information between the first mechanical arm and the lesion area of the target tissue and organ, and the infrared guide light point is a visible infrared dot that guides the doctor to the location of the tissue or organ lesion during the patient's surgery.
Compared with the prior art, the surgical navigation positioning robot and its control method according to the present invention adopt the above technical solution and achieve the following technical effects: the ultrasonic probe 11 provided on the first mechanical arm acquires ultrasound images of the target tissue and organ during surgery, and the infrared positioner tracks and locates the lesion area of the target tissue and organ in real time, so that the position of the lesion area is clearly visible and the robot automatically navigates to the lesion area to guide the doctor's operation, which facilitates the operation and improves surgical efficiency. In addition, the 3D display provided on the second mechanical arm can move freely to the position required by the doctor, and the ultrasound image is displayed on the 3D display as a surgical reference for the doctor during the operation, thereby improving the accuracy and safety of the surgery.
FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the surgical navigation positioning robot of the present invention;
FIG. 2 is a schematic diagram of the internal circuit connections of the surgical navigation positioning robot of the present invention.
FIG. 3 is a flowchart of a preferred embodiment of the control method of the surgical navigation positioning robot of the present invention.
The achievement of the objects, the functional features, and the advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
To further explain the technical means adopted by the present invention to achieve the above objects and their effects, the specific embodiments, structures, features, and effects of the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention and not to limit it.
Referring to FIG. 1, FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the surgical navigation positioning robot of the present invention. In this embodiment, the surgical navigation positioning robot 01 can be placed in an operating room, in which an operating table 02 is also placed for the patient to lie flat on during surgery. The medical robot 01 includes, but is not limited to, a first mechanical arm 1, a second mechanical arm 2, and a robot body 3. A microcontroller 30, a memory 31, and a charging device 32 are provided inside the robot body 3, and an operation handle 33 is further provided on the outer surface of the robot body 3.
An ultrasonic probe 11 is provided on the first mechanical arm 1 and electrically connected to the microcontroller 30 for capturing, in real time by ultrasound, images of the patient's tissues and organs during surgery; the ultrasound image is a 3D image of the tissue or organ. An infrared positioner 12 is provided on the outer surface of the ultrasonic probe 11 for generating an infrared guide light point that guides the doctor to the location of the tissue or organ lesion during the patient's surgery.
A 3D display 21 is provided on the second mechanical arm 2 and electrically connected to the microcontroller 30. In the embodiment of the present invention, the 3D display 21 is mounted on the second mechanical arm 2 so that it can move freely with the arm. In the prior art, the 3D display 21 is a 3D display system capable of giving the doctor a stereoscopic virtual sensation; it presents stereoscopic 3D images to the doctor and thereby provides a 3D display effect.
In this embodiment, the 3D display 21 can be implemented as a small, lightweight 3D display module. The 3D display 21 can be coupled to the second mechanical arm 2, which has a certain freedom of movement, so that the doctor can move the 3D display 21 as needed. For example, the 3D display 21 can be moved to suit the doctor's posture, whether the doctor sits or stands during robotic surgery. The second mechanical arm 2 can move the 3D display 21 in front of the doctor's eyes, providing the most convenient position for the doctor to view the images on the 3D display 21.
The microcontroller 30 of this embodiment can drive the first mechanical arm 1 and the second mechanical arm 2 to move freely at certain angles and in certain directions. The microcontroller 30 may be a microprocessor, a microcontroller unit (MCU), or the like embedded in the robot body 3. The microcontroller 30 can receive manual input instructions from the doctor to determine the final position to which the 3D display 21 is to be moved. Once that final position is determined, the microcontroller 30 can calculate the angle through which the second mechanical arm 2 is to be driven, and then generate and transmit a drive signal that drives the second mechanical arm 2 through the calculated angle.
In this embodiment, because the 3D display 21 is coupled to the second mechanical arm 2, the 3D display 21 can be moved to the position the doctor requires, so the doctor can adopt any posture during surgery and view the 3D ultrasound images and lesion-area images needed during the operation. Therefore, during robot-assisted surgery, not only the doctor but also an assistant or nurse can freely adjust the position of the 3D display 21 to view the 3D images and help the doctor complete the surgery accurately.
The microcontroller 30 of this embodiment can transmit the drive signal input through the operation handle 33 to the second mechanical arm 2, thereby moving the 3D display 21 in front of the doctor's eyes. In the case of movement by the doctor's manual operation, when the doctor operates the operation handle 33 to bring the 3D display 21 to eye level, the microcontroller 30 can drive the second mechanical arm 2 according to the input from the operation handle 33 and place the 3D display 21 in front of the doctor's eyes.
Referring to FIG. 2, FIG. 2 is a schematic diagram of the internal circuit connections of the surgical navigation positioning robot of the present invention. In this embodiment, the ultrasonic probe 11, the infrared positioner 12, the 3D display 21, the memory 31, and the power supply device 32 are all electrically connected to the microcontroller 30. Electrical connection here means that each electrical component is connected to the microcontroller 30 through one or more of a conductive line, a signal line, and a control line, so that the microcontroller 30 can control each of the above components to perform its corresponding function.
In this embodiment, the microcontroller 30 may be a central processing unit (CPU), a microprocessor, a microcontroller unit chip (MCU), a data processing chip, or another control unit with data processing capability. The memory 31 may be a read-only memory (ROM), an electrically erasable programmable memory (EEPROM), a flash memory (FLASH), or the like. The memory 31 stores reference images of normal human tissues and organs as well as pre-programmed computer program instructions; the microcontroller 30 can read, load, and execute these instructions from the memory 31 so that the surgical navigation positioning robot 01 can provide surgical guidance during the patient's operation.
The power supply device 32 includes a rechargeable lithium battery 321 and a charging base 322. The lithium battery 321 is electrically connected to the microcontroller 30 and provides operating power for the robot 01. The charging base 322 is electrically connected to the lithium battery 321 and accepts an external power source to charge the lithium battery 321.
The present invention also provides a control method, based on a 3D display function, for a surgical navigation positioning robot, applied to the surgical navigation positioning robot 01. Referring to FIG. 3, FIG. 3 is a flowchart of a preferred embodiment of the control method of the surgical navigation positioning robot of the present invention. In this embodiment, the method steps of the control method are implemented by a computer software program in the form of computer program instructions stored in a computer-readable storage medium (e.g., the memory 31). The storage medium may include a read-only memory, a random access memory, a magnetic disk, an optical disk, or the like, and the computer program instructions can be loaded by a processor to perform the following steps S31 to S40.
Step S31: control the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery. Specifically, the microcontroller 30 controls the first mechanical arm 1 to move near the patient's operating table 02 and activates the ultrasonic probe 11 to capture, in real time, ultrasound images of the target tissue and organ of the patient on the operating table 02 during surgery. In this embodiment, the ultrasonic probe 11 may be a three-dimensional ultrasonic probe that acquires three-dimensional ultrasound images of the target tissue and organ in real time by emitting a pyramid-shaped volumetric ultrasound beam and sends the three-dimensional ultrasound images to the microcontroller 30.
Step S32: compare the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ. Specifically, the microcontroller 30 reads the reference image of the normal tissue and organ from the memory 31 and compares the ultrasound image of the target tissue and organ with it to locate the lesion area. In this embodiment, the microcontroller 30 determines the texture distribution difference between the ultrasound image of the target tissue and organ and the reference image of the normal tissue and organ, and locates the lesion area of the target tissue and organ according to that difference; the texture distribution difference includes the differences in tissue structure, size, and outline contour that arise when a human tissue or organ becomes diseased.
Step S33: take the operating table on which the patient lies flat as the horizontal plane and establish a spatial coordinate system from the position and direction of the ultrasonic probe. Specifically, the microcontroller 30 takes the operating table 02 on which the patient lies flat as the horizontal plane and establishes the spatial coordinate system from the position and direction of the ultrasonic probe 11. In this embodiment, the position coordinates of the lesion area include the position and direction of the lesion area relative to the ultrasonic probe 11. Referring to FIG. 1, the pathological positioning module 102 establishes the spatial coordinate system XYZ, taking the operating table 02 on which the patient lies as the horizontal plane and using the position and direction of the ultrasonic probe 11.
Step S34: calculate the position coordinates of the lesion area based on the spatial coordinate system. Specifically, the microcontroller 30 calculates, from the position and direction of the ultrasonic probe 11 in the spatial coordinate system XYZ, the position coordinates of any point of the ultrasound image in that coordinate system, from which the position and direction of the target tissue and organ relative to the ultrasonic probe 11 can be determined.
Step S35: generate a surgical navigation instruction according to the position coordinates of the lesion area. Specifically, the microcontroller 30 generates the surgical navigation instruction according to the position coordinates of the lesion area. In this embodiment, the surgical navigation instruction includes distance and direction information between the first mechanical arm 1 and the lesion area of the target tissue and organ.
Step S36: control the infrared positioner in the ultrasonic probe to generate an infrared guide light point. Specifically, the microcontroller 30 controls the infrared positioner 12 in the ultrasonic probe 11 to generate the infrared guide light point. In this embodiment, the infrared guide light point is a visible infrared dot used to guide the doctor to the location of the tissue or organ lesion during the patient's surgery.
Step S37: drive the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area. Specifically, the microcontroller 30 controls the moving direction of the first mechanical arm 1 according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area of the target tissue and organ, which helps the doctor find the lesion location quickly and accurately, facilitating the operation and improving surgical efficiency.
Step S38: when an operation instruction input from the operation handle 33 is received, generate, according to the operation instruction, a drive signal that drives the 3D display 21 on the second mechanical arm 2 to move in front of the doctor's eyes. In this embodiment, when the microcontroller 30 receives an operation instruction manually input by the doctor through the operation handle 33, it generates, according to that instruction, a drive signal that drives the second mechanical arm 2 to move in front of the doctor's eyes. The operation instructions include moving left, moving right, moving up, and moving down; that is, the doctor or nurse can manually toggle the operation handle 33 left, right, forward, or backward so that the second mechanical arm 2 moves the 3D display 21 left, right, up, or down to the required final position. Specifically, the microcontroller 30 calculates, according to the operation instruction, the final position to which the 3D display 21 is to be moved, calculates from that final position the angle through which the second mechanical arm 2 is to be driven, and then generates and transmits a drive signal that drives the second mechanical arm 2 through the calculated angle.
Step S39: drive the second mechanical arm 2 according to the drive signal so that the 3D display 21 moves in front of the doctor's eyes. Specifically, the microcontroller 30 drives the second mechanical arm 2 according to the drive signal, thereby moving the 3D display 21 mounted on the second mechanical arm 2 in front of the doctor's eyes and providing the doctor with the most convenient position for viewing the ultrasound image on the 3D display 21.
Step S40: display the ultrasound image of the target tissue and organ on the 3D display 21. Because the 3D display 21 can be moved to the position the doctor requires, the doctor can adopt any posture during surgery and view the ultrasound images needed during the operation as a surgical reference, improving the accuracy and safety of the surgery. Therefore, when the surgical navigation positioning robot 01 assists an operation, not only the doctor but also an assistant or nurse can freely adjust the position of the 3D display 21 to view the 3D ultrasound images and help the doctor complete the surgery accurately.
The surgical navigation positioning robot of the present invention acquires ultrasound images of the target tissue and organ during surgery through the ultrasonic probe 11 provided on the first mechanical arm 1, and tracks and locates the lesion area of the target tissue and organ in real time through the infrared positioner 12, so that the position of the lesion area is clearly visible and the robot automatically navigates to the lesion area to guide the doctor's operation, facilitating the operation and improving surgical efficiency. In addition, the 3D display 21 provided on the second mechanical arm 2 can move freely to the position the doctor requires, and the ultrasound image is displayed on the 3D display 21 as a surgical reference for the doctor during the operation, improving the accuracy and safety of the surgery.
The above are only preferred embodiments of the present invention and do not thereby limit the patent scope of the present invention. Any equivalent structural or functional transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (10)
- A surgical navigation positioning robot, comprising a first mechanical arm, a second mechanical arm, and a robot body, characterized in that an ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared positioner is provided on the outer surface of the ultrasonic probe, an operation handle is provided on the outer surface of the robot body, and a microcontroller adapted to execute program instructions and a memory adapted to store the program instructions are provided inside the robot body, the program instructions being loaded by the microcontroller to perform the following steps: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery; comparing the ultrasound image of the target tissue and organ with a reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ; establishing a spatial coordinate system taking the operating table on which the patient lies flat as the horizontal plane and using the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion area based on the spatial coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion area, and controlling the infrared positioner to generate an infrared guide light point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area; when an operation instruction input from the operation handle is received, generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and displaying the ultrasound image of the target tissue and organ on the 3D display.
- The surgical navigation positioning robot according to claim 1, characterized in that the step of comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ comprises the steps of: reading the reference image of the normal tissue and organ from the memory; and comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to determine the texture distribution difference between the two, and locating the lesion area of the target tissue and organ according to that difference.
- The surgical navigation positioning robot according to claim 1, characterized in that the step of generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes comprises the steps of: calculating, according to the operation instruction, the final position to which the 3D display is to be moved; calculating, from the final position of the 3D display, the angle through which the second mechanical arm is to be driven; and generating and transmitting a drive signal that drives the second mechanical arm through the calculated angle.
- The surgical navigation positioning robot according to any one of claims 1 to 3, characterized in that the ultrasonic probe, the infrared positioner, the operation handle, the 3D display, and the memory are all electrically connected to the microcontroller.
- The surgical navigation positioning robot according to claim 4, characterized in that the robot body is further provided with a power supply device comprising a rechargeable lithium battery and a charging base, the lithium battery being electrically connected to the microcontroller and the charging base being electrically connected to the lithium battery.
- A control method for a surgical navigation positioning robot, the surgical navigation positioning robot comprising a first mechanical arm, a second mechanical arm, and a robot body, characterized in that an ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared positioner is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body, wherein the control method comprises the steps of: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasound images of the target tissue and organ during the patient's surgery; comparing the ultrasound image of the target tissue and organ with a reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ; establishing a spatial coordinate system taking the operating table on which the patient lies flat as the horizontal plane and using the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion area based on the spatial coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion area, and controlling the infrared positioner to generate an infrared guide light point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide light point illuminates the lesion area; when an operation instruction input from the operation handle is received, generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and displaying the ultrasound image of the target tissue and organ on the 3D display.
- The control method for a surgical navigation positioning robot according to claim 6, characterized in that the step of comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to locate the lesion area of the target tissue and organ comprises the steps of: reading the reference image of the normal tissue and organ from the memory; comparing the ultrasound image of the target tissue and organ with the reference image of the normal tissue and organ to determine the texture distribution difference between the two; and locating the lesion area of the target tissue and organ according to the texture distribution difference.
- The control method for a surgical navigation positioning robot according to claim 7, characterized in that the texture distribution difference includes the differences in tissue structure, size, and outline contour that arise when a human tissue or organ becomes diseased.
- The control method for a surgical navigation positioning robot according to claim 6, characterized in that the step of generating, according to the operation instruction, a drive signal that drives the 3D display on the second mechanical arm to move in front of the doctor's eyes comprises the steps of: calculating, according to the operation instruction, the final position to which the 3D display is to be moved; calculating, from the final position of the 3D display, the angle through which the second mechanical arm is to be driven; and generating and transmitting a drive signal that drives the second mechanical arm through the calculated angle.
- The control method for a surgical navigation positioning robot according to claim 6, characterized in that the surgical navigation instruction includes distance and direction information between the first mechanical arm and the lesion area of the target tissue and organ, and the infrared guide light point is a visible infrared dot that guides the doctor to the location of the tissue or organ lesion during the patient's surgery.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711027805.5A CN107714178A (zh) | 2017-10-28 | 2017-10-28 | Surgical navigation positioning robot and control method therefor
CN201711027805.5 | 2017-10-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019080317A1 (zh) | 2019-05-02 |
Family
ID=61203061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/116668 WO2019080317A1 (zh) | Surgical navigation positioning robot and control method therefor | 2017-10-28 | 2017-12-15 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107714178A (zh) |
WO (1) | WO2019080317A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108742876A (zh) * | 2018-08-02 | 2018-11-06 | 雅客智慧(北京)科技有限公司 | Surgical navigation device |
CN109938768A (zh) * | 2019-03-11 | 2019-06-28 | 深圳市比邻星精密技术有限公司 | Ultrasonic imaging method and apparatus, computer device, and storage medium |
CN112603546A (zh) * | 2020-12-24 | 2021-04-06 | 哈尔滨思哲睿智能医疗设备有限公司 | Remote surgery system based on a laparoscopic surgical robot and control method |
CN113855123B (zh) * | 2021-11-12 | 2023-04-25 | 郑州大学第一附属医院 | Surgical operation assisting robot |
- 2017-10-28: CN application CN201711027805.5A filed; published as CN107714178A (status: active, pending)
- 2017-12-15: PCT application PCT/CN2017/116668 filed; published as WO2019080317A1 (status: active, application filing)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2710567Y (zh) * | 2004-07-22 | 2005-07-20 | 上海英迈吉东影图像设备有限公司 | Mechanical arm for a surgical navigation system |
CN101019765A (zh) * | 2007-03-29 | 2007-08-22 | 新奥博为技术有限公司 | Surgical system and surgical navigation method guided by magnetic resonance images |
CN101375805A (zh) * | 2007-12-29 | 2009-03-04 | 清华大学深圳研究生院 | Method and system for computer-assisted guidance of electronic endoscope operation |
US20130172904A1 (en) * | 2011-12-29 | 2013-07-04 | Mako Surgical Corporation | Interactive CSG Subtraction |
CN103908345A (zh) * | 2012-12-31 | 2014-07-09 | 复旦大学 | Tablet-computer-based volume data visualization method for surgical navigation |
CN105943161A (zh) * | 2016-06-04 | 2016-09-21 | 深圳市前海康启源科技有限公司 | Surgical navigation system and method based on a medical robot |
CN206063225U (zh) * | 2016-06-04 | 2017-04-05 | 深圳市前海康启源科技有限公司 | Medical robot for assisting surgery |
Also Published As
Publication number | Publication date |
---|---|
CN107714178A (zh) | 2018-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240180647A1 (en) | Robotic Port Placement Guide and Method of Use | |
ES2907252T3 (es) | 2022-04-18 | System for performing automated surgical and interventional procedures | |
WO2019080358A1 (zh) | 2019-05-02 | 3D image surgical navigation robot and control method therefor | |
KR20200071743A (ko) | 2020-06-19 | Robotic system providing an indication of a boundary for a robot arm | |
Hamed et al. | Advances in haptics, tactile sensing, and manipulation for robot‐assisted minimally invasive surgery, noninvasive surgery, and diagnosis | |
US11602403B2 (en) | Robotic tool control | |
WO2019080317A1 (zh) | 2019-05-02 | Surgical navigation positioning robot and control method therefor | |
WO2017206519A1 (zh) | 2017-12-07 | Surgical navigation system and method based on a medical robot | |
Rassweiler et al. | Surgical navigation in urology: European perspective | |
CN111867438A (zh) | 2020-10-30 | Surgical assistance device, surgical method, non-transitory computer-readable medium, and surgical assistance system | |
JP2010200894A (ja) | 2010-09-16 | Surgery support system and surgical robot system | |
JP2021166593A (ja) | 2021-10-21 | Robotic surgery support system, robotic surgery support method, and program | |
WO2023246521A1 (zh) | 2023-12-28 | Mixed-reality-based lesion localization method and apparatus, and electronic device | |
Liu et al. | Laparoscopic stereoscopic augmented reality: toward a clinically viable electromagnetic tracking solution | |
CN109431606A (zh) | 2019-03-08 | Combined vascular interventional surgical robot system and method of use | |
WO2021260545A1 (en) | Control scheme calibration for medical instruments | |
WO2022166929A1 (zh) | 2022-08-11 | Computer-readable storage medium, electronic device, and surgical robot system | |
Kladko et al. | Magnetosurgery: Principles, design, and applications | |
Looi et al. | KidsArm—An image-guided pediatric anastomosis robot | |
CN115192195A (zh) | 2022-10-18 | Computer-readable storage medium, electronic device, and surgical robot system | |
JP2021153773A (ja) | 2021-10-07 | Robotic surgery support device, surgery support robot, robotic surgery support method, and program | |
US20220061941A1 (en) | Robotic collision boundary determination | |
CN115429438A (zh) | 2022-12-06 | Fixed-point follow-up adjustment system for a support device and surgical robot system | |
US11711596B2 (en) | System and methods for determining proximity relative to an anatomical structure | |
JP7414611B2 (ja) | 2024-01-16 | Robotic surgery support device, processing method, and program | |
Legal Events
Code | Title | Description |
---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 17929907; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
32PN | EP: public notification in the EP bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23.09.2020) |
122 | EP: PCT application non-entry in European phase | Ref document number: 17929907; Country of ref document: EP; Kind code of ref document: A1 |