CN109330685B - Automatic navigation method for laparoscope of porous abdominal cavity surgical robot - Google Patents

Automatic navigation method for laparoscope of porous abdominal cavity surgical robot

Info

Publication number
CN109330685B
CN109330685B CN201811285372.8A
Authority
CN
China
Prior art keywords
surgical instruments
interventional surgical
interventional
laparoscope
field image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811285372.8A
Other languages
Chinese (zh)
Other versions
CN109330685A (en
Inventor
周正东 (Zhou Zhengdong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201811285372.8A priority Critical patent/CN109330685B/en
Publication of CN109330685A publication Critical patent/CN109330685A/en
Application granted granted Critical
Publication of CN109330685B publication Critical patent/CN109330685B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B2034/301Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B2034/302Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A50/00TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE in human health protection, e.g. against extreme weather
    • Y02A50/30Against vector-borne diseases, e.g. mosquito-borne, fly-borne, tick-borne or waterborne diseases whose impact is exacerbated by climate change

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The invention discloses an automatic laparoscope navigation method for a multi-port abdominal surgical robot. Using interventional surgical instruments that carry several marker points, together with surgical field image processing, the method automatically re-centers the surgical field image and automatically adjusts the laparoscope focal length, so that the observed image stays clear and the distal ends of the instruments always remain within a suitable field of view for the surgeon, thereby effectively improving surgical quality and efficiency and reducing the surgeon's workload.

Description

Automatic laparoscope navigation method for a multi-port abdominal surgical robot
Technical Field
The invention relates to the field of minimally invasive surgery, and in particular to an automatic laparoscope navigation technique for a multi-port abdominal surgical robot.
Background
A multi-port abdominal surgical robot consists of a robot control system, robot arms, interventional surgical instruments, and a laparoscope formed by a binocular camera. Guided by the laparoscopic images, the surgeon operates the master manipulators of the robot control system with both hands to control the movement and operation of the interventional surgical instruments. Such a robot lets the surgeon perform minimally invasive surgery more conveniently, effectively reduces surgeon fatigue, and improves surgical quality and efficiency; because the incisions are small, patients also recover more quickly.
During laparoscopic surgery, the surgeon relies on stereoscopic feedback of the surgical field from a binocular camera (the laparoscope) placed inside the patient in order to perform precise surgical procedures. To maintain a proper viewing field and high image quality, the camera's pose and focal length must be adjusted promptly as the surgery proceeds, following the movement and operation of the surgical instruments. Current approaches include voice control, tracking the surgeon's gaze, and foot-pedal control, but all of them either add an extra burden on the surgeon or make flexible, accurate camera control difficult.
Therefore, a convenient and accurate scheme for adjusting the camera's pose and focal length during surgery is urgently needed in clinical practice.
Chinese patent application No. 201580025333.2 proposes "a system and method for controlling camera position in a surgical robot system", which switches control modes to drive the camera manually in master-slave fashion. In that patent, a robotic surgical system includes at least one robotic arm, a camera, and a console. The console includes a first handle, a second handle, and a selector switch that chooses between a robot-control mode and a camera-control mode: in robot-control mode the handles drive at least one robotic arm, and in camera-control mode they drive the camera. This approach requires switching between two operating modes, responds poorly in real time, and adds to the surgeon's workload.
In contrast, using interventional surgical instruments that carry several marker points, together with surgical field image processing, to automatically adjust the laparoscope pose and camera focal length is economical, convenient, and accurate; it can effectively reduce the surgeon's burden and improve surgical quality and efficiency.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an automatic laparoscope navigation method for a multi-port abdominal surgical robot that remedies the shortcomings described in the background.
The invention adopts the following technical scheme for solving the technical problems:
an automatic laparoscope navigation method for a multi-port abdominal surgical robot comprises the following steps:
step 1), a clinician adjusts the pose of the laparoscope according to the patient's lesion area, so that the lesion lies within the surgical field of view, the image is clear, and the distal ends of the two interventional surgical instruments are within the field of view; the distal end of each interventional surgical instrument carries at least two marker points marking the straight line along which the instrument lies, and the distance from each marker point to the instrument tip is recorded;
step 2), calculating the intersection point O₀ of the straight lines along which the distal ends of the two current interventional surgical instruments lie, the current laparoscope focal length f₀, and the angle α₀ formed by connecting the tips of the two current instruments to the center of the surgical field image;
step 2.1), obtaining the current surgical field image and the laparoscope focal length f₀: acquiring an image of the current surgical field with the image acquisition unit, and recording the current laparoscope focal length f₀;
step 2.2), obtaining the image coordinates of all marker points on the distal ends of the two interventional surgical instruments: detecting all marker points at the distal ends of the two instruments in the surgical field image, and recording their image coordinates;
step 2.3), obtaining the equations of the straight lines along which the two interventional surgical instruments lie and their tip coordinates: determining, from the marker point coordinates, the equation of the line of each instrument's distal end in the surgical field image, and calculating each instrument's tip coordinates from the distances between the marker points and the tip;
step 2.4), calculating the intersection point O₀ of the two straight lines from their equations in the surgical field image;
step 2.5), calculating the angle α₀ formed by connecting the tips of the two interventional surgical instruments to the center of the surgical field image: let the tips of the two instruments be E₁ and E₂ and the center of the surgical field image be O; drawing E₁O and E₂O, the angle between E₁O and E₂O is α₀;
step 3), after the distal ends of the two interventional surgical instruments move, calculating the intersection point O₁ of the straight lines along which they lie and the angle α₁ formed by connecting the two instrument tips to the center of the surgical field image;
step 3.1), acquiring the post-movement surgical field image;
step 3.2), detecting all marker points at the distal ends of the two instruments in the post-movement surgical field image, and recording their image coordinates;
step 3.3), obtaining the equations of the straight lines along which the two moved instruments lie and their tip coordinates: determining the line equations from the moved marker point coordinates, and calculating each instrument's post-movement tip coordinates from the marker-to-tip distances;
step 3.4), calculating the intersection point O₁ of the two straight lines from their post-movement equations in the surgical field image;
step 3.5), calculating the angle α₁ formed by connecting the post-movement instrument tips to the center of the surgical field image;
step 4), calculating the angle change ratio r = α₁/α₀, and adjusting the laparoscope focal length f according to r: f = f₀/r;
step 5), adjusting the pose of the laparoscope so that the center of the surgical field image moves by the same displacement as that between the intersection points O₀ and O₁.
As a further refinement of the automatic laparoscope navigation method for a multi-port abdominal surgical robot, the distal ends of the two interventional surgical instruments each carry three spherical marker points.
Compared with the prior art, the technical scheme provided by the invention has the following technical effects:
the invention utilizes the surgical instruments carrying a plurality of mark points and the surgical field image processing technology, automatically adjusts the posture of the laparoscope and the focal length of the laparoscope through two steps of the initialization of the laparoscopic navigation and the real-time automatic navigation, has the advantages of economy, convenience, accuracy and the like, can effectively reduce the burden of doctors, improves the surgical quality and efficiency, and is beneficial to clinical diagnosis and treatment.
Drawings
FIG. 1 is a schematic flow chart of the present invention;
FIG. 2 is a schematic of the relationship between the tips of the two interventional surgical instruments, the center of the surgical field image, and the intersection point O₀;
FIG. 3 is a schematic view of the distal marker points of one of the interventional surgical instruments of the present invention.
Detailed Description
The technical scheme of the invention is further described in detail below with reference to the accompanying drawings:
as shown in fig. 1, the invention discloses a laparoscopic automatic navigation method of a porous abdominal operation robot, which comprises the following steps:
step 1), a clinician adjusts the pose of the laparoscope according to the patient's lesion area, so that the lesion lies within the surgical field of view, the image is clear, and the distal ends of the two interventional surgical instruments are within the field of view; the distal end of each interventional surgical instrument carries at least two marker points marking the straight line along which the instrument lies, and the distance from each marker point to the instrument tip is recorded;
step 2), calculating the intersection point coordinate O of the straight line where the distal ends of the two current interventional surgical instruments are located 0 Focal length f of current laparoscope 0 And an included angle alpha formed by connecting the tail ends of the two current interventional surgical instruments with the center of the surgical field image 0
step 2.1), obtaining the current surgical field image and the laparoscope focal length f₀: acquiring an image of the current surgical field with the image acquisition unit, and recording the current laparoscope focal length f₀;
step 2.2), obtaining the image coordinates of all marker points on the distal ends of the two interventional surgical instruments: detecting all marker points at the distal ends of the two instruments in the surgical field image, and recording their image coordinates;
step 2.3), obtaining equations and terminal coordinates of straight lines where two interventional surgical instruments are located: determining a linear equation of the distal ends of the two interventional surgical instruments in the surgical field image according to the coordinates of each marking point, and calculating the tail end coordinates of each interventional surgical instrument according to the distances between each marking point and the tail end of each interventional surgical instrument; when the tail end coordinates of each interventional surgical instrument are calculated, only the distance between one marking point and the tail end of the interventional surgical instrument is needed to be known, but the distances between other marking points and the tail end of the interventional surgical instrument can be used for correction, so that the correction is more accurate;
step 2.4), calculating the intersection point O₀ of the two straight lines from their equations in the surgical field image;
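Step 2.4) amounts to intersecting two parametric image-plane lines. A minimal sketch (the helper name is an assumption; each line is given as a point plus a direction, and near-parallel axes, which define no single point, are reported as None):

```python
def line_intersection(p1, d1, p2, d2, eps=1e-9):
    """Intersect the 2-D lines p1 + t*d1 and p2 + s*d2.

    Returns the intersection point, or None when the two instrument
    axes are (nearly) parallel.
    """
    det = d1[0] * d2[1] - d1[1] * d2[0]    # cross product of directions
    if abs(det) < eps:
        return None
    # Cramer's rule for t in p1 + t*d1 = p2 + s*d2.
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```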
step 2.5), calculating the angle α₀ formed by connecting the tips of the two interventional surgical instruments to the center of the surgical field image: as shown in FIG. 2, let the tips of the two instruments be E₁ and E₂ and the center of the surgical field image be O; drawing E₁O and E₂O, the angle between E₁O and E₂O is α₀;
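The angle of step 2.5) is the angle at the image center O between the rays toward the two instrument tips E₁ and E₂, which follows from the dot product. A sketch (function name is an assumption):

```python
import math

def angle_at_center(e1, e2, center):
    """Angle in radians between the rays center->e1 and center->e2."""
    v1 = (e1[0] - center[0], e1[1] - center[1])
    v2 = (e2[0] - center[0], e2[1] - center[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point overshoot.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
```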
step 3), after the distal ends of the two interventional surgical instruments move, calculating the intersection point O₁ of the straight lines along which they lie and the angle α₁ formed by connecting the two instrument tips to the center of the surgical field image;
step 3.1), acquiring the post-movement surgical field image;
step 3.2), detecting all marker points at the distal ends of the two instruments in the post-movement surgical field image, and recording their image coordinates;
step 3.3), obtaining the equations of the straight lines along which the two moved instruments lie and their tip coordinates: determining the line equations from the moved marker point coordinates, and calculating each instrument's post-movement tip coordinates from the marker-to-tip distances;
step 3.4), calculating the intersection point O₁ of the two straight lines from their post-movement equations in the surgical field image;
step 3.5), calculating the angle α₁ formed by connecting the post-movement instrument tips to the center of the surgical field image;
step 4), calculating the included angle change ratio r = α₁/α₀, and adjusting the laparoscope focal length f according to r: f = f₀/r;
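The zoom rule of step 4) keeps the apparent angular spread of the instrument tips roughly constant: if the tips spread apart (α₁ > α₀, so r > 1) the focal length is reduced to zoom out, and vice versa. A one-line sketch of the update (function name is an assumption):

```python
def updated_focal_length(f0, alpha0, alpha1):
    """Apply the rule r = alpha1 / alpha0, f = f0 / r from step 4)."""
    r = alpha1 / alpha0
    return f0 / r          # equivalently f0 * alpha0 / alpha1
```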
step 5), adjusting the pose of the laparoscope so that the center of the surgical field image moves by the same displacement as that between the intersection points O₀ and O₁.
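Step 5) pans the laparoscope so that the image center follows the instruments' intersection point; the required image-plane displacement is simply O₁ minus O₀. A sketch (function name is an assumption; mapping this displacement to actual joint commands depends on the robot kinematics and is outside the scope of the sketch):

```python
def recentering_displacement(o0, o1):
    """Image-plane displacement the image center must undergo to track
    the instruments' intersection point (step 5)."""
    return (o1[0] - o0[0], o1[1] - o0[1])
```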
As shown in fig. 3, the distal ends of the two interventional surgical instruments are each provided with three spherical marker points.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
While the foregoing is directed to embodiments of the present invention, it should be understood that the description is merely illustrative and does not limit the scope of the invention; modifications, equivalents, and improvements made within the spirit and principles of the invention fall within its scope.

Claims (2)

1. An automatic laparoscope navigation system for a multi-port abdominal surgical robot, characterized by comprising a distance recording module, a pre-movement pose obtaining module, a post-movement pose obtaining module, a laparoscope focal-length adjusting module, and a laparoscope pose adjusting module;
the distance recording module is used for recording, for each interventional surgical instrument, the distance between each of the at least two marker points marking the straight line along which the instrument lies and the instrument tip;
the pre-movement pose obtaining module is used for calculating the intersection point O₀ of the straight lines along which the distal ends of the two current interventional surgical instruments lie, the current laparoscope focal length f₀, and the angle α₀ formed by connecting the tips of the two current instruments to the center of the surgical field image; the pre-movement pose obtaining module executes the following steps 2.1) to 2.5) in order;
step 2.1), obtaining the current surgical field image and the laparoscope focal length f₀: the pre-movement pose obtaining module acquires an image of the current surgical field with the image acquisition unit and records the current laparoscope focal length f₀;
step 2.2), obtaining the image coordinates of all marker points on the distal ends of the two interventional surgical instruments: the pre-movement pose obtaining module detects all marker points at the distal ends of the two instruments in the surgical field image and records their image coordinates;
step 2.3), obtaining the equations of the straight lines along which the two instruments lie and their tip coordinates: the pre-movement pose obtaining module determines, from the marker point coordinates, the equation of the line of each instrument's distal end in the surgical field image, and calculates each instrument's tip coordinates from the distances between the marker points and the tip;
step 2.4), the pre-movement pose obtaining module calculates the intersection point O₀ of the two straight lines from their equations in the surgical field image;
step 2.5), the pre-movement pose obtaining module calculates the angle α₀ formed by connecting the tips of the two instruments to the center of the surgical field image: let the tips be E₁ and E₂ and the image center be O; drawing E₁O and E₂O, the angle between E₁O and E₂O is α₀;
the post-movement pose obtaining module is used for calculating, after the distal ends of the two interventional surgical instruments move, the intersection point O₁ of the straight lines along which they lie and the angle α₁ formed by connecting the two instrument tips to the center of the surgical field image; the post-movement pose obtaining module executes the following steps 3.1) to 3.5) in order;
step 3.1), the post-movement pose obtaining module acquires the post-movement surgical field image;
step 3.2), the post-movement pose obtaining module detects all marker points at the distal ends of the two instruments in the post-movement surgical field image and records their image coordinates;
step 3.3), the post-movement pose obtaining module obtains the equations of the straight lines along which the two moved instruments lie and their tip coordinates: determining the line equations from the moved marker point coordinates, and calculating each instrument's post-movement tip coordinates from the marker-to-tip distances;
step 3.4), the post-movement pose obtaining module calculates the intersection point O₁ of the two straight lines from their post-movement equations in the surgical field image;
step 3.5), the post-movement pose obtaining module calculates the angle α₁ formed by connecting the post-movement instrument tips to the center of the surgical field image;
the laparoscope focal-length adjusting module is used for calculating, after the distal ends of the two interventional surgical instruments move, the angle change ratio r = α₁/α₀ and adjusting the laparoscope focal length f according to r: f = f₀/r;
the laparoscope pose adjusting module is used for adjusting, after the focal length f has been adjusted following movement of the two instrument distal ends, the pose of the laparoscope so that the center of the surgical field image moves by the same displacement as that between the intersection points O₀ and O₁.
2. The automatic laparoscope navigation system for a multi-port abdominal surgical robot of claim 1, wherein the distal ends of the two interventional surgical instruments each carry three spherical marker points.
CN201811285372.8A 2018-10-31 2018-10-31 Automatic navigation method for laparoscope of porous abdominal cavity surgical robot Active CN109330685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811285372.8A CN109330685B (en) 2018-10-31 2018-10-31 Automatic navigation method for laparoscope of porous abdominal cavity surgical robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811285372.8A CN109330685B (en) 2018-10-31 2018-10-31 Automatic navigation method for laparoscope of porous abdominal cavity surgical robot

Publications (2)

Publication Number Publication Date
CN109330685A CN109330685A (en) 2019-02-15
CN109330685B true CN109330685B (en) 2024-02-02

Family

ID=65313159

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811285372.8A Active CN109330685B (en) 2018-10-31 2018-10-31 Automatic navigation method for laparoscope of porous abdominal cavity surgical robot

Country Status (1)

Country Link
CN (1) CN109330685B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3946130A4 (en) * 2019-03-27 2023-05-10 Sina Robotics & Medical Innovators Co., Ltd Controlling a laparoscopic instrument
CN112587244A (en) * 2020-12-15 2021-04-02 深圳市精锋医疗科技有限公司 Surgical robot and control method and control device thereof
CN115120353A (en) * 2020-12-15 2022-09-30 深圳市精锋医疗科技股份有限公司 Surgical robot, computer-readable storage medium, and control device
CN114652449A (en) * 2021-01-06 2022-06-24 深圳市精锋医疗科技股份有限公司 Surgical robot and method and control device for guiding surgical arm to move
WO2022166929A1 (en) * 2021-02-03 2022-08-11 上海微创医疗机器人(集团)股份有限公司 Computer-readable storage medium, electronic device, and surgical robot system
CN113633387B (en) * 2021-06-21 2024-01-26 安徽理工大学 Surgical field tracking supporting laparoscopic minimally invasive robot touch interaction method and system
CN114366313B (en) * 2022-03-21 2022-08-02 杭州华匠医学机器人有限公司 Endoscope holding robot control method based on laparoscopic surgical instrument pose

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001089405A1 (en) * 2000-05-22 2001-11-29 Siemens Aktiengesellschaft Fully-automatic, robot-assisted camera guidance using position sensors for laparoscopic interventions
CN1543916A (en) * 2003-11-19 2004-11-10 星 周 Mirror supporting device
CN106256310A (en) * 2016-08-18 2016-12-28 中国科学院深圳先进技术研究院 Method and system for automatically adjusting the pose of a nasal endoscope
CN108524011A (en) * 2018-05-09 2018-09-14 杨琨 System and method for indicating the visual-field focus based on the eye-tracker principle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8079950B2 (en) * 2005-09-29 2011-12-20 Intuitive Surgical Operations, Inc. Autofocus and/or autoscaling in telesurgery
WO2012078989A1 (en) * 2010-12-10 2012-06-14 Wayne State University Intelligent autonomous camera control for robotics with medical, military, and space applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001089405A1 (en) * 2000-05-22 2001-11-29 Siemens Aktiengesellschaft Fully-automatic, robot-assisted camera guidance using position sensors for laparoscopic interventions
CN1543916A (en) * 2003-11-19 2004-11-10 星 周 Mirror supporting device
CN106256310A (en) * 2016-08-18 2016-12-28 中国科学院深圳先进技术研究院 Method and system for automatically adjusting the pose of a nasal endoscope
CN108524011A (en) * 2018-05-09 2018-09-14 杨琨 System and method for indicating the visual-field focus based on the eye-tracker principle

Also Published As

Publication number Publication date
CN109330685A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN109330685B (en) Automatic navigation method for laparoscope of porous abdominal cavity surgical robot
JP6891244B2 (en) Medical devices, systems, and methods that use eye tracking
US11963666B2 (en) Overall endoscopic control system
KR102437404B1 (en) Systems and methods for controlling surgical instruments
CN110200702B9 (en) Medical device, system and method for a stereoscopic viewer with integrated eye gaze tracking
JP2020039934A (en) Robot control of surgical instrument visibility
EP2442743B1 (en) Virtual measurement tool for minimally invasive surgery
CN113180828B (en) Surgical robot constraint motion control method based on rotation theory
JP2012529971A (en) Virtual measurement tool for minimally invasive surgery
WO2018059036A1 (en) Laparoscopic surgery system
US20180256008A1 (en) System for moving first insertable instrument and second insertable instrument, controller, and computer-readable storage device
Noonan et al. Gaze contingent articulated robot control for robot assisted minimally invasive surgery
JP2006312079A (en) Medical manipulator
KR101284087B1 (en) Surgical robot using visual sensor and system and method for analyzing of the surgical robot and system and method for controling of he surgical robot
CN107997822B (en) Minimally invasive surgery positioning system
Dumpert et al. Semi-autonomous surgical tasks using a miniature in vivo surgical robot
Clancy et al. Gaze-contingent autofocus system for robotic-assisted minimally invasive surgery
US20220031502A1 (en) Medical device for eye surgery
CN116350319A (en) Navigation robot system for high-precision neurosurgery minimally invasive puncture operation
CN117860379A (en) Endoscope guiding method under navigation system, electronic equipment and navigation system
JP2023507434A (en) Selection of cursor position on medical images by direction from the distal tip of the probe
CN115005979A (en) Computer-readable storage medium, electronic device, and surgical robot system
CN114098993A (en) Method for acquiring pitching information of master hand
CN117651533A (en) Tubular device navigation method, apparatus and storage medium in multi-bifurcation channel
Looi et al. Image guidance framework with endoscopic video for automated robotic anastomosis in a paediatric setting

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant