CN110742691A - Motion control method for flexible endoscope operation robot


Info

Publication number
CN110742691A
Authority
CN
China
Prior art keywords
organ
displacement
endoscope
image
time
Prior art date
2019-10-21
Legal status
Pending
Application number
CN201910997683.5A
Other languages
Chinese (zh)
Inventor
代煜
陈通
张建勋
Current Assignee
Nankai University
Original Assignee
Nankai University
Priority date
2019-10-21
Filing date
2019-10-21
Publication date
2020-02-04
Application filed by Nankai University
Priority to CN201910997683.5A
Publication of CN110742691A
Current legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/301: Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

A motion control method for a flexible endoscope operation robot first acquires image information over a complete respiratory motion cycle of the human body. The edge contour of the organ tissue is accurately determined using medical image filtering and segmentation algorithms. The edge contour information is then used to determine how the organ displacement changes over time, and a displacement-time curve of the organ under the action of respiration is drawn. Once the organ displacement-time curve has been determined, the endoscope operation robot is motion-compensated with reference to the period and fluctuation amplitude of the curve. Because the invention uses real-time ultrasound images to track the movement of organs directly, the respiratory motion patterns of different human bodies, and of the same human body in different time periods, can be displayed directly; by registering the organ displacement-time curve in the respiratory state, obtained from the ultrasound images, with the lumen opening image captured by the endoscope, the displacement deviation between the endoscope and the human organ caused by respiration can be compensated.

Description

Motion control method for flexible endoscope operation robot
Technical Field
The invention belongs to the technical field of medical robots and relates to a method for compensating the motion of a medical robot under the action of respiratory motion after an endoscope has entered the human body.
Background
In recent years, minimally invasive surgical robots, represented by flexible endoscope robots, have played an important role in practical applications; they offer the advantages of little trauma to the human body, fast postoperative recovery, repeatable operation and the like.
Before the flexible endoscope enters a human body cavity, a corresponding three-dimensional model is built from acquired ultrasound images of the human body in order to determine the target position and plan a path; the endoscope operation robot, working with a navigation system, then guides the end manipulator to the target position.
At present, a static three-dimensional model is usually constructed from CT images, and the size and shape of the organ are assumed to remain fixed. In practice, however, the breathing of the human body drives the organ to move, so the target position changes regularly over time and a certain error exists between the preoperative plan and the actual situation. The organ movement caused by human respiration is mainly a displacement in the up-down direction.
Human respiratory motion is periodic, and this periodicity is even more evident under the action of a ventilator. At the same time, because of individual differences, the organ displacement and period caused by respiratory motion vary from person to person, and the respiratory motion curve of the same individual also changes at different times.
To address the displacement deviation between an organ and the endoscope end manipulator caused by breathing, the traditional approach is to construct a model of human respiratory motion and use it to help the endoscope operation robot apply displacement compensation to the end manipulator, so that the relative displacement between the manipulator and the organ remains basically unchanged. However, a traditional respiratory motion model built from ultrasound images can only reflect the general law of respiratory motion and cannot directly track an organ or a specific target, which makes it difficult to establish personalized respiratory motion models for different human bodies. Other traditional methods require additional sensors to collect displacement information, which may interfere with the robot's operation of the endoscope. At present there is no effective method for motion compensation of a robot operating an endoscope based on a respiratory motion model of a body organ.
Current ultrasound imaging technology adds the dimension of time to ordinary two-dimensional imaging and can capture thousands of dynamic images of human organs in a short time, so the acquired images completely record the motion pattern of the organs; complete image data can be obtained by scanning the human body only once, and the size, position and shape of an organ or of a specified target can be displayed clearly. Compared with CT and MRI, acquiring internal image information of the human body with real-time ultrasound causes little harm, which makes it possible to acquire image information of the human body in real time and to construct a corresponding respiratory motion model using real-time ultrasound imaging.
Disclosure of Invention
The aim of the invention is to provide a method that constructs a respiratory motion model of human organ tissue from real-time ultrasound images of the human body and then, under the control of the flexible endoscope operation robot, compensates the displacement deviation between the human organ and the manipulator at the end of the endoscope caused by respiration, with reference to the constructed respiratory motion model and the organ lumen image information acquired by the endoscope.
Technical scheme of the invention
A motion control method for a flexible endoscope operating robot, the steps of the method comprising:
1, acquiring an ultrasonic image;
To obtain a respiratory motion model of a human organ, image information over a complete respiratory motion cycle of the human body must be acquired first. The respiratory motion of the human body is closely related to the physical state and body position of the person, and acquiring ultrasound information in real time may interfere with operation of the endoscope; therefore the ultrasound image information of the organ is acquired while the breathing of the human body is stable, before the endoscope enters the body cavity. During image acquisition, the body position should be kept basically consistent with that used during the endoscope operation.
2, preprocessing the ultrasonic image;
Because of the limitations of the ultrasound imaging principle, the unprocessed ultrasound image contains considerable speckle noise, and it is difficult to obtain the edge contour of the target organ directly or to extract the target position directly. The invention therefore determines the edge contour of the tissue accurately by using medical image filtering and segmentation algorithms. Since real-time ultrasound provides many frames of image data in a short time, the segmentation result of the previous frame can be used as a reference for segmenting the next frame, which improves segmentation accuracy and shortens segmentation time.
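The following Python sketch illustrates the idea of step 2 only; it is not the patented algorithm. The choice of a median filter and an Otsu threshold, and all function and parameter names, are assumptions standing in for the unspecified medical image filtering and segmentation algorithms; the previous frame's mask, dilated, restricts the search region for the current frame.

```python
import cv2
import numpy as np

def segment_frame(frame_gray, prev_mask=None):
    """Return a binary mask of the tracked organ in one ultrasound frame."""
    # Speckle suppression: a median filter is a simple, common choice.
    denoised = cv2.medianBlur(frame_gray, 5)

    # Use the previous frame's segmentation, dilated, as a spatial prior
    # so the current frame is only searched near the last known contour.
    if prev_mask is not None:
        roi = cv2.dilate(prev_mask, np.ones((15, 15), np.uint8))
        denoised = cv2.bitwise_and(denoised, denoised, mask=roi)

    # Otsu threshold plus morphological cleanup as a stand-in segmentation.
    _, mask = cv2.threshold(denoised, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((9, 9), np.uint8))
    return mask
```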
3, extracting organ tissues or targets to be tracked from the image;
Once the edge contour of the organ tissue has been determined, the edge contour information is used to determine how the organ displacement changes over time. When the image contains the complete organ contour, the organ displacement information can be determined by calculating the center point or centroid of the organ. Because of ultrasound imaging quality problems, an image may not show the complete organ contour; in that case part of the organ edge contour is intercepted within a region of specified size, and the center point of the intercepted part is calculated to determine the organ displacement. Alternatively, tissue feature points such as an organ lumen opening can be extracted.
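As an illustration of step 3 only, the sketch below (assuming OpenCV 4 and the binary mask produced by the preprocessing sketch above; the region-of-interest fallback mirrors the partial-contour case) computes the tracking point whose vertical position is recorded over time.

```python
import cv2
import numpy as np

def organ_tracking_point(mask, roi=None):
    """Return the (x, y) point used to track organ displacement.

    If roi = (x0, y0, x1, y1) is given, only contour points inside that
    window are used, which corresponds to intercepting part of the edge
    contour when the full organ outline is not visible.
    """
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pts = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(float)

    if roi is not None:
        x0, y0, x1, y1 = roi
        inside = ((pts[:, 0] >= x0) & (pts[:, 0] <= x1) &
                  (pts[:, 1] >= y0) & (pts[:, 1] <= y1))
        if inside.any():
            pts = pts[inside]

    return pts.mean(axis=0)  # center of the (partial) contour points
```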
4, determining a displacement-time curve;
The position of the center point, or of the feature point on the contour line, is then recorded; the displacement of this point in the up-down direction is plotted in a displacement-time coordinate system in time order and connected into a continuous curve, giving the displacement-time curve of the organ under the action of respiration. Because image preprocessing introduces errors, the directly extracted displacement-time curve is not well suited for direct use in motion compensation of the endoscope operation robot, so the curve can be smoothed before compensation to facilitate the subsequent compensation operation.
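A minimal sketch of step 4, assuming the vertical positions from step 3 are sampled at a known frame rate and that at least a dozen samples are available; the Savitzky-Golay smoother and the peak-spacing period estimate are illustrative choices, not the method prescribed by the invention.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def displacement_curve(y_raw, fs):
    """Smooth the raw up-down displacement samples (sampled at fs frames
    per second) and estimate the respiratory period and fluctuation
    amplitude used later for motion compensation."""
    y = np.asarray(y_raw, dtype=float)

    # Smooth out jitter introduced by filtering and segmentation errors.
    y_smooth = savgol_filter(y, window_length=11, polyorder=3)

    # Period: mean spacing between successive peaks of the smoothed curve.
    peaks, _ = find_peaks(y_smooth, distance=max(1, int(0.5 * fs)))
    period = float(np.mean(np.diff(peaks)) / fs) if len(peaks) > 1 else None

    # Fluctuation amplitude: half of the peak-to-trough excursion.
    amplitude = 0.5 * float(y_smooth.max() - y_smooth.min())
    return y_smooth, period, amplitude
```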
5, compensating the motion of the endoscope operation robot;
Once the organ displacement-time curve has been determined, the endoscope operation robot is motion-compensated with reference to the period and fluctuation amplitude of the curve. The compensation is carried out together with the endoscope: the robot obtains the change in size of the organ lumen opening through the endoscope and matches it against the organ displacement and respiratory motion curve model under the action of respiration. As shown in fig. 2, from left to right the lumen opening appears small, moderate and large. When the opening appears small, the respiratory motion of the human body has moved the organ so that the distance between the organ lumen and the endoscope end device is larger; when the opening appears large, the respiratory motion has moved the organ so that this distance is smaller, and the robot then controls the endoscope to move away from the organ lumen according to the amplitude of the displacement-time curve, so that the relative position of the endoscope end device and the organ remains basically unchanged. Besides respiratory motion, external disturbances and other factors can also cause a displacement deviation between the organ lumen and the endoscope; the robot can control the displacement of the end manipulator by referring to the change in size of the organ lumen seen by the endoscope, so as to keep the relative position of the endoscope end device and the organ basically unchanged.
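For illustration only, the matching between the observed lumen opening size and the displacement-time curve can be reduced to a proportional mapping; area_ref (the opening area at the mid-breath position), gain and limit are assumed tuning parameters rather than quantities defined by the invention.

```python
def compensation_command(opening_area, area_ref, amplitude, gain=1.0,
                         limit=None):
    """Map the lumen opening size seen by the endoscope to a displacement
    command along the insertion axis. A larger-than-reference opening is
    read as the organ having come closer, so a positive command retracts
    the endoscope; a smaller opening yields a negative command that
    advances it, keeping the relative position roughly constant."""
    deviation = (opening_area - area_ref) / area_ref
    command = gain * amplitude * deviation
    if limit is not None:
        command = max(-limit, min(limit, command))  # saturate for safety
    return command
```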
The invention has the advantages and beneficial effects that:
The invention tracks the movement of organs or targets directly and more accurately through real-time ultrasound images; the respiratory motion patterns of different human bodies, and of the same human body in different time periods, can be displayed directly; and by registering the organ displacement-time curve in the respiratory state, obtained from the ultrasound images, with the lumen opening image captured by the endoscope, the displacement deviation between the endoscope and the human organ caused by respiration can be compensated.
Drawings
Fig. 1 is a flow chart of a method for acquiring organ displacement-time curves under the action of respiratory motion and compensating motion of an endoscope operation robot.
FIG. 2 is a schematic view of an organ lumen opening seen through the endoscope: 1, organ lumen opening; 2, 3 and 4, views of the lumen opening observed by the endoscope under different respiratory states.
Fig. 3 is a graph of displacement versus time in the up-down direction of a human organ.
Fig. 4 is a schematic diagram of an embodiment: 5, motion control system of the medical robot; 6, endoscope end device; 7, endoscope flexible tube; 8, organ lumen.
FIG. 5 is a schematic view of the operation robot controlling the downward movement of the endoscope.
Fig. 6 is a schematic view of the manipulation robot controlling the upward movement of the endoscope.
Detailed Description
Example 1:
the invention provides a motion control method for a flexible endoscope operation robot, which is used for eliminating displacement deviation between an organ and an endoscope end manipulator caused by respiratory motion after an endoscope enters a human body cavity.
[1] First, before the endoscope enters the human body cavity and after the various preparations have been completed, ultrasound images of the organ are collected while the breathing of the human body is stable.
[2] Filtering preprocessing is applied to the acquired ultrasound images.
[3] The contour of the target organ, or key feature points of the organ tissue, are extracted from the ultrasound images.
[4] The center points or feature point positions of the organ contour at different times are then recorded to determine the displacement-time curve of the human organ in the up-down direction, as shown in figure 3.
[5] As shown in fig. 4, after the endoscope enters the body cavity, it captures images of the organ lumen and the size of the lumen opening is determined; the displacement-time curve is then matched with the respiratory motion of the organ, and the robot controls the movement of the manipulator at the end of the endoscope. When the opening 8 observed through the endoscope becomes small, the organ has moved upward relative to the endoscope end device 6 under the action of respiration, and, as shown in fig. 5, the operation robot compensates with an upward displacement of the endoscope end device; when the opening 8 of the lumen is observed to become large, the organ has moved downward relative to the endoscope end device 6, and, as shown in fig. 6, the robot compensates with a downward displacement of the endoscope end device, thereby reducing the relative displacement between the organ and the endoscope end device under the action of respiration. Through the control action of the endoscope operation robot, the relative displacement between the endoscope end device and the human organ under the action of respiration can be basically eliminated.
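For illustration only, one compensation cycle of this embodiment can be sketched as follows; robot.move_tip_up and robot.move_tip_down are hypothetical commands standing in for the actual motion interface of the endoscope operation robot, and area_ref, step and deadband are assumed tuning parameters.

```python
def compensate_once(robot, opening_area, area_ref, step, deadband=0.05):
    """One compensation cycle following the embodiment above: a visibly
    smaller lumen opening is read as the organ having moved upward away
    from the end device, so the end device is stepped upward; a larger
    opening is read as downward organ motion, so the end device is
    stepped downward; within the deadband no compensation is applied."""
    deviation = (opening_area - area_ref) / area_ref
    if deviation < -deadband:      # opening looks small: organ moved up
        robot.move_tip_up(step)
    elif deviation > deadband:     # opening looks large: organ moved down
        robot.move_tip_down(step)
```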

Claims (3)

1. A motion control method for a flexible endoscope operating robot, the steps of the method comprising:
1, acquiring an ultrasonic image;
firstly, acquiring image information over a complete respiratory motion cycle of the human body; before the endoscope enters the human body cavity, acquiring ultrasonic image information while the breathing of the human body is stable, and keeping the body position during image acquisition basically consistent with that during the actual operation;
2, preprocessing the ultrasonic image;
because of the limitations of the ultrasonic imaging principle, the unprocessed ultrasonic image contains considerable speckle noise, and it is difficult to directly obtain the edge contour of a target organ or to directly extract the selected target position, so the edge contour of the tissue is accurately determined using medical image filtering and segmentation algorithms; because real-time ultrasound obtains many frames of image data in a short time, the segmentation result of the previous frame is used as a reference for segmenting the next frame during image segmentation, which improves segmentation accuracy and shortens segmentation time;
3, extracting organ tissues or targets to be tracked from the image;
extracting the edge contour of the selected organ from the image preprocessing result; when the image contains the complete organ contour, determining the organ displacement information by calculating the center point of the organ contour; because of ultrasonic imaging quality problems, the ultrasonic image may not show the complete organ contour, in which case the edge contour of part of the organ is intercepted within a region of specified size and the center point of the intercepted part is calculated to determine the organ displacement; or extracting tissue feature points;
4, determining an organ displacement-time curve;
then recording the position of the center point, or of the feature point on the organ contour line, marking the displacement of this point in the up-down direction in a displacement-time coordinate system in time order, and connecting the points into a continuous curve to obtain the overall displacement-time curve of the organ in the up-down direction; because image preprocessing introduces errors, the directly extracted displacement-time curve is not well suited for direct use in motion compensation by the robot operating the endoscope, so the displacement-time curve is smoothed before compensation to facilitate the motion compensation operation;
5, compensating the motion of the endoscope operation robot;
once the organ displacement-time curve has been determined, applying motion compensation to the robot operating the endoscope by combining the organ displacement-time curve information with the organ lumen image observed by the endoscope; before compensation, the organ displacement-time curve and the organ motion are registered using the observed size of the lumen opening.
2. The motion control method for a flexible endoscope operating robot according to claim 1, characterized in that the feature point in step 3 is an organ lumen opening.
3. The motion control method for a flexible endoscope operating robot according to claim 1, characterized in that the respiratory motion compensation process for the robot operating the endoscope in step 5 is as follows: once the organ respiratory motion curve has been determined from the ultrasonic images, the change in size of the organ lumen opening observed through the endoscope is matched to the organ displacement-time curve; when the lumen opening is observed to become smaller or larger, the organ lumen is considered to have moved away from or toward the endoscope end device, and the endoscope operation robot then adjusts the displacement of the endoscope so that the relative position between the endoscope end device and the organ remains basically unchanged.
CN201910997683.5A (priority date 2019-10-21, filing date 2019-10-21): Motion control method for flexible endoscope operation robot; published as CN110742691A (pending).

Priority Applications (1)

Application Number: CN201910997683.5A
Priority Date: 2019-10-21
Filing Date: 2019-10-21
Title: Motion control method for flexible endoscope operation robot


Publications (1)

Publication Number Publication Date
CN110742691A 2020-02-04

Family

ID=69278983

Family Applications (1)

Application Number: CN201910997683.5A (pending, published as CN110742691A)
Priority Date: 2019-10-21
Filing Date: 2019-10-21
Title: Motion control method for flexible endoscope operation robot

Country Status (1)

Country: CN
Publication: CN110742691A


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111658149A (en) * 2020-06-19 2020-09-15 张学斌 Percutaneous nephroscope operation robot execution system and executor
CN111658149B (en) * 2020-06-19 2021-06-01 张学斌 Percutaneous nephroscope operation robot execution system and executor
CN111887988A (en) * 2020-07-06 2020-11-06 罗雄彪 Positioning method and device of minimally invasive interventional operation navigation robot
CN112998853A (en) * 2021-02-25 2021-06-22 四川大学华西医院 2D modeling method, 3D modeling method and detection system for abdominal vascular dynamic angiography
CN114220060A (en) * 2021-12-24 2022-03-22 萱闱(北京)生物科技有限公司 Instrument marking method, device, medium and computing equipment based on artificial intelligence


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200204