CN116076995B - Scanning control method and scanning control system for capsule endoscope - Google Patents


Info

Publication number
CN116076995B
CN116076995B (application CN202310100128.4A)
Authority
CN
China
Prior art keywords
moment
capsule endoscope
point cloud
pose
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310100128.4A
Other languages
Chinese (zh)
Other versions
CN116076995A (en)
Inventor
胡峰
李鹏
张澍田
马婷
孙虎成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shitong Robot Technology Co ltd
Original Assignee
Zhejiang Shitong Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shitong Robot Technology Co ltd filed Critical Zhejiang Shitong Robot Technology Co ltd
Priority to CN202310100128.4A priority Critical patent/CN116076995B/en
Publication of CN116076995A publication Critical patent/CN116076995A/en
Application granted granted Critical
Publication of CN116076995B publication Critical patent/CN116076995B/en
Current legal status: Active

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00004: Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00006: Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B 1/0002: Operational features of endoscopes provided with data storages
    • A61B 1/04: Instruments combined with photographic or television appliances
    • A61B 1/041: Capsule endoscopes for imaging
    • A61B 1/045: Control thereof
    • A61B 1/273: Instruments for the upper alimentary canal, e.g. oesophagoscopes, gastroscopes
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Abstract

The application discloses a scanning control method and a scanning control system for a capsule endoscope. The scanning control method comprises the following steps: calculating a first pose transformation matrix of the capsule endoscope from the control signals at two adjacent moments, namely the current-moment control signal and the previous-moment control signal; calculating a second pose transformation matrix of the capsule endoscope from the current-moment image and the previous-moment image captured by the capsule endoscope at the same two adjacent moments; calculating the current-moment pose parameters of the capsule endoscope from the first pose transformation matrix, the second pose transformation matrix, the current-moment image and the previous-moment image; updating the current-moment point cloud grid according to the current-moment image and the current-moment pose parameters; planning the motion direction and motion distance of the capsule endoscope for the next moment from the current-moment point cloud grid; and updating the next-moment control signal based on that motion direction and motion distance. The method realizes automatic scanning and greatly improves inspection efficiency.

Description

Scanning control method and scanning control system for capsule endoscope
Technical Field
The application belongs to the technical field of medical equipment imaging, and particularly relates to a scanning control method and a scanning control system for a capsule endoscope.
Background
A capsule endoscope is a medical instrument that integrates core functions such as image acquisition and wireless transmission into a capsule that can be swallowed by a human body. During an examination, the capsule endoscope is swallowed, acquires images of the digestive tract inside the body, and transmits them synchronously to the outside, so that the medical examination can be performed on the acquired image data.
In the stomach, the capsule endoscope is pulled by an external magnetic field to achieve active movement and photographs the stomach wall as it moves. In the traditional magnetic control mode, a mechanical control arm carrying a strong magnet is erected outside the body, and an operator steers the arm while watching the images transmitted from the capsule endoscope, so as to inspect the inner wall of the stomach. The whole process is time-consuming, the quality of the examination depends heavily on the operator's experience, and because the field of view of the capsule endoscope is small, the operator can hardly judge the real-time position of the capsule from its images, so missed diagnoses easily occur.
Disclosure of Invention
The technical problem solved by the application is how to control the scanning process of the capsule endoscope conveniently and effectively.
The application discloses a scanning control method for a capsule endoscope, which comprises the following steps:
calculating a first pose transformation matrix of the capsule endoscope according to the control signals at two adjacent moments, namely a current-moment control signal and a previous-moment control signal;
calculating a second pose transformation matrix of the capsule endoscope according to a current-moment image and a previous-moment image captured by the capsule endoscope at the two adjacent moments;
calculating current-moment pose parameters of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current-moment image and the previous-moment image;
updating a current-moment point cloud grid according to the current-moment image and the current-moment pose parameters;
planning a motion direction and a motion distance of the capsule endoscope for the next moment according to the current-moment point cloud grid;
and updating a next-moment control signal based on the motion direction and motion distance for the next moment, wherein the next-moment control signal is used for making an external control device that pulls the capsule endoscope generate a pulling action.
Preferably, the method for calculating the first pose transformation matrix of the capsule endoscope according to the current-moment control signal and the previous-moment control signal of the adjacent moments comprises the following steps:
calculating, according to the current-moment control signal, a current-moment transformation matrix of a control coordinate system of the capsule endoscope relative to a world coordinate system, and calculating, according to the previous-moment control signal, a previous-moment transformation matrix of the control coordinate system relative to the world coordinate system;
and calculating the first pose transformation matrix of the capsule endoscope according to the current-moment transformation matrix and the previous-moment transformation matrix.
Preferably, the method for calculating the second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at the adjacent moments comprises the following steps:
performing feature point matching on the current-moment image and the previous-moment image to obtain a plurality of pairs of matching feature points;
constructing an epipolar constraint equation set according to pixel coordinates corresponding to the pairs of matching feature points;
solving the epipolar constraint equation set by a least-squares method;
constructing an essential matrix according to the solution of the epipolar constraint equation set;
obtaining a plurality of estimated values of a rotation matrix and a translation vector from the essential matrix;
and verifying each estimated value with the matched feature points to obtain final values of the rotation matrix and the translation vector, wherein the rotation matrix and the translation vector form the second pose transformation matrix.
Preferably, the method for calculating the current time pose parameter of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current time image and the previous time image comprises the following steps:
weighting the first pose transformation matrix and the second pose transformation matrix, and constructing to obtain a pose transformation fusion matrix;
constructing an objective function of the pose parameter at the current moment according to the pose transformation fusion matrix and the pairs of matching feature points;
and optimizing an objective function by adopting an error minimization algorithm to obtain the pose parameter of the capsule endoscope at the current moment.
Preferably, the method for updating the current moment point cloud grid according to the current moment image and the current moment pose parameter comprises the following steps:
generating a current moment sparse point cloud according to a plurality of feature matching points of a current moment image, and calculating coordinates of the current moment sparse point cloud in a world coordinate system according to the current moment pose parameters;
and updating the pre-constructed point cloud grid at the previous moment according to the space point cloud at the current moment to obtain the point cloud grid at the current moment.
Preferably, the method for obtaining the motion direction and the motion distance of the capsule endoscope at the next moment according to the current moment point cloud grid planning comprises the following steps:
searching a boundary in the point cloud grid at the current moment;
calculating a perpendicular vector of the boundary, wherein the perpendicular vector points out of the point cloud grid at the current moment;
and taking the direction of the perpendicular vector as the motion direction for the next moment, and determining the motion distance for the next moment according to the distance between the two endpoints of the boundary.
Preferably, the scanning control method further includes: after the external control device controls the capsule endoscope to complete scanning, performing pixel filling processing on the final-moment point cloud grid obtained by construction:
selecting a point cloud triangular grid in the point cloud grids at the final moment;
filling pixels of each point in the point cloud triangular mesh according to pixel values of the matched characteristic points corresponding to the three vertexes of the point cloud triangular mesh;
and traversing each point cloud triangular grid of the point cloud grids at the final moment until all the point cloud grids are subjected to pixel filling.
The application also discloses a scanning control system for the capsule endoscope, which comprises:
the pose estimation module is used for calculating to obtain a first pose transformation matrix of the capsule endoscope according to the current moment control signal and the last moment control signal of the adjacent moment;
the pose measurement module is used for calculating a second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at two adjacent moments;
the pose fusion module is used for calculating and obtaining the pose parameters of the capsule endoscope at the current moment according to the first pose transformation matrix, the second pose transformation matrix, the current moment image and the last moment image;
the point cloud grid construction module is used for updating and obtaining a point cloud grid at the current moment according to the current moment image and the current moment pose parameter;
the path planning module is used for planning according to the point cloud grid at the current moment to obtain the movement direction and the movement distance of the capsule endoscope at the next moment;
the control signal generation module is used for updating and obtaining a next-moment control signal based on the movement direction and the movement distance of the next moment, and the next-moment control signal is used for enabling an external control device for pulling the capsule endoscope to generate pulling action.
The application also discloses a computer readable storage medium storing a scanning control program for the capsule endoscope, which when executed by a processor, implements the scanning control method for the capsule endoscope.
The application also discloses a computer device, which comprises a computer readable storage medium, a processor and a scanning control program for the capsule endoscope, wherein the scanning control program for the capsule endoscope is stored in the computer readable storage medium, and the scanning control method for the capsule endoscope is realized when the scanning control program for the capsule endoscope is executed by the processor.
(III) beneficial effects
Compared with the prior art, the scanning control method for the capsule endoscope has the following technical effects:
the path planning is carried out through the control signals at each moment and the images shot in real time, so that the inspection of the capsule endoscope gets rid of manual operation, automatic scanning is realized, and the inspection efficiency is greatly improved. Meanwhile, the panoramic image of the digestive tract of the examinee generated by panoramic reconstruction greatly saves the examination time of doctors, and the three-dimensional positioning function of the panoramic image can assist the doctors to judge the focus positions, so that the subsequent diagnosis and treatment are convenient.
Drawings
FIG. 1 is a flow chart of a scan control method for a capsule endoscope according to a first embodiment of the present application;
FIG. 2 is a schematic view of an extracorporeal control apparatus according to a first embodiment of the present application;
FIG. 3 is a control coordinate system diagram according to a first embodiment of the present application;
FIG. 4 is a partial schematic view of a point cloud grid according to a first embodiment of the present application;
FIG. 5 is a schematic diagram illustrating pixel point filling in a triangle mesh of a point cloud according to a first embodiment of the present application;
FIG. 6 is a functional block diagram of a scan control system for a capsule endoscope according to a second embodiment of the present application;
fig. 7 is a schematic diagram of a computer device according to a fourth embodiment of the present application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments, in order to make its objects, technical solutions and advantages clearer. It should be understood that the specific embodiments described here are for illustration only and are not intended to limit the scope of the application.
Before describing the embodiments in detail, the technical idea of the application is briefly summarized. In the prior art, controlling the movement of the capsule endoscope usually requires an operator to watch the captured images in real time and adjust the external control device based on personal experience to complete the examination; the whole process is laborious and prone to missed diagnosis. The scanning control method provided by the application therefore estimates one set of pose transformation parameters of the capsule endoscope from the control signals at adjacent moments, computes another set from the images at the same adjacent moments, and fuses the two to obtain more accurate pose parameters. A point cloud grid is then built in real time, path planning on this grid yields the motion direction and motion distance of the capsule endoscope for the next moment, and the next-moment control signal is regenerated so that the external control device produces a pulling action that drags the capsule endoscope to continue scanning. The whole scanning process is highly automated, requires little operator intervention, and can achieve complete scanning of the digestive tract.
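To make the overall control flow easier to follow, the following minimal Python skeleton arranges steps S10 to S60 into one closed loop. All object interfaces (device, capsule, pose_estimator, mesh, planner) are hypothetical placeholders introduced only for illustration; they are not defined by the patent.

```python
def scan_control_loop(device, capsule, pose_estimator, mesh, planner):
    """Skeleton of the closed-loop scanning cycle described above. The five
    collaborators are hypothetical interfaces used only to show the data flow."""
    u_prev, img_prev = device.current_signal(), capsule.capture()
    while not mesh.is_closed():
        u_cur, img_cur = device.current_signal(), capsule.capture()
        dT1 = pose_estimator.from_signals(u_prev, u_cur)          # step S10
        dT2 = pose_estimator.from_images(img_prev, img_cur)       # step S20
        pose = pose_estimator.fuse(dT1, dT2, img_prev, img_cur)   # step S30
        mesh.update(img_cur, pose)                                # step S40
        direction, distance = planner.next_move(mesh)             # step S50
        device.apply(direction, distance)                         # step S60
        u_prev, img_prev = u_cur, img_cur
    return mesh
```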
Specifically, as shown in fig. 1, the scanning control method for a capsule endoscope of the first embodiment includes the steps of:
Step S10: calculating a first pose transformation matrix of the capsule endoscope according to the current-moment control signal and the previous-moment control signal of two adjacent moments;
Step S20: calculating a second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at the two adjacent moments;
Step S30: calculating current-moment pose parameters of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current-moment image and the previous-moment image;
Step S40: updating the current-moment point cloud grid according to the current-moment image and the current-moment pose parameters;
Step S50: planning the motion direction and motion distance of the capsule endoscope for the next moment according to the current-moment point cloud grid;
Step S60: updating the next-moment control signal based on the motion direction and motion distance for the next moment, wherein the next-moment control signal is used for making the external control device that pulls the capsule endoscope generate a pulling action.
Before describing the above steps, the structure of the external control device and the related coordinate systems are described.
As shown in fig. 2, the extracorporeal control apparatus 10 includes a first degree-of-freedom carrier 101b, a second degree-of-freedom carrier 102b, a third degree-of-freedom carrier 103b, a guide rail 106, and a magnetic field generating device fixed to the third degree-of-freedom carrier 103 b. Wherein, the first degree of freedom carrier 101b can carry the second and third degree of freedom carriers to rotate around the first rotation shaft 101 a; the second degree of freedom carrier 102b may rotate about the second axis of rotation 102a with the third degree of freedom carrier; the third degree of freedom carrier 103b may carry the magnetic field generating device to rotate around the third rotation axis 103a, and the first degree of freedom carrier may also carry the second and third degree of freedom carriers to integrally move along the guide rail 106. Wherein the magnetic field generating means is constituted by two magnetic poles 104a and 104b fixedly arranged at symmetrical positions of the third degree of freedom carrier 103 b.
As shown in fig. 3, a control coordinate system $O_C$-$X_C$-$Y_C$-$Z_C$ bound to the third degree-of-freedom carrier 103b is defined. In the initial state, the three axes of the control coordinate system coincide with the first, second and third rotation shafts, and the positive rotation direction about each axis follows the right-hand rule (with the right thumb pointing along the positive direction of the rotation axis, the curled fingers indicate the positive rotation direction). Note that the control coordinate system coincides with the first, second and third rotation shafts only in the initial state; during operation it remains bound to the third degree-of-freedom carrier.
In the initial state, the control coordinate system coincides with the world coordinate system $O_W$-$X_W$-$Y_W$-$Z_W$, and the world coordinate system is fixed. The external control device 10 pulls and steers, through the magnetic poles 104a and 104b, a capsule endoscope 105 carrying magnetic poles inside the human body. Because opposing poles with strong magnetic fields are used, the capsule endoscope 105 is constrained by the magnetic field lines between the poles and restrained by the stomach wall, so that it reaches force balance. In the initial state, the capsule endoscope 105 is closer to pole 104a.
The external control device has four control degrees of freedom: a rotation angle $\theta$ about the first rotation axis, a rotation angle $\psi$ about the second rotation axis, a rotation angle $\phi$ about the third rotation axis, and a movement distance $l$ along the guide rail. The control signal at the current moment k can therefore be written as $u_k = (\theta_k, \psi_k, \phi_k, l_k)$. Because of the strong magnetic field between poles 104a and 104b, the capsule endoscope inside the body tends to stay on the line between 104a and 104b. Hence, relative to the control coordinate system $O_C$-$X_C$-$Y_C$-$Z_C$, the position of the capsule endoscope at the current moment k is a fixed point on that line, denoted $^{C}p$.
Define the axial unit vector of the capsule endoscope in the control coordinate system as $^{C}v$.
This unit vector is also the viewing direction in which the capsule endoscope captures its images.
In step S10, the current-moment transformation matrix of the control coordinate system of the capsule endoscope relative to the world coordinate system is calculated from the current-moment control signal. Specifically, the pose of the control coordinate system relative to the world coordinate system at the current moment can be estimated from the current-moment control signal $u_k = (\theta_k, \psi_k, \phi_k, l_k)$ as a rotation matrix $R_k$ and a translation vector $t_k$,
where $R_k$ is the rotation matrix of the control coordinate system relative to the world coordinate system computed from the current-moment control signal, and $t_k$ is the translation vector of the control coordinate system relative to the world coordinate system computed from the current-moment control signal. With $R_x(\theta_k)$ denoting the rotation matrix for a rotation by $\theta_k$ about the $X_C$ axis, $R_y(\psi_k)$ the rotation by $\psi_k$ about the $Y_C$ axis, and $R_z(\phi_k)$ the rotation by $\phi_k$ about the $Z_C$ axis, the rotation is composed as $R_k = R_x(\theta_k)\,R_y(\psi_k)\,R_z(\phi_k)$, and the translation $t_k$ is determined by the movement distance $l_k$ along the guide rail.
The current-moment transformation matrix of the control coordinate system relative to the world coordinate system, calculated from the current-moment control signal, can then be expressed in homogeneous form as
$$T_k=\begin{bmatrix} R_k & t_k\\ 0 & 1\end{bmatrix}.$$
Further, at the current moment, the position of the capsule endoscope in the world coordinate system can be expressed as $^{W}p_k = R_k\,{}^{C}p + t_k$,
and the capsule viewing direction, i.e. the unit vector of the positive viewing direction of image acquisition, can be expressed as $^{W}v_k = R_k\,{}^{C}v$.
Similarly, in step S10, the previous-moment transformation matrix of the control coordinate system relative to the world coordinate system is calculated from the previous-moment control signal $u_{k-1}$:
$$T_{k-1}=\begin{bmatrix} R_{k-1} & t_{k-1}\\ 0 & 1\end{bmatrix}.$$
Further, in step S10, the first pose transformation matrix of the capsule endoscope is calculated from the current-moment transformation matrix and the previous-moment transformation matrix:
$$\Delta T_k^{(1)} = T_k\,T_{k-1}^{-1}.$$
The first pose transformation matrix $\Delta T_k^{(1)}$ represents the pose change of the capsule endoscope from the previous moment k-1 to the current moment k, expressed with respect to the world coordinate system.
In step S20, the method for calculating the second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at adjacent moments comprises the following steps:
Step S201: perform feature point matching between the current-moment image and the previous-moment image to obtain a plurality of pairs of matching feature points. Specifically, feature points of the previous-moment image $I_{k-1}$ and the current-moment image $I_k$ are matched; SIFT, SURF or ORB can be used as the matching algorithm, and RANSAC is used to screen the obtained matches. This finally yields P pairs of matching feature points $\{p_{i,k-1}\}$, $1 \le i \le P$, and $\{p_{i,k}\}$, $1 \le i \le P$, belonging to the previous-moment image $I_{k-1}$ and the current-moment image $I_k$ respectively, each represented by its image pixel coordinates.
step S202: constructing a epipolar constraint equation set according to pixel coordinates corresponding to the plurality of pairs of matching feature points:
e=[e 1 e 2 e 3 e 4 e 5 e 6 e 7 e 8 e 9 ] T
wherein e n N is more than or equal to 1 and less than or equal to 9, and n is a real number.
Step S203: and solving the epipolar constraint equation set by adopting a least square method. I.e. solving e using least squares
Step S204: constructing an essential matrix according to the solution of the epipolar constraint equation set:
step S205: obtaining a plurality of estimated values of a rotation matrix and a translation vector by using the essence matrix:
t=±UR a (±90 O )ΣU T
wherein U is a left singular matrix of E, U T For the right singular matrix of E, Σ represents the feature matrix of the essential matrix E, R a (gamma) represents a rotation matrix rotated by an angle gamma along the a-axis.
Step S206: and verifying each estimated value by using the matched feature points to obtain final values of the rotation matrix and the translation vector, wherein the rotation matrix and the translation vector form a second pose transformation matrix.
Screening a rotation matrix and a translation matrix, and selecting a matched characteristic point pair p i,n-1 And p i,n Different R, t is brought into the following calculation:
λ i =(Rp i,k-1 +t)/p i,k
if the obtained matrix lambda i All elements are positive, the selected R, t is retained as the current instant image I k Image I at the previous time k-1 Frame rotation matrix and translation matrix of (a)Then I k Relative I k-1 The second pose transformation matrix of the image, namely the current moment k relative to the pose of the capsule endoscope at the last moment k-1 is:
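The factorisation in step S205 is the standard SVD decomposition of an essential matrix into four (R, t) candidates, and the positivity test below mirrors the simplified screening of step S206; both functions are illustrative sketches rather than the patent's exact procedure.

```python
import numpy as np

def decompose_essential(E):
    """Four candidate (R, t) pairs from the SVD of the essential matrix."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:
        U = -U                                # keep proper rotations
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])          # R_z(+90 degrees)
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]                               # translation direction, up to scale
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

def positivity_fraction(R, t, x1, x2):
    """Fraction of matches for which every element of (R x1 + t) / x2 is
    positive, i.e. the simplified screening test of step S206; x1, x2 are
    3xP normalised homogeneous coordinates of the matched points."""
    lam = (R @ x1 + t[:, None]) / x2
    return np.mean(np.all(lam > 0, axis=0))
```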
it should be noted that, the steps S10 and S20 may be exchanged, and the sequence of the two is not limited in the first embodiment.
In step S30, the method for calculating the current-moment pose parameters of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current-moment image and the previous-moment image comprises the following steps:
Step S301: weight the first pose transformation matrix and the second pose transformation matrix to construct the pose transformation fusion matrix $\Delta T_k$: the rotation components of the two matrices are combined with weight $\lambda_{k,r}$ and the translation components with weight $\lambda_{k,t}$,
where $0 \le \lambda_{k,r}, \lambda_{k,t} \le 1$.
Step S302: construct the objective function of the current-moment pose parameters from the pose transformation fusion matrix and the pairs of matching feature points $\{p_{i,k-1}\}$, $1 \le i \le P$, and $\{p_{i,k}\}$, $1 \le i \le P$.
Step S303: optimize the objective function with an error-minimization algorithm to obtain the current-moment pose parameters of the capsule endoscope.
The optimized parameter $\lambda_k$ is a confidence value, i.e. the weight assigned to the first and second pose transformation matrices in the weighting process; the error is the reprojection error of the matched feature points between the previous-moment image $I_{k-1}$ and the current-moment image $I_k$, and K in the objective function denotes the intrinsic parameters of the camera in the capsule endoscope.
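The exact blending formula of step S301 is not reproduced in the text above, so the sketch below shows one plausible reading: spherical linear interpolation between the two rotations and a linear blend of the two translations, with lam_r and lam_t playing the roles of the weights for the rotation and translation components.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def fuse_transforms(dT1, dT2, lam_r, lam_t):
    """Blend two 4x4 relative transforms. lam_r and lam_t (both in [0, 1])
    are the weights given to dT1's rotation and translation respectively."""
    rots = Rotation.from_matrix(np.stack([dT2[:3, :3], dT1[:3, :3]]))
    R = Slerp([0.0, 1.0], rots)([lam_r]).as_matrix()[0]   # slerp from dT2 towards dT1
    t = lam_t * dT1[:3, 3] + (1.0 - lam_t) * dT2[:3, 3]   # linear blend of translations
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```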
In step S40, the method for updating the current-moment point cloud grid according to the current-moment image $I_k$ and the current-moment pose parameters comprises the following steps:
Step S401: generate the current-moment sparse point cloud from the feature matching points of the current-moment image, and calculate the coordinates of the current-moment sparse point cloud in the world coordinate system according to the current-moment pose parameters.
At the current moment k, the P feature points $\{p_{i,k}\}$, $1 \le i \le P$, identified in the current-moment image $I_k$ are used to generate the current-moment sparse point cloud $\{^{W}p_{i,k}\}$, $1 \le i \le P$, whose coordinates in the world coordinate system are computed from the current-moment pose parameters.
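The text does not spell out how depth is recovered for the sparse points; a common choice, assumed in the sketch below, is to triangulate each matched pair from the two views (here with OpenCV's triangulatePoints) and then map the result into the world frame using the previous frame's world pose.

```python
import numpy as np
import cv2

def sparse_world_points(pts_prev, pts_cur, K, R, t, T_w_prev):
    """Triangulate matched pixels from frames k-1 and k and express the
    resulting sparse points in the world frame. (R, t) is the relative motion
    of frame k w.r.t. frame k-1; T_w_prev is the 4x4 world pose of frame k-1."""
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])        # projection of frame k-1
    P2 = K @ np.hstack([R, t.reshape(3, 1)])                 # projection of frame k
    Xh = cv2.triangulatePoints(P1, P2, pts_prev.T.astype(float), pts_cur.T.astype(float))
    X = (Xh[:3] / Xh[3]).T                                   # points in frame k-1 coordinates
    Xw = (T_w_prev @ np.hstack([X, np.ones((len(X), 1))]).T)[:3].T
    return Xw                                                # (P, 3) world coordinates
```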
step S402: and updating the pre-constructed point cloud grid at the previous moment according to the space point cloud at the current moment to obtain the point cloud grid at the current moment.
At the previous moment k-1, the point cloud grid M at the previous moment is constructed k-1 ,(M 0 =null). Based on M k-1 Current time space point cloud { generated at current time k W p i,k Establishing sparse point cloud grid M at k moment when i is more than or equal to 1 and less than or equal to P k
In { overs (r) W p i,k Selecting point in P with i being more than or equal to 1 W p i,k Search points W p i,k The triangle m where it is located. Starting from triangle m, searching for the adjacent triangle of the triangle, and performing empty circle detection. Finding the inclusion point of the circumscribing circle W p i,k And delete the triangles to form a packagePolygonal cavity containing P. Then connect W p i,k And (3) with W p i,k Forming a new triangular grid at each vertex of the cavity, merging the new triangular grid into a coefficient point cloud grid and updating M k Repeating the steps until the traverse point set { W p i,k I is more than or equal to 1 and less than or equal to P, and the point cloud grid M at the current moment is obtained k
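The incremental insertion just described is essentially the Bowyer-Watson procedure, whose core predicate is the empty-circumcircle test. A minimal 2D version is sketched below; applying it to the patent's surface mesh would require a local 2D parameterisation, which is an assumption here.

```python
import numpy as np

def in_circumcircle(a, b, c, p):
    """True if point p lies strictly inside the circumcircle of the
    counter-clockwise triangle (a, b, c); all points are 2D."""
    m = np.array([
        [a[0] - p[0], a[1] - p[1], (a[0] - p[0])**2 + (a[1] - p[1])**2],
        [b[0] - p[0], b[1] - p[1], (b[0] - p[0])**2 + (b[1] - p[1])**2],
        [c[0] - p[0], c[1] - p[1], (c[0] - p[0])**2 + (c[1] - p[1])**2],
    ])
    return np.linalg.det(m) > 0.0
```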
In step S50, the method for planning the motion direction and motion distance of the capsule endoscope for the next moment according to the current-moment point cloud grid $M_k$ comprises the following steps:
Step S501: search for boundaries in the point cloud grid at the current moment.
As shown in fig. 4, for a point $p_a$ used to construct the current-moment point cloud grid $M_k$, find its B neighbouring points $p_b$, $1 \le b \le B$, and the line segments $S_{a,b}$, $1 \le b \le B$, formed by point $p_a$ and point $p_b$. For each segment $S_{a,b}$, search the triangles to which it belongs and count their number $N_{a,b}$. If $N_{a,b} < 2$, the segment $S_{a,b}$ is taken as a boundary.
Specifically, after part of the point cloud grid has been built, it is necessary to determine whether the grid already forms a closed surface (for example, in stomach modelling, whether the whole stomach has been scanned). This is done by examining edges: when an edge belongs to only 0 or 1 triangle, the edge lies on the boundary of the surface formed by the current point cloud grid, and the surface is not yet closed.
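The boundary search of step S501 reduces to counting, for every edge, how many triangles contain it. A minimal sketch:

```python
from collections import defaultdict

def boundary_edges(triangles):
    """triangles: iterable of (i, j, k) vertex-index triples of the mesh.
    Returns the edges that belong to fewer than two triangles (step S501)."""
    count = defaultdict(int)
    for i, j, k in triangles:
        for edge in ((i, j), (j, k), (k, i)):
            count[tuple(sorted(edge))] += 1
    return [edge for edge, n in count.items() if n < 2]
```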
Step S502: and calculating to obtain a vertical vector of the boundary, wherein the vertical vector points out of the point cloud grid at the current moment. Calculating vectors perpendicular to the boundaryPointing to the external direction of the point cloud grid at the current moment, the vector is generated by a solution of the following formula:
step S503: and taking the direction of the vertical vector as the motion direction of the next moment, and determining the motion distance of the next moment according to the distance between two short points of the boundary.
Wherein the motion direction at the next moment isThe movement distance at the next moment is set to +.>
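The closed-form expression for the outward perpendicular vector is not reproduced above; one geometrically equivalent construction, assumed in the sketch below, crosses the boundary edge with the normal of its single adjacent triangle and orients the result away from that triangle's third vertex.

```python
import numpy as np

def outward_boundary_vector(pa, pb, pc):
    """pa, pb: 3D endpoints of a boundary edge; pc: the third vertex of the
    single triangle containing that edge. Returns a unit vector that is
    perpendicular to the edge, lies in the triangle plane, and points away
    from the existing mesh."""
    edge = pb - pa
    normal = np.cross(edge, pc - pa)   # triangle normal
    v = np.cross(edge, normal)         # perpendicular to the edge, in the triangle plane
    if np.dot(v, pc - pa) > 0:         # flip so it points away from the third vertex
        v = -v
    return v / np.linalg.norm(v)
```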
Finally, in step S60, the next-moment control signal is updated according to this motion direction and motion distance; the next-moment control signal makes the external control device that pulls the capsule endoscope generate a pulling action, so that at the next moment the capsule endoscope moves along the planned direction for the planned distance.
Further, the scanning control method also comprises performing pixel filling on the final-moment point cloud grid after the external control device has controlled the capsule endoscope to complete the scan.
Specifically, as shown in fig. 5, a point cloud triangular grid is selected, and its three vertices are denoted $^{W}p_{i,l}$, $^{W}p_{j,m}$ and $^{W}p_{k,n}$. The second subscript is the serial number of the image from which the vertex originates, and the first subscript is the serial number of the corresponding feature point within that image. For example, $^{W}p_{i,l}$ indicates that the vertex belongs to image $I_l$ and corresponds to the i-th feature point $p_{i,l}$ of that image.
For a spatial point $^{W}p_t$ inside the selected triangular grid, first calculate the coordinates of its reverse-mapping (back-projection) points $p_{t,l}$, $p_{t,m}$, $p_{t,n}$ in the images to which the three vertices belong.
Then each point in the point cloud triangular grid is filled with a pixel value computed from the pixel values of the matched feature points corresponding to the three vertices of the triangular grid: the pixel value of $^{W}p_t$ is a weighted combination of $I(p_{t,s})$, $s = l, m, n$,
where $I(p_{t,s})$ is the pixel value of image $I_s$ at the point $p_{t,s}$, $s = l, m, n$, and $\lVert x \rVert$ denotes the modulus of a vector x.
Further, each point cloud triangular grid of the final-moment point cloud grid is traversed in this way until all triangular grids have been pixel-filled, and the panoramic image is finally obtained.
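The weighting formula for the pixel values is not reproduced above; the sketch below assumes an inverse-distance weighting of the three source images' pixel values at the back-projected points, with project() standing in for the reverse mapping. Both the weighting and the pinhole projection used here are assumptions of this sketch.

```python
import numpy as np

def project(K, T_cw, Xw):
    """Project a world point Xw into an image with intrinsics K and
    world-to-camera transform T_cw; returns pixel coordinates (u, v)."""
    Xc = (T_cw @ np.append(Xw, 1.0))[:3]
    uvw = K @ Xc
    return uvw[:2] / uvw[2]

def fill_pixel(Xw, vertices_w, images, cams):
    """Colour one point Xw inside a point cloud triangle. vertices_w holds the
    triangle's three 3D vertices; images[s] is the RGB source image I_s (H, W, 3)
    of vertex s; cams[s] = (K, T_cw) of that image. Inverse-distance weighting
    of the three back-projected pixel values is an assumption of this sketch."""
    weights, colours = [], []
    for Vw, img, (K, T_cw) in zip(vertices_w, images, cams):
        u, v = project(K, T_cw, Xw)                   # back-projection point p_{t,s}
        colour = img[int(round(v)), int(round(u))]    # pixel value I_s(p_{t,s})
        weights.append(1.0 / (np.linalg.norm(Xw - Vw) + 1e-9))  # nearer vertex, larger weight
        colours.append(np.asarray(colour, dtype=float))
    weights = np.array(weights) / np.sum(weights)
    return (weights[:, None] * np.stack(colours)).sum(axis=0)   # blended pixel value
```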
The scanning control method for the capsule endoscope disclosed in this embodiment frees the capsule endoscope examination from manual operation, realizes automatic scanning, and greatly improves inspection efficiency. At the same time, the panoramic image of the examinee's digestive tract generated by panoramic reconstruction greatly reduces the physician's reading time, and its three-dimensional positioning capability can help the physician locate lesions, facilitating subsequent diagnosis and treatment.
Further, as shown in fig. 6, the second embodiment also discloses a scanning control system for a capsule endoscope. The scanning control system comprises a pose estimation module 100, a pose measurement module 200, a pose fusion module 300, a point cloud grid construction module 400, a path planning module 500 and a control signal generation module 600. The pose estimation module 100 is configured to calculate the first pose transformation matrix of the capsule endoscope according to the current-moment control signal and the previous-moment control signal of two adjacent moments; the pose measurement module 200 is configured to calculate the second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at the two adjacent moments; the pose fusion module 300 is configured to calculate the current-moment pose parameters of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current-moment image and the previous-moment image; the point cloud grid construction module 400 is configured to update the current-moment point cloud grid according to the current-moment image and the current-moment pose parameters; the path planning module 500 is configured to plan the motion direction and motion distance of the capsule endoscope for the next moment according to the current-moment point cloud grid; and the control signal generation module 600 is configured to update the next-moment control signal based on the motion direction and motion distance for the next moment, the next-moment control signal being used for making the external control device that pulls the capsule endoscope generate a pulling action.
The more detailed operation of each module of the scan control system may refer to the related description of the first embodiment, and will not be described herein.
Further, the third embodiment also discloses a computer-readable storage medium storing a scan control program for a capsule endoscope, which when executed by a processor implements the scan control method for a capsule endoscope of the first embodiment.
The fourth embodiment also discloses a computer device. At the hardware level, as shown in fig. 7, it comprises a processor 12, an internal bus 13, a network interface 14 and a computer-readable storage medium 11. The processor 12 reads the corresponding computer program from the computer-readable storage medium and runs it, forming the processing apparatus at the logical level. Of course, besides a software implementation, one or more embodiments of this specification do not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the processing flow is not limited to logic units and may also be hardware or logic devices. The computer-readable storage medium 11 stores a scanning control program for a capsule endoscope which, when executed by a processor, implements the scanning control method for a capsule endoscope described above.
Computer-readable storage media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage, quantum memory, graphene-based storage media or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
While certain embodiments have been shown and described, it would be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles and spirit of the application, the scope of which is defined in the claims and their equivalents.

Claims (4)

1. A scan control system for a capsule endoscope, the scan control system comprising:
the pose estimation module is used for calculating a first pose transformation matrix of the capsule endoscope according to the current-moment control signal and the previous-moment control signal of two adjacent moments, and is configured to: calculate, according to the current-moment control signal, a current-moment transformation matrix of a control coordinate system of the capsule endoscope relative to a world coordinate system, and calculate, according to the previous-moment control signal, a previous-moment transformation matrix of the control coordinate system relative to the world coordinate system; and calculate the first pose transformation matrix of the capsule endoscope according to the current-moment transformation matrix and the previous-moment transformation matrix;
the pose measurement module is used for calculating a second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at the two adjacent moments, and is configured to: perform feature point matching on the current-moment image and the previous-moment image to obtain a plurality of pairs of matching feature points; construct an epipolar constraint equation set according to pixel coordinates corresponding to the pairs of matching feature points; solve the epipolar constraint equation set by a least-squares method; construct an essential matrix according to the solution of the epipolar constraint equation set; obtain a plurality of estimated values of a rotation matrix and a translation vector from the essential matrix; and verify each estimated value with the matched feature points to obtain final values of the rotation matrix and the translation vector, wherein the rotation matrix and the translation vector form the second pose transformation matrix;
the pose fusion module is configured to calculate a current moment pose parameter of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current moment image and the previous moment image, and includes: weighting the first pose transformation matrix and the second pose transformation matrix, and constructing to obtain a pose transformation fusion matrix; constructing an objective function of the pose parameter at the current moment according to the pose transformation fusion matrix and the pairs of matching feature points; optimizing an objective function by adopting an error minimization algorithm to obtain the current-moment pose parameter of the capsule endoscope;
the point cloud grid construction module is used for updating and obtaining a point cloud grid at the current moment according to the current moment image and the current moment pose parameter, and comprises the following steps: generating a current moment sparse point cloud according to a plurality of feature matching points of a current moment image, and calculating coordinates of the current moment sparse point cloud in a world coordinate system according to the current moment pose parameters; updating a pre-constructed point cloud grid at the previous moment according to the space point cloud at the current moment to obtain the point cloud grid at the current moment;
the path planning module is used for planning the motion direction and motion distance of the capsule endoscope for the next moment according to the point cloud grid at the current moment, and is configured to: search for a boundary in the point cloud grid at the current moment; calculate a perpendicular vector of the boundary, wherein the perpendicular vector points out of the point cloud grid at the current moment; and take the direction of the perpendicular vector as the motion direction for the next moment, and determine the motion distance for the next moment according to the distance between the two endpoints of the boundary;
the control signal generation module is used for updating and obtaining a next-moment control signal based on the movement direction and the movement distance of the next moment, and the next-moment control signal is used for enabling an external control device for pulling the capsule endoscope to generate pulling action.
2. The scan control system of claim 1, wherein the point cloud grid construction module is further configured to: after the external control device controls the capsule endoscope to complete scanning, perform pixel filling processing on the final-moment point cloud grid obtained by construction:
selecting a point cloud triangular grid in the point cloud grids at the final moment;
filling pixels of each point in the point cloud triangular mesh according to pixel values of the matched characteristic points corresponding to the three vertexes of the point cloud triangular mesh;
and traversing each point cloud triangular grid of the point cloud grids at the final moment until all the point cloud grids are subjected to pixel filling.
3. A computer-readable storage medium storing a scan control program for a capsule endoscope, which when executed by a processor, implements a scan control method for a capsule endoscope, the scan control method comprising:
calculating a first pose transformation matrix of the capsule endoscope according to the current-moment control signal and the previous-moment control signal of two adjacent moments, which comprises: calculating, according to the current-moment control signal, a current-moment transformation matrix of a control coordinate system of the capsule endoscope relative to a world coordinate system, and calculating, according to the previous-moment control signal, a previous-moment transformation matrix of the control coordinate system relative to the world coordinate system; and calculating the first pose transformation matrix of the capsule endoscope according to the current-moment transformation matrix and the previous-moment transformation matrix;
calculating a second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at the two adjacent moments, which comprises: performing feature point matching on the current-moment image and the previous-moment image to obtain a plurality of pairs of matching feature points; constructing an epipolar constraint equation set according to pixel coordinates corresponding to the pairs of matching feature points; solving the epipolar constraint equation set by a least-squares method; constructing an essential matrix according to the solution of the epipolar constraint equation set; obtaining a plurality of estimated values of a rotation matrix and a translation vector from the essential matrix; and verifying each estimated value with the matched feature points to obtain final values of the rotation matrix and the translation vector, wherein the rotation matrix and the translation vector form the second pose transformation matrix;
calculating the current time pose parameters of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current time image and the last time image, wherein the current time pose parameters comprise: weighting the first pose transformation matrix and the second pose transformation matrix, and constructing to obtain a pose transformation fusion matrix; constructing an objective function of the pose parameter at the current moment according to the pose transformation fusion matrix and the pairs of matching feature points; optimizing an objective function by adopting an error minimization algorithm to obtain the current-moment pose parameter of the capsule endoscope;
updating according to the current time image and the current time pose parameter to obtain a current time point cloud grid, wherein the method comprises the following steps: generating a current moment sparse point cloud according to a plurality of feature matching points of a current moment image, and calculating coordinates of the current moment sparse point cloud in a world coordinate system according to the current moment pose parameters; updating a pre-constructed point cloud grid at the previous moment according to the space point cloud at the current moment to obtain the point cloud grid at the current moment;
planning a motion direction and a motion distance of the capsule endoscope for the next moment according to the current-moment point cloud grid, which comprises: searching for a boundary in the point cloud grid at the current moment; calculating a perpendicular vector of the boundary, wherein the perpendicular vector points out of the point cloud grid at the current moment; and taking the direction of the perpendicular vector as the motion direction for the next moment, and determining the motion distance for the next moment according to the distance between the two endpoints of the boundary;
and updating and obtaining a next-moment control signal based on the movement direction and the movement distance at the next moment, wherein the next-moment control signal is used for enabling an external control device for pulling the capsule endoscope to generate a pulling action.
4. A computer apparatus comprising a computer-readable storage medium, a processor, and a scan control program for a capsule endoscope stored in the computer-readable storage medium, the scan control program for a capsule endoscope implementing a scan control method for a capsule endoscope when executed by the processor, the scan control method comprising:
calculating a first pose transformation matrix of the capsule endoscope according to the current-moment control signal and the previous-moment control signal of two adjacent moments, which comprises: calculating, according to the current-moment control signal, a current-moment transformation matrix of a control coordinate system of the capsule endoscope relative to a world coordinate system, and calculating, according to the previous-moment control signal, a previous-moment transformation matrix of the control coordinate system relative to the world coordinate system; and calculating the first pose transformation matrix of the capsule endoscope according to the current-moment transformation matrix and the previous-moment transformation matrix;
calculating a second pose transformation matrix of the capsule endoscope according to the current-moment image and the previous-moment image captured by the capsule endoscope at the two adjacent moments, which comprises: performing feature point matching on the current-moment image and the previous-moment image to obtain a plurality of pairs of matching feature points; constructing an epipolar constraint equation set according to pixel coordinates corresponding to the pairs of matching feature points; solving the epipolar constraint equation set by a least-squares method; constructing an essential matrix according to the solution of the epipolar constraint equation set; obtaining a plurality of estimated values of a rotation matrix and a translation vector from the essential matrix; and verifying each estimated value with the matched feature points to obtain final values of the rotation matrix and the translation vector, wherein the rotation matrix and the translation vector form the second pose transformation matrix;
calculating the current time pose parameters of the capsule endoscope according to the first pose transformation matrix, the second pose transformation matrix, the current time image and the last time image, wherein the current time pose parameters comprise: weighting the first pose transformation matrix and the second pose transformation matrix, and constructing to obtain a pose transformation fusion matrix; constructing an objective function of the pose parameter at the current moment according to the pose transformation fusion matrix and the pairs of matching feature points; optimizing an objective function by adopting an error minimization algorithm to obtain the current-moment pose parameter of the capsule endoscope;
updating according to the current time image and the current time pose parameter to obtain a current time point cloud grid, wherein the method comprises the following steps: generating a current moment sparse point cloud according to a plurality of feature matching points of a current moment image, and calculating coordinates of the current moment sparse point cloud in a world coordinate system according to the current moment pose parameters; updating a pre-constructed point cloud grid at the previous moment according to the space point cloud at the current moment to obtain the point cloud grid at the current moment;
planning a motion direction and a motion distance of the capsule endoscope for the next moment according to the current-moment point cloud grid, which comprises: searching for a boundary in the point cloud grid at the current moment; calculating a perpendicular vector of the boundary, wherein the perpendicular vector points out of the point cloud grid at the current moment; and taking the direction of the perpendicular vector as the motion direction for the next moment, and determining the motion distance for the next moment according to the distance between the two endpoints of the boundary;
and updating to obtain a next-moment control signal based on the next-moment motion direction and motion distance, the next-moment control signal being used to cause the external control device that pulls the capsule endoscope to generate a pulling action.
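Since the format of the control signal is not specified above, the following sketch simply wraps the planned direction and distance into a hypothetical ControlSignal structure with an assumed maximum step length; the names and the clamping rule are illustrative only.

from dataclasses import dataclass
import numpy as np

@dataclass
class ControlSignal:
    """Hypothetical next-moment command for the external device pulling the capsule."""
    direction: np.ndarray   # unit vector in the world frame
    distance: float         # planned pull distance (same units as the point cloud)

def next_control_signal(direction, distance, max_step=5.0):
    """Clamp the planned motion to an assumed maximum step and wrap it as the
    next-moment control signal."""
    d = np.asarray(direction, dtype=np.float64)
    d = d / np.linalg.norm(d)
    return ControlSignal(direction=d, distance=min(float(distance), max_step))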
CN202310100128.4A 2023-02-03 2023-02-03 Scanning control method and scanning control system for capsule endoscope Active CN116076995B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310100128.4A CN116076995B (en) 2023-02-03 2023-02-03 Scanning control method and scanning control system for capsule endoscope

Publications (2)

Publication Number Publication Date
CN116076995A (en) 2023-05-09
CN116076995B (en) 2023-09-01

Family

ID=86199003

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310100128.4A Active CN116076995B (en) 2023-02-03 2023-02-03 Scanning control method and scanning control system for capsule endoscope

Country Status (1)

Country Link
CN (1) CN116076995B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2962647B1 (en) * 2010-07-19 2013-05-24 Duo Ge DEVICE AND INSTALLATION FOR ASSEMBLING AT LEAST TWO MEDICINAL CAPSULES BY COLLAGE
US11116419B2 (en) * 2016-06-01 2021-09-14 Becton, Dickinson And Company Invasive medical devices including magnetic region and systems and methods

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080061211A (en) * 2006-12-27 2008-07-02 충북대학교 산학협력단 Method for orientation measurement of an capsule endoscope and the system performing the same methode
CN101297756A (en) * 2008-06-19 2008-11-05 大连理工大学 Combined method of magnetic field and vision for locating gesture of medical micro type robot in vivo
CN107567315A (en) * 2015-03-06 2018-01-09 英国质谱公司 The open type MALDI-MS of chemistry guiding
CN108451490A (en) * 2018-01-29 2018-08-28 重庆金山医疗器械有限公司 A kind of system and method for searching capsule endoscope in gastrovascular cavity body
CN109766784A (en) * 2018-12-21 2019-05-17 北京理工大学 Capsule robot interaction control method based on monocular image
CN112089392A (en) * 2020-10-14 2020-12-18 深圳市资福医疗技术有限公司 Capsule endoscope control method, device, equipment, system and storage medium
CN114299250A (en) * 2021-12-28 2022-04-08 中国矿业大学 Three-dimensional reconstruction method for working environment of stomach part of magnetorheological medical capsule robot
CN114066781A (en) * 2022-01-18 2022-02-18 浙江鸿禾医疗科技有限责任公司 Capsule endoscope intestinal tract image identification and positioning method, storage medium and equipment
CN114617548A (en) * 2022-05-13 2022-06-14 广州思德医疗科技有限公司 Pose adjustment reminding method, device and equipment for examinee and readable storage medium
CN114916898A (en) * 2022-07-20 2022-08-19 广州华友明康光电科技有限公司 Automatic control inspection method, system, equipment and medium for magnetic control capsule

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gastric transit and small intestinal transit time and motility assessed by a magnet tracking system; WORSOE, J. et al.; BMC Gastroenterology; Vol. 145 (No. 11); full text *

Also Published As

Publication number Publication date
CN116076995A (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN108335353B (en) Three-dimensional reconstruction method, device and system of dynamic scene, server and medium
JP5153620B2 (en) System for superimposing images related to a continuously guided endoscope
Song et al. Dynamic reconstruction of deformable soft-tissue with stereo scope in minimal invasive surgery
CN112075914B (en) Capsule endoscopy system
CN109118545A (en) 3-D imaging system scaling method and system based on rotary shaft and binocular camera
JP4631057B2 (en) Endoscope system
CN106504321A (en) Method using the method for photo or video reconstruction three-dimensional tooth mould and using RGBD image reconstructions three-dimensional tooth mould
WO2023138544A1 (en) Capsule endoscope intestinal image-based recognition and positioning method, storage medium, and device
CN114782470B (en) Three-dimensional panoramic recognition positioning method of alimentary canal, storage medium and equipment
CN110288653B (en) Multi-angle ultrasonic image fusion method and system and electronic equipment
WO2023138619A1 (en) Endoscope image processing method and apparatus, readable medium, and electronic device
Dimas et al. Endoscopic single-image size measurements
WO2017180097A1 (en) Deformable registration of intra and preoperative inputs using generative mixture models and biomechanical deformation
Liu et al. Capsule endoscope localization based on computer vision technique
CN114519742A (en) Three-dimensional target automatic positioning and attitude determination method based on monocular optical photography and application thereof
CN116076995B (en) Scanning control method and scanning control system for capsule endoscope
CN111477318B (en) Virtual ultrasonic probe tracking method for remote control
JP4512833B2 (en) Intra-object site measurement system, intra-object site measurement computing device, in-object site measurement program, and computer-readable recording medium recording the program
CN112633113A (en) Cross-camera human face living body detection method and system
CN116687328A (en) Catheter movement control device, catheter movement control method, and storage medium
CN116616812A (en) NeRF positioning-based ultrasonic autonomous navigation method
CN115953377A (en) Digestive tract ultrasonic endoscope image fusion method and system
CN112581460B (en) Scanning planning method, device, computer equipment and storage medium
CN113674333B (en) Precision verification method and medium for calibration parameters and electronic equipment
Bao et al. An Improved QPSO Algorithm Based on EXIF for Camera Self-calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant