CN117796745B - Method for estimating advancing and retreating distance of digestive endoscope lens - Google Patents

Publication number: CN117796745B (granted; earlier published as application CN117796745A)
Application number: CN202410226117.5A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: frame, displacement, lens, digestive endoscope, optical flow
Inventors: 宋万忠 (Song Wanzhong), 贾涵 (Jia Han), 胡兵 (Hu Bing), 陈欧 (Chen Ou)
Assignee (current and original): Sichuan University
Priority/filing date: 2024-02-29
Legal status: Active (granted)

Classifications

  • Endoscopes (AREA)

Abstract

The invention provides a method for estimating the advancing and retreating distance of a digestive endoscope lens, comprising the following steps: (1) acquiring the camera position output, with 10 mm as the minimum measurement unit, by a digestive endoscope with a positioning function; (2) acquiring each frame of the digestive endoscope image in real time with a screen-recording device; (3) selecting any frame as the current frame and extracting the feature points to be tracked; (4) calculating the optical flow of adjacent frames with the L-K optical flow method; (5) calculating the ratio relation of the displacements of all frames between the starting frame and the ending frame of a segment of endoscopic video; (6) allocating and solving the displacements according to the calculated ratio relation; (7) performing the above operations on all frames to obtain the advance and retreat amount of each frame. Based on the camera position output by an existing digestive endoscope with a positioning function, combined with processing by the L-K optical flow and related algorithms, the invention achieves accurate positioning of the displacement of the digestive endoscope lens, providing accurate data support for precision medicine.

Description

Method for estimating advancing and retreating distance of digestive endoscope lens
Technical Field
The invention relates to the field of digestive endoscope positioning, and in particular to a method for estimating the advancing and retreating distance of a digestive endoscope lens.
Background
Precision medicine is a current medical trend. When a digestive endoscope is used to examine or operate on the digestive tract, other medical images of the digestive tract (including CT, MRI and ultrasound) need to be displayed synchronously according to the actual position of the digestive endoscope lens in the tract, which requires the distance from the lens to some fixed landmark (such as the top of the esophagus, the cardia, the pylorus or the anus). A digestive endoscope with a positioning function can currently display the advancing distance of its lens in the digestive tract, but the minimum displayable distance is usually 10 mm; that is, a distance reading is shown only every 10 mm. Position information output only every 10 mm cannot meet the requirements of personalized, accurate endoscopic diagnosis and treatment, so a method or device that can accurately position the digestive endoscope is needed.
Disclosure of Invention
The aim of the invention: addressing the problems in the prior art, a method for estimating the advancing and retreating distance of a digestive endoscope lens is provided, solving the problem that the position of an existing digestive endoscope cannot be determined accurately.
The aim of the invention is achieved through the following technical scheme:
a method of estimating the forward and backward distance of a digestive endoscope lens, the method comprising the steps of:
(1) Acquiring a camera position which takes 10mm as a minimum measurement unit and is output by a digestive endoscope with a positioning function;
(2) Acquiring each frame of image of the digestive endoscope in real time by adopting screen video equipment;
(3) Optionally selecting a frame of image as a current frame of image, and extracting feature points to be tracked;
(4) Calculating optical flow of the adjacent frame images using the L-K optical flow;
(5) Calculating the ratio relation of the displacement of all frames between a starting frame and an ending frame of a section of endoscopic video;
(6) Distributing and solving displacement according to the ratio relation obtained by calculation;
(7) The advance and retreat amount of each frame can be obtained by performing the above operation on all frames.
As a further technical scheme, the step (3) specifically comprises: applying a feature point detection algorithm to the current frame image to extract the set of feature points to be tracked.
As a further technical scheme, the step (5) specifically includes the steps of:
(51) the optical flow is the difference of the pixel coordinates of the projection points of the same spatial point under the two lens poses:
OF_x = f_x(X cos θ − Y sin θ)/(Z + t_z) − f_x X/Z
OF_y = f_y(X sin θ + Y cos θ)/(Z + t_z) − f_y Y/Z
where θ is the rotation angle of the lens around the z axis, t_z is the displacement of the lens along the z axis, f_x and f_y are the focal lengths of the camera lens in the x and y directions, c_x and c_y are the x and y components of the pixel coordinates of the principal point, and (X, Y, Z) are the coordinates of the three-dimensional point in the lens coordinate system of the previous frame; the shorthand x̂ = f_x X/Z and ŷ = f_y Y/Z is introduced only for convenience of description and has no further meaning;
(52) simplifying the formula (the rotation θ between adjacent frames being negligible) gives the relation between the displacement of the endoscope lens on the z axis and the optical flow:
t_z/(Z + t_z) = −OF_x/x̂ = −OF_y/ŷ
where f_x, f_y, c_x and c_y are constant, and OF_x and OF_y are the components of the optical flow in the x and y directions, respectively;
(53) from the formula of step (52), the ratio λ_i^(m) = t_1m/Z_i of the i-th feature point is obtained, where Z_i is the z coordinate of the i-th feature point in the camera coordinate system of the first frame, t_1m is the z-direction displacement of the m-th frame relative to the first frame, and m runs over the total number of frames within one minimum measurement unit of 10 mm; comparing each frame with the first interval gives λ_i^(m)/λ_i^(2) = t_1m/t_12; averaging over all tracked feature points then yields the set of per-frame weights W = {w_2, …, w_M}, where w_m is the ratio of the displacement from frame 1 to frame m to the displacement from frame 1 to frame 2.
As a further technical scheme, the step (6) specifically includes: calculating the displacement of all intermediate frames relative to the first frame according to the formula t_1m = D·w_m/w_M, where t_1m is the z-direction displacement of the m-th frame relative to the 1st frame and D is the camera displacement of one minimum measurement unit of 10 mm.
Compared with the prior art, the invention achieves accurate positioning of the displacement of the digestive endoscope lens, based on the camera position output by an existing digestive endoscope with a positioning function combined with processing by the L-K optical flow and related algorithms, and provides very accurate data support for precision medicine.
Drawings
FIG. 1 is a definition of a coordinate system;
FIG. 2 is a definition of a lens coordinate system;
FIG. 3 shows the lens positions (distance from the starting point) corresponding to the frames of an endoscopic video during insertion of the endoscope, as estimated by the method;
FIG. 4 shows the lens positions (distance from the starting point) corresponding to the frames of an endoscopic video during withdrawal of the endoscope, as estimated by the method;
FIG. 5 shows an endoscopic image and a CT image displayed synchronously according to the estimated digestive endoscope position.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples.
This embodiment provides a method for estimating the advancing and retreating distance of a digestive endoscope lens, which uses an optical flow method to estimate the advance and retreat displacement of the lens within the digestive tract. The specific steps are as follows:
1. Acquire the camera position output, with 10 mm as the minimum measurement unit, by a digestive endoscope with a positioning function.
The position can either be output directly by the digestive endoscope host, or read in real time from the host's display screen by applying optical character recognition to the displayed travel distance of the lens.
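Where the reading must be scraped from the host's display, the OCR path might be sketched as below — a minimal sketch assuming OpenCV and pytesseract are available and that the distance digits sit in a fixed screen region; the ROI coordinates and function names are illustrative, not part of the patent.

```python
import re


def parse_distance_mm(text):
    """Extract the first integer (the mm reading) from OCR output text."""
    match = re.search(r"\d+", text)
    return int(match.group()) if match else None


def read_distance_mm(frame, roi=(20, 20, 160, 60)):
    """OCR the travel distance shown on the endoscope host screen.

    `frame` is a BGR screen capture; `roi` = (x, y, w, h) is the assumed
    location of the on-screen distance readout.
    """
    import cv2          # assumed dependencies, imported lazily so the
    import pytesseract  # pure-text helper above works without them

    x, y, w, h = roi
    crop = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    # Binarize: on-screen digits are high-contrast, which helps OCR.
    _, crop = cv2.threshold(crop, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return parse_distance_mm(pytesseract.image_to_string(crop, config="--psm 7"))
```

In practice the ROI would be calibrated once per endoscope host model, since different hosts render the readout in different screen positions.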
2. Acquire each frame of the digestive endoscope image in real time with a screen-recording device.
3. Select any frame as the current frame and extract the feature points to be tracked on it.
A feature point detection algorithm is applied to the current frame image to extract the set of feature points to be tracked, defined as P = {p_1, …, p_n}. A traditional detector such as SIFT, ORB or SURF can be chosen, as can a deep-learning-based detector such as SuperPoint.
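As an illustration of the detection step, a Harris-style corner detector — a classical criterion in the same family as the detectors named above — can be sketched in plain NumPy. The function name, window size and thresholds are assumptions for the sketch; in practice an off-the-shelf SIFT/ORB/SURF implementation would be used.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view


def harris_corners(img, k=0.04, win=3, max_pts=100, rel_thresh=0.01):
    """Return up to `max_pts` corner locations as (x, y) pairs,
    ranked by Harris corner response."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)  # gradients along rows (y) and columns (x)

    def box_sum(a):  # sum over a win x win neighborhood around each pixel
        padded = np.pad(a, win // 2)
        return sliding_window_view(padded, (win, win)).sum(axis=(2, 3))

    Sxx, Syy, Sxy = box_sum(Ix * Ix), box_sum(Iy * Iy), box_sum(Ix * Iy)
    # Harris response: det(M) - k * trace(M)^2 of the structure tensor M
    R = Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2
    ys, xs = np.where(R > rel_thresh * R.max())
    order = np.argsort(-R[ys, xs])[:max_pts]
    return np.stack([xs[order], ys[order]], axis=1)
```

On a synthetic image containing a bright square, the strongest responses land within a pixel or two of the square's four corners, while edges and flat regions are suppressed.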
4. Compute the optical flow of adjacent endoscopic frames with the L-K (Lucas-Kanade) method; the specific calculation process is as follows.
Based on the assumption that the brightness of the pixels of a target in the scene remains constant between frames, we have the formula:
I(x + dx, y + dy, t + dt) = I(x, y, t)    (1)
where I is the current frame image, (x, y) are the coordinates of a pixel in the current frame, t is the current time, and dx, dy, dt are the increments of the coordinates x, y and of the time t.
A Taylor expansion of the left side of formula (1), keeping the first-order terms, gives:
I(x + dx, y + dy, t + dt) ≈ I(x, y, t) + (∂I/∂x)dx + (∂I/∂y)dy + (∂I/∂t)dt    (2)
For a point on the inner wall of a digestive-tract organ whose brightness remains unchanged in two adjacent endoscopic frames, it follows that:
(∂I/∂x)dx + (∂I/∂y)dy + (∂I/∂t)dt = 0    (3)
Dividing by dt:
I_x u + I_y v = −I_t    (4)
where u = dx/dt is the velocity of the point along the x axis and v = dy/dt its velocity along the y axis, written together as the flow vector (u, v); I_x = ∂I/∂x is the gradient of the image in the x direction at this point, I_y = ∂I/∂y the gradient in the y direction, and I_t the derivative of I with respect to time t. Written in matrix form:
[I_x  I_y][u  v]ᵀ = −I_t    (5)
Equation (5) has two unknowns, so at least 2 equations are needed to solve it. Based on the assumption that adjacent points of the same surface have similar motion, we consider a w × w window around the pixel and take every pixel within it to share the pixel's motion state. It is thus possible to construct w² equations, namely:
A[u  v]ᵀ = −b, with A = [I_x^(k)  I_y^(k)], b = [I_t^(k)], k = 1, …, w²    (6)
Based on the least-squares solution, we get:
[u  v]ᵀ = −(AᵀA)⁻¹Aᵀb    (7)
where I_x^(k), I_y^(k) and I_t^(k) in formulas (6) and (7) denote the derivatives at the k-th point within the window.
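The least-squares solve above can be sketched as a minimal single-level L-K solver in plain NumPy (no image pyramid and no iterative refinement, both of which practical implementations add; the window size and the synthetic check below are assumptions):

```python
import numpy as np


def lk_flow(I0, I1, pts, win=7):
    """Single-level Lucas-Kanade: solve A [u v]^T = -b in a win x win
    window around each (x, y) point of I0, tracking into I1."""
    I0, I1 = I0.astype(float), I1.astype(float)
    Iy, Ix = np.gradient(I0)  # spatial gradients of the first frame
    It = I1 - I0              # temporal derivative between the frames
    r = win // 2
    flows = []
    for x, y in pts:
        sl = (slice(y - r, y + r + 1), slice(x - r, x + r + 1))
        A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
        b = -It[sl].ravel()
        flow, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares solve
        flows.append(flow)
    return np.array(flows)  # one (u, v) per input point
```

On a synthetic Gaussian blob translated by one pixel, the flow recovered at the blob center is close to (1, 0).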
5. Calculate the ratio relation of the displacements of all frames between the starting frame and the ending frame of a segment of endoscopic video.
First define the coordinate axes: the axis directions are shown in FIG. 1, and the lens coordinate system in FIG. 2.
Assume F = {1, 2, …, M} is the set of all frames of one segment of endoscopic video, the feature point set tracked from the starting frame is P = {p_1, …, p_n}, and the optical flows of the feature points in the m-th frame are OF_1^(m), …, OF_n^(m). From the definition of optical flow, the relation between the displacement of the endoscope lens and the optical flow is derived as follows.
Assume that the camera only rotates about the z axis and only translates along the z axis (i.e., advances and retreats). The optical flow is then defined as the difference of the pixel coordinates of the projection points of the same spatial point under the two camera poses, i.e.,
OF = π(T_2 P) − π(T_1 P)    (10)
where π(·) is the projection function from a spatial point onto the pixel coordinate system and T_1, T_2 are the camera poses of the two frames.
According to our hypothesis (the camera only rotates about the z axis and translates along it), T is defined as shown in formula (11):
T = [cos θ  −sin θ  0  0;  sin θ  cos θ  0  0;  0  0  1  t_z;  0  0  0  1]    (11)
where θ is the rotation angle around the z axis and the projection π uses K, the geometric intrinsic matrix of the camera. Substituting formula (11) into formula (10) and expanding gives OF as shown in formula (12):
OF_x = f_x(X cos θ − Y sin θ)/(Z + t_z) − f_x X/Z
OF_y = f_y(X sin θ + Y cos θ)/(Z + t_z) − f_y Y/Z    (12)
where t_z is the displacement along the z axis, f_x and f_y are the focal lengths of the camera lens in the x and y directions, and c_x and c_y are the x and y components of the pixel coordinates of the principal point, all constants describing the camera that can be obtained by camera calibration; (X, Y, Z) are the coordinates of the three-dimensional point in the camera coordinate system of the previous frame. For convenience of description we write x̂ = f_x X/Z and ŷ = f_y Y/Z; this shorthand has no further meaning.
Simplifying formula (12) — the rotation θ between adjacent frames being negligible — yields the relation between the displacement of the endoscope lens on the z axis and the optical flow:
t_z/(Z + t_z) = −OF_x/x̂ = −OF_y/ŷ    (13)
Since f_x, f_y, c_x and c_y are constant, the camera z-axis displacement t_z is linked to the optical flow of the feature points: the ratios of the optical flows of a feature point (right side of the expression) can be used to approximate the ratios of the displacement amounts (left side), OF_x and OF_y being the components of the optical flow in the x and y directions; for a displacement small compared with the depth, t_z/(Z + t_z) ≈ t_z/Z.
Using the above, the ratio λ_i^(m) = t_1m/Z_i of the i-th feature point in the m-th frame is obtained, where Z_i is the z coordinate of the i-th feature point in the camera coordinate system of the first frame, t_1m is the z-direction displacement of the m-th frame relative to the first frame, and m runs over the total number of frames within one minimum measurement unit of 10 mm. Comparing each frame with the first interval gives λ_i^(m)/λ_i^(2) = t_1m/t_12; averaging over all tracked feature points then yields the set of per-frame weights W = {w_2, …, w_M}, where w_m is the ratio of the displacement from frame 1 to frame m to the displacement from frame 1 to frame 2.
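As a numeric sanity check of the displacement-flow relation above (a sketch under the stated assumptions — pure z translation, negligible rotation; the pinhole parameters below are invented for the check):

```python
# Project a 3D point before and after a pure z-translation t_z and verify
# that -OF_x / (u - c_x) recovers t_z / (Z + t_z), as in formula (13).
fx = fy = 100.0           # assumed focal lengths (pixels)
cx = cy = 0.0             # assumed principal point
X, Y, Z = 5.0, 3.0, 50.0  # 3D point in the previous frame's lens coordinates
tz = 1.0                  # advance along the optical axis (mm)

u0, v0 = fx * X / Z + cx, fy * Y / Z + cy                # projection before
u1, v1 = fx * X / (Z + tz) + cx, fy * Y / (Z + tz) + cy  # projection after
OFx, OFy = u1 - u0, v1 - v0                              # optical flow

est = -OFx / (u0 - cx)  # right-hand side of formula (13)
true = tz / (Z + tz)    # left-hand side
assert abs(est - true) < 1e-9
```

The agreement is exact here because with zero rotation the flow of a point is proportional to its (centered) pixel coordinate, which is what lets the per-feature ratios cancel the unknown depths Z_i.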
6. Allocate and solve the displacements according to the calculated ratio relation.
From step 5 we have W = {w_2, …, w_M} and D, where D is the camera displacement of one minimum measurement unit of 10 mm and w_m = t_1m/t_12, t_1m being the z-direction displacement of the m-th frame relative to the first frame. Since the M-th frame spans the whole measurement unit, t_1M = D, and the displacement of every intermediate frame relative to the first frame follows as shown in formula (14):
t_1m = D · w_m / w_M    (14)
where t_1m is the z-direction displacement of the m-th frame relative to the 1st frame.
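The weighting of step 5 and the allocation of formula (14) might be sketched together as follows (the array layout and the synthetic numbers are assumptions):

```python
import numpy as np


def allocate_displacements(lam, D=10.0):
    """lam[i, j]: ratio t_1m / Z_i for feature i and frame m = j + 2,
    estimated from optical flow via formula (13).  Returns the z-direction
    displacement of frames 2..M relative to frame 1, scaled so that the
    last frame spans D (the 10 mm measurement unit)."""
    w = (lam / lam[:, :1]).mean(axis=0)  # w_m = mean_i lam_i^(m) / lam_i^(2)
    return D * w / w[-1]                 # formula (14): t_1m = D * w_m / w_M
```

Because each feature's ratios are normalized by its own first interval before averaging, the unknown per-feature depths Z_i cancel out of the weights.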
7. Performing the above operations on all frames yields the advance and retreat amount of each frame.
Figs. 3 and 4 show positions of the digestive endoscope estimated by the method of the invention, Fig. 3 corresponding to insertion and Fig. 4 to withdrawal. The graphs show that the method determines the displacement of the digestive endoscope lens accurately and continuously, providing very accurate data support for precision medicine.
Fig. 5 shows an application of the method of the invention, in which the estimated digestive endoscope position is displayed synchronously with a CT image; the numeral 32.67439 indicates the position (distance from the starting point) of the lens at a certain frame, calculated by the method of the invention.
The foregoing description of the preferred embodiment of the invention is not intended to be limiting, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (3)

1. A method for estimating the advancing and retreating distance of a digestive endoscope lens, the method comprising the steps of:
(1) acquiring the camera position output, with 10 mm as the minimum measurement unit, by a digestive endoscope with a positioning function;
(2) acquiring each frame of the digestive endoscope image in real time with a screen-recording device;
(3) selecting any frame as the current frame and extracting the feature points to be tracked;
(4) calculating the optical flow of adjacent frames with the L-K optical flow method;
(5) calculating the ratio relation of the displacements of all frames between the starting frame and the ending frame of a segment of endoscopic video;
(6) allocating and solving the displacements according to the calculated ratio relation;
(7) performing the above operations on all frames to obtain the advance and retreat amount of each frame;
the step (5) specifically comprises the steps of:
(51) the optical flow is the difference of the pixel coordinates of the projection points of the same spatial point under the two lens poses:
OF_x = f_x(X cos θ − Y sin θ)/(Z + t_z) − f_x X/Z
OF_y = f_y(X sin θ + Y cos θ)/(Z + t_z) − f_y Y/Z
where θ is the rotation angle of the lens around the z axis, t_z is the displacement of the lens along the z axis, f_x and f_y are the focal lengths of the camera lens in the x and y directions, c_x and c_y are the x and y components of the pixel coordinates of the principal point, and (X, Y, Z) are the coordinates of the three-dimensional point in the lens coordinate system of the previous frame; the shorthand x̂ = f_x X/Z and ŷ = f_y Y/Z is introduced only for convenience of description and has no further meaning;
(52) simplifying the formula (the rotation θ between adjacent frames being negligible) gives the relation between the displacement of the endoscope lens on the z axis and the optical flow:
t_z/(Z + t_z) = −OF_x/x̂ = −OF_y/ŷ
where f_x, f_y, c_x and c_y are constant, and OF_x and OF_y are the components of the optical flow in the x and y directions, respectively;
(53) from the formula of step (52), the ratio λ_i^(m) = t_1m/Z_i of the i-th feature point is obtained, where Z_i is the z coordinate of the i-th feature point in the camera coordinate system of the first frame, t_1m is the z-direction displacement of the m-th frame relative to the first frame, and m runs over the total number of frames within one minimum measurement unit of 10 mm; comparing each frame with the first interval gives λ_i^(m)/λ_i^(2) = t_1m/t_12; averaging over all tracked feature points then yields the set of per-frame weights W = {w_2, …, w_M}, where w_m is the ratio of the displacement from frame 1 to frame m to the displacement from frame 1 to frame 2.
2. The method for estimating the advancing and retreating distance of a digestive endoscope lens according to claim 1, wherein the step (3) specifically comprises: applying a feature point detection algorithm to the current frame image to extract the set of feature points to be tracked.
3. The method for estimating the advancing and retreating distance of a digestive endoscope lens according to claim 1, wherein the step (6) specifically comprises: calculating the displacement of all intermediate frames relative to the first frame according to the formula t_1m = D·w_m/w_M, where t_1m is the z-direction displacement of the m-th frame relative to the 1st frame and D is the camera displacement of one minimum measurement unit of 10 mm.
CN202410226117.5A 2024-02-29 Method for estimating advancing and retreating distance of digestive endoscope lens — granted as CN117796745B, legal status Active

Priority Applications (1)

Application Number: CN202410226117.5A; Priority Date / Filing Date: 2024-02-29; Title: Method for estimating advancing and retreating distance of digestive endoscope lens

Publications (2)

Publication Number  Publication Date
CN117796745A  2024-04-02
CN117796745B  2024-05-03

Family ID: 90433821 — Country: CN

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001161711A (en) * 1999-12-14 2001-06-19 Olympus Optical Co Ltd Endoscopic surgery system
WO2017054817A1 (en) * 2015-10-01 2017-04-06 Olaf Christiansen Endoscopic image processing system for surgery using means which generate geometric distance information in the detection region of an optical digital camera
CN106618454A (en) * 2016-11-21 2017-05-10 电子科技大学 Capsule endoscope system
JP2017185212A (en) * 2016-03-31 2017-10-12 国立大学法人浜松医科大学 Optical axis position measurement system, optical axis position measurement method, optical axis position measurement program, optical axis position measurement device
CN109523589A (en) * 2018-11-13 2019-03-26 浙江工业大学 A kind of design method of more robust visual odometry
JP2019219719A (en) * 2018-06-15 2019-12-26 株式会社デンソーテン Abnormality detection device and abnormality detection method
CN111291677A (en) * 2020-02-05 2020-06-16 吉林大学 Method for extracting and rendering dynamic video tactile features
CN111915573A (en) * 2020-07-14 2020-11-10 武汉楚精灵医疗科技有限公司 Digestive endoscopy focus tracking method based on time sequence feature learning
CN112766416A (en) * 2021-02-10 2021-05-07 中国科学院深圳先进技术研究院 Digestive endoscopy navigation method and system
CN113786239A (en) * 2021-08-26 2021-12-14 哈尔滨工业大学(深圳) Method and system for tracking and real-time early warning of surgical instruments under stomach and digestive tract
CN114359406A (en) * 2021-12-30 2022-04-15 像工场(深圳)科技有限公司 Calibration of auto-focusing binocular camera, 3D vision and depth point cloud calculation method
WO2022170562A1 (en) * 2021-02-10 2022-08-18 中国科学院深圳先进技术研究院 Digestive endoscope navigation method and system
CN115294128A (en) * 2022-10-08 2022-11-04 四川大学 Monocular structure three-dimensional imaging method and device for digestive endoscopy
CN115830122A (en) * 2022-12-21 2023-03-21 北京理工大学 Method and device for positioning continuous frame endoscope
CN115844317A (en) * 2022-12-29 2023-03-28 四川大学华西医院 Digestive endoscopy visualization method, system and equipment
WO2023126999A1 (en) * 2021-12-27 2023-07-06 日本電気株式会社 Image processing device, image processing method, and storage medium
CN116542952A (en) * 2023-05-17 2023-08-04 四川大学 Endoscopic coverage rate evaluation method and system based on three-dimensional reconstruction
CN116721128A (en) * 2023-05-24 2023-09-08 上海大学 Method for detecting endoscope advancing and retreating speed based on machine vision

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040155975A1 (en) * 2002-09-17 2004-08-12 Hart Douglas P. 3-D imaging system
KR20150140814A (en) * 2013-04-10 2015-12-16 오클랜드 유니서비시즈 리미티드 Head and eye tracking
WO2021071991A1 (en) * 2019-10-07 2021-04-15 S&N Orion Prime, S.A. Systems and methods for changing the direction of view during video guided clinical procedures using real-time image processing
JP7254742B2 (en) * 2020-03-26 2023-04-10 Hoya株式会社 Program, information processing method, information processing device, and diagnosis support system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Anasua Deb, Abhilash Perisetti et al.; "Gastrointestinal Endoscopy-Associated Infections: Update on an Emerging Issue"; Digestive Diseases and Sciences; 2022-03-09; (67) 2022: 1718-1732 *
宋万忠; "Multi-viewpoint range image registration and complex curved surface flattening in optical three-dimensional sensing"; China Doctoral Dissertations Full-text Database (Information Science and Technology); 2002-02; (2002/02): I140-12 *
吴海斌, 徐若彤 et al.; "A survey of computer-vision-based three-dimensional reconstruction of human body cavities"; Computer Engineering; 2021-10-15; 47(10): 1-15 *
李晗; "Research on the optical-flow path-finding algorithm in a colonoscope visual navigation system"; China Master's Theses Full-text Database (Medicine and Health Sciences); 2017-04; (2017/04): E064-21 *

Also Published As

Publication number Publication date
CN117796745A (en) 2024-04-02

Similar Documents

Publication Publication Date Title
JP5153620B2 (en) System for superimposing images related to a continuously guided endoscope
Mori et al. Tracking of a bronchoscope using epipolar geometry analysis and intensity-based image registration of real and virtual endoscopic images
Helferty et al. Computer-based system for the virtual-endoscopic guidance of bronchoscopy
US7922652B2 (en) Endoscope system
JP5715311B2 (en) Endoscope system
US20100149183A1 (en) Image mosaicing systems and methods
Deguchi et al. Selective image similarity measure for bronchoscope tracking based on image registration
JP2016519968A (en) Image reconstruction from in vivo multi-camera capsules
US20130002842A1 (en) Systems and Methods for Motion and Distance Measurement in Gastrointestinal Endoscopy
WO2009063423A1 (en) Interventional navigation using 3d contrast-enhanced ultrasound
Seshamani et al. Real-time endoscopic mosaicking
US20220398771A1 (en) Luminal structure calculation apparatus, creation method for luminal structure information, and non-transitory recording medium recording luminal structure information creation program
CN114983317A (en) Method and apparatus for travel distance measurement of capsule camera in gastrointestinal tract
Yao et al. Motion-based camera localization system in colonoscopy videos
JP2012165838A (en) Endoscope insertion support device
WO2015110934A1 (en) Continuous image integration for robotic surgery
Merritt et al. Real-time CT-video registration for continuous endoscopic guidance
US20220409030A1 (en) Processing device, endoscope system, and method for processing captured image
Kumar et al. Stereoscopic visualization of laparoscope image using depth information from 3D model
Turan et al. A fully dense and globally consistent 3d map reconstruction approach for gi tract to enhance therapeutic relevance of the endoscopic capsule robot
CN117796745B (en) Method for estimating advancing and retreating distance of digestive endoscope lens
CN116324897A (en) Method and system for reconstructing a three-dimensional surface of a tubular organ
CN116898586A (en) Autonomous intubation method applied to intubation robot
Lerotic et al. Dynamic view expansion for enhanced navigation in natural orifice transluminal endoscopic surgery
JP4540124B2 (en) Projection image generation apparatus, method, and program thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant