EP3937126A1 - Endoscope image processing device - Google Patents
- Publication number
- EP3937126A1 (application EP20184995.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- endoscope
- images
- landing zone
- image processing
- stream
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/05—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00087—Tools
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20101—Interactive definition of point of interest, landmark or seed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Definitions
- the present disclosure relates to an image processing device for estimating a landing zone of a medical device extendable from a tip of an endoscope, a display unit comprising an image processing device, an endoscope system comprising an endoscope and an image processing device, and a method of performing a medical procedure.
- Endoscopes are widely used in hospitals for visually examining body cavities and obtaining samples of tissue identified as potentially pathological.
- An endoscope typically comprises an image capturing unit arranged at the distal end of the endoscope either looking forward or to the side.
- An endoscope is further typically provided with a working channel allowing a medical device such as a gripping device, a suction device, or a catheter to be introduced.
- Endoscopic retrograde cholangiopancreatography (ERCP)
- the major duodenal papilla is catheterized using a catheter advanced from the tip of the duodenoscope.
- the duodenoscope is provided with a guiding element in the form of an elevator element for guiding the catheter in a particular direction.
- the present disclosure relates to an image processing device for estimating a landing zone of a medical device extendable from a tip of an endoscope, the endoscope comprising one or more sensors including an image capturing device, the image processing device comprising a processing unit operationally connectable to the one or more sensors of the endoscope, wherein the processing unit is configured to:
- the operator of the endoscope may directly be provided with visual aid for assisting with performing a successful procedure.
- the image processing device may be built into a display unit or it may be a standalone device operationally connectable to both the one or more sensors of the endoscope and a display unit.
- the image capturing device may be the only sensor of the endoscope, i.e. the one or more sensors may consist of the image capturing device.
- the landing zone may be the position the medical device will reach when extended a predetermined length from the endoscope.
- the processing unit of the image processing device may be any processing unit, such as a central processing unit (CPU), a graphics processing unit (GPU), a microcontroller unit (MCU), a field-programmable gate array (FPGA), or any combination thereof.
- the processing unit may comprise one or more physical processors and/or may be a combination of a plurality of individual processing units.
- the processing unit is further configured to obtain an identification of the type of medical instrument, and wherein the estimated landing zone is further dependent on the identification of the type of medical instrument.
- the processing unit may obtain the identification via user input and/or automatically, e.g. by processing sensor data from the one or more sensors.
- the visual indication may be a visual element overlaid on the stream of images.
- the visual element may comprise parts that are either fully or partly transparent. Thus, the visual content of the stream of images may be covered by the visual element only to a small degree.
- the visual element may be arranged at the estimated landing zone.
- the visual element may be arranged next to the estimated landing zone. Arranging the visual element next to the estimated landing zone may be beneficial as the landing zone typically is at a point of interest that optimally should be freely visible to the user.
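As an illustration of such an overlay, the sketch below alpha-blends a small, partly transparent marker a few pixels to the side of an estimated landing zone in a NumPy image, so the zone itself stays freely visible. The marker shape, colour, offset, and blending factor are illustrative choices, not taken from the disclosure:

```python
import numpy as np

def overlay_marker(frame, zone_xy, size=6, offset=10, alpha=0.5):
    """Alpha-blend a small, partly transparent square marker next to
    (rather than on top of) the estimated landing zone."""
    out = frame.astype(np.float32).copy()
    x, y = zone_xy
    # place the marker 'offset' pixels to the right of the landing zone
    x0, x1 = x + offset, x + offset + size
    y0, y1 = max(y - size, 0), y + size
    marker_colour = np.array([0.0, 255.0, 0.0])  # green (BGR), arbitrary
    out[y0:y1, x0:x1] = (1 - alpha) * out[y0:y1, x0:x1] + alpha * marker_colour
    return out.astype(np.uint8)

frame = np.zeros((64, 64, 3), dtype=np.uint8)  # stand-in endoscope frame
marked = overlay_marker(frame, (20, 32))
```

With `alpha=0.5` the underlying image remains visible through the marker, matching the idea that the visual element should cover the image content only to a small degree.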
- the processing unit is further configured to estimate a part of the trajectory by processing the first sensor data, and wherein the visual indication further indicates the estimated trajectory.
- ERCP is an example of such a medical procedure.
- the major duodenal papilla is catheterized using a catheter advanced from the tip of the duodenoscope, where the orientation between the catheter and the major duodenal papilla is crucial for the success of the procedure. Removal of polyps may also require a precise orientation between the polyp and a cutting tool.
- the processing unit is further configured to:
- the processing unit is further configured to: obtain the identification of the anatomic landmark by processing one or more images of the stream of images.
- the system may be at least partly automated.
- the processing unit is configured to obtain the identification of the anatomic landmark by processing the stream of images using a machine learning data architecture trained to identify the anatomic landmark in endoscope images.
- the machine learning data architecture may be a supervised machine learning architecture, trained by being provided with a training data set of images from a large number of patients, where a first subset of images of the training data set includes the anatomic landmark and a second subset of images does not include the anatomic landmark.
- the machine learning data architecture is an artificial neural network such as a deep structured learning architecture.
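As a hedged, toy-scale stand-in for the supervised training described above, the sketch below fits a logistic-regression classifier on flattened image patches, where the labels mark whether the anatomic landmark is present. A real system would use a deep neural network, and the synthetic data here is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_landmark_classifier(images, labels, lr=0.1, epochs=200):
    """Toy supervised stand-in for the machine learning data architecture:
    logistic regression on flattened, centred pixel intensities.
    labels[i] is 1 when the landmark is present in images[i], else 0."""
    X = images.reshape(len(images), -1) / 255.0 - 0.5
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probability
        grad = p - labels                        # log-loss gradient w.r.t. logit
        w -= lr * (X.T @ grad) / len(X)
        b -= lr * grad.mean()
    return w, b

def predict(w, b, images):
    X = images.reshape(len(images), -1) / 255.0 - 0.5
    return (X @ w + b > 0).astype(int)

# synthetic stand-in data: "landmark" images are brighter overall
pos = rng.integers(100, 200, (20, 8, 8)).astype(float)
neg = rng.integers(0, 100, (20, 8, 8)).astype(float)
images = np.concatenate([pos, neg])
labels = np.array([1] * 20 + [0] * 20)
w, b = train_landmark_classifier(images, labels)
acc = (predict(w, b, images) == labels).mean()
```

The two subsets of the training set (with and without the landmark) correspond directly to the positive and negative labels above.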
- the processing unit is further configured to: obtain the identification of the anatomic landmark through a user input.
- the processing unit may be operationally connected to a touch screen displaying the stream of images live to the user, where the user may manually identify the anatomic landmark e.g. by clicking on it on the touchscreen.
- the processing unit may then use motion tracking techniques in order to keep track of the anatomic landmark as the endoscope moves around.
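A minimal sketch of such motion tracking: take a patch around the user-selected landmark in the previous frame and find the best-matching (minimum sum-of-squared-differences) position in a small search window of the next frame. This is a simplification; practical systems might use optical flow or correlation filters:

```python
import numpy as np

def track_landmark(prev_frame, next_frame, point, patch=5, search=8):
    """Follow a user-selected landmark from one frame to the next via an
    exhaustive SSD template search around the previous position."""
    y, x = point
    tmpl = prev_frame[y - patch:y + patch + 1,
                      x - patch:x + patch + 1].astype(float)
    best_ssd, best_pt = None, point
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy - patch < 0 or xx - patch < 0:
                continue  # candidate window would leave the frame
            cand = next_frame[yy - patch:yy + patch + 1,
                              xx - patch:xx + patch + 1].astype(float)
            if cand.shape != tmpl.shape:
                continue
            ssd = float(np.sum((cand - tmpl) ** 2))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pt = ssd, (yy, xx)
    return best_pt

rng = np.random.default_rng(1)
frame0 = rng.integers(0, 50, (64, 64)).astype(np.uint8)
frame0[30:36, 30:36] = 255                                # bright "landmark"
frame1 = np.roll(np.roll(frame0, 3, axis=0), -2, axis=1)  # landmark moves
tracked = track_landmark(frame0, frame1, (32, 32))
```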
- the processing unit is further configured to: provide the stream of images with a visual indication, indicating that the estimated landing zone intersects with the determined location of the anatomic landmark.
- the user may be provided with further assistance when performing a medical procedure. This may make the medical procedure easier and safer.
- the first sensor data is one or more images of the stream of images.
- the endoscope does not need to be provided with further sensors.
- the one or more images of the stream of images are processed using a machine learning data architecture trained to estimate the landing zone of the medical device.
- the machine learning data architecture may be a supervised machine learning architecture.
- the machine learning data architecture is an artificial neural network such as a deep structured learning architecture.
- the endoscope further comprises a guiding element configured to guide the medical device in a particular direction
- the one or more sensors further comprises a guiding element sensor for detecting the position and / or orientation of the guiding element, and wherein the first sensor data is recorded by the guiding element sensor.
- the endoscope is a duodenoscope and wherein the guiding element is an elevator element configured to elevate a catheter.
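One plausible way to turn a guiding element sensor reading into a landing zone estimate is a per-device calibration table interpolated at the measured elevator position. This mapping and all table values are assumptions invented for illustration, not taken from the disclosure:

```python
import numpy as np

# Hypothetical per-device calibration: elevator angle (degrees) versus the
# landing zone pixel coordinates it produces. All values are invented.
CAL_ANGLES = np.array([0.0, 15.0, 30.0, 45.0])
CAL_X = np.array([320.0, 300.0, 270.0, 230.0])
CAL_Y = np.array([400.0, 330.0, 250.0, 160.0])

def landing_zone_from_elevator(angle_deg):
    """Estimate the landing zone from a guiding element (elevator) sensor
    reading by linear interpolation in the calibration table."""
    x = float(np.interp(angle_deg, CAL_ANGLES, CAL_X))
    y = float(np.interp(angle_deg, CAL_ANGLES, CAL_Y))
    return x, y

zone = landing_zone_from_elevator(22.5)  # halfway between two table rows
```

A lookup table sidesteps any explicit geometric model of the elevator; the trade-off is that each endoscope model (or unit) would need its own calibration.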
- the disclosure relates to a display unit for displaying images obtained by an image capturing device of an endoscope, wherein the display unit comprises an image processing device as disclosed in relation to the first aspect of the disclosure.
- the disclosure relates to an endoscope system comprising an endoscope and an image processing device as disclosed in relation to the first aspect of the invention, wherein the endoscope has an image capturing device and the processing unit of the image processing device is operationally connectable to the image capturing device of the endoscope.
- the endoscope may comprise a working channel for allowing a medical device to extend from its tip.
- the endoscope may further comprise a guiding element for guiding the medical device in a particular direction.
- the guiding element may be movable from a first position to a second position.
- the guiding element may be provided with or without a guiding element sensor.
- the guiding element may be controllable from the handle of the endoscope.
- the guiding element may be configured to guide the medical device in a particular direction relative to the tip of the endoscope. Thus, dependent on the position of the guiding element, the landing zone of the medical device will be arranged at different parts of the endoscope image.
- the endoscope may be a duodenoscope and wherein the guiding element is an elevator element configured to elevate a catheter.
- the disclosure relates to a method of training a machine learning data architecture for estimating a landing zone of a medical device extendable from a tip of an endoscope, comprising the steps of:
- the endoscope comprises a guiding element for guiding the medical device in a particular direction, wherein the guiding element is movable from a first position to a second position, and wherein the guiding element is arranged in different positions in the plurality of images.
- a part of the medical device can be seen in at least some of the plurality of images.
- the plurality of images includes images of different medical devices.
- the disclosure relates to a method of performing a medical procedure using an endoscope system as disclosed in relation to the third aspect of the disclosure, comprising the steps of:
- the different aspects of the present disclosure can be implemented in different ways including image processing devices, display units, endoscope systems, methods of training a machine learning data architecture, and methods of performing a medical procedure described above and in the following, each yielding one or more of the benefits and advantages described in connection with at least one of the aspects described above, and each having one or more preferred embodiments corresponding to the preferred embodiments described in connection with at least one of the aspects described above and/or disclosed in the dependent claims.
- embodiments described in connection with one of the aspects described herein may equally be applied to the other aspects.
- Fig. 1 illustrates an example of an endoscope 100.
- This endoscope may be adapted for single-use.
- the endoscope 100 is provided with a handle 102 attached to an insertion tube 104 provided with a bending section 106.
- the insertion tube 104 as well as the bending section 106 may be provided with one or several working channels such that instruments, such as a gripping device or a catheter, may extend from the tip and be inserted into a human body via the endoscope.
- One or several exit holes of the one or several channels may be provided in a tip part 108 of the endoscope 100.
- a camera sensor such as a CMOS sensor or any other image capturing device, as well as one or several light sources, such as light emitting diodes (LEDs), fiber, or any other light emitting devices, may be placed in the tip part 108.
- using a monitor 200, illustrated in Fig. 2, configured to display images based on image data captured by the camera sensor, an operator is able to see and analyze the inside of the human body in order to, for instance, localize a position for taking a sample.
- the operator will be able to control the instrument in a precise manner due to the provided visual feedback.
- since some diseases or health issues may result in a shift in natural colors or other visual symptoms, the operator is provided with valuable input for making a diagnosis based on the image data provided via the image capturing device and the monitor.
- the endoscope may further comprise a guiding element 121 for guiding the medical device in a particular direction (only schematically shown).
- the guiding element 121 may be movable from a first position to a second position.
- the guiding element may be controllable from the handle of the endoscope via an actuation element 120 (only schematically shown).
- the guiding element may be configured to guide the medical device in a particular direction relative to the tip of the endoscope. Thus, dependent on the position of the guiding element 121, the landing zone of the medical device will be arranged at different parts of the endoscope image.
- the endoscope may further optionally comprise a guiding element sensor 122.
- the guiding element sensor 122 may detect the position of the guiding element, which may then be used to estimate the landing zone of a medical device extendable from the tip of the endoscope.
- the endoscope does not comprise a guiding element sensor 122, and the landing zone of the medical device is estimated only using image data recorded by the image capturing device.
- the endoscope has a bending section 106 that can be bent in different directions with respect to the insertion tube 104.
- the bending section 106 may be controlled by the operator by using a knob 110 placed on the handle 102.
- the handle 102 illustrated in Fig. 1 is designed such that the knob 110 is controlled by a thumb of the operator, but other designs are also possible.
- a push button 112 may be used.
- the handle 102 illustrated in Fig. 1 is designed such that an index finger of the operator is used for controlling the gripping device, but other designs are also possible.
- Fig. 3 shows a schematic drawing of an endoscope system 301 according to an embodiment of the disclosure.
- the endoscope system 301 comprises an endoscope 302 and an image processing device 304.
- the endoscope 302 has a working channel allowing a medical instrument to extend from the tip of the endoscope 302.
- the image processing device comprises a processing unit 307.
- the endoscope 302 comprises one or more sensors including an image capturing device 303.
- the processing unit 307 of the image processing device 304 is operationally connectable to the image capturing device 303 of the endoscope 302.
- the processing unit 307 is configured to: obtain a stream of images captured by the image capturing device of the endoscope; process first sensor data recorded by the one or more sensors to estimate the landing zone of the medical device; and provide the stream of images with a visual indication, indicating the estimated landing zone.
- the image processing device 304 is integrated in a display unit 305. Consequently, the user may in real time be assisted when performing complex medical procedures such as catheterization of the major duodenal papilla.
- the landing zone may be the position the medical device will reach when extended a predetermined length from the endoscope.
- the position of the landing zone may depend on the type of medical instrument and/or the orientation of the medical instrument.
- the medical instrument e.g. a catheter
- the processing unit may also be configured to estimate the landing zone by analyzing one or more images of the stream of images, e.g. one or more images showing the type of medical instrument and/or the orientation of the medical instrument, e.g. after the medical instrument has propagated a short distance out of the working channel.
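A non-learning geometric sketch of this idea: given a binary mask of the instrument after it has advanced a short distance, fit its principal axis and extrapolate a fixed length beyond the visible tip. The segmentation, the tip-selection rule, and the fixed extension length are all assumptions made here for illustration:

```python
import numpy as np

def estimate_landing_zone(mask, extend=10.0):
    """Fit the principal axis of the visible instrument pixels and
    extrapolate 'extend' pixels beyond the extreme point along that axis,
    which we assume here to be the instrument tip."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)  # (x, y) coordinates
    centre = pts.mean(axis=0)
    cov = np.cov((pts - centre).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    direction = eigvecs[:, np.argmax(eigvals)]      # principal axis
    proj = (pts - centre) @ direction
    tip = pts[np.argmax(proj)]                      # extreme point on axis
    return tip + extend * direction

# a short horizontal instrument segment in a 40x40 image
mask = np.zeros((40, 40), dtype=bool)
mask[20, 5:16] = True
zone = estimate_landing_zone(mask, extend=10.0)
```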
- the one or more images of the stream of images may be processed using a machine learning data architecture trained to estimate the landing zone of the medical device.
- the machine learning data architecture may be trained by being provided with a training data set comprising a large number of images showing medical instruments extended a short distance from the tip of the endoscope, together with data on the resulting landing zones.
- the position of the landing zone may be dependent on the position of a guiding element configured to guide the medical device in a particular direction.
- a guiding element configured to guide the medical device in a particular direction.
- the guiding element is an elevator element configured to elevate a catheter, enabling the catheter to catheterize the major duodenal papilla.
- the one or more sensors of the endoscope may include a guiding element sensor configured to detect the position and / or orientation of the guiding element.
- the processing unit may also be configured to estimate the landing zone by analyzing one or more images of the stream of images, e.g. one or more images showing the medical instrument and/or the guiding element, e.g. after the medical instrument has propagated a short distance out of the working channel.
- the one or more images of the stream of images may be processed using a machine learning data architecture trained to estimate the landing zone of the medical device.
- the machine learning data architecture may be trained by being provided with a training data set comprising a large number of images showing medical instruments extended a short distance from the tip of the endoscope, where the guiding element is arranged in different positions and/or orientations, together with data on the resulting landing zones. If the position/orientation of the guiding element is visible to the image capturing device, then the images of the training data set need not include a medical instrument.
- Figs. 5a-d schematically show images captured by an image capturing device of an endoscope according to an embodiment of the disclosure.
- Fig. 5a shows a traditional unaltered image captured by an image capturing device of an endoscope.
- a medical device 501 e.g. a cutting tool or a catheter
- an area of interest 502 can be seen.
- the area of interest may be a polyp that needs to be removed using a cutting tool or an opening that needs to be catheterized.
- Fig. 5b shows an image captured by an image capturing device of an endoscope provided with a visual indication 503, indicating the estimated landing zone of the medical device 501, e.g. the position the medical device 501 will reach when extended a predetermined length from the endoscope.
- the visual indication 503 marks the center of the landing zone.
- if the endoscope comprises a guiding element configured to guide the medical device 501 in a particular direction, the user may move the guiding element so that the medical device is guided in a different direction.
- Fig. 5c shows an image captured by an image capturing device of an endoscope provided with a visual indication 503, indicating the estimated landing zone of the medical device 501 after a guiding element of the endoscope has been moved.
- the medical device 501 is now arranged at a different angle in the image, and the visual indication 503 has moved and is now positioned at the area of interest 502. The user can now safely extend the medical device 501 to the area of interest 502.
- Fig. 5d shows an image captured by an image capturing device of an endoscope provided with a visual indication 503 indicating the estimated landing zone of the medical device 501.
- the medical device 501 has been extended a distance and has reached the estimated landing zone indicated by the visual indication 503.
- by providing a visual indication 503 indicating the estimated landing zone, procedures may be conducted faster and more safely.
- Figs. 6a-b show images captured by an image capturing device of an endoscope according to an embodiment of the disclosure.
- the endoscope is a duodenoscope and the medical doctor is catheterizing the major duodenal papilla as part of an ERCP procedure.
- a catheter 601 and the major duodenal papilla 602 are shown.
- the catheter 601 extends a distance from the duodenoscope.
- the image has further been provided with visual indications 604, 605, 606 indicating an estimated trajectory and landing zone of the catheter 601.
- the image has further been provided with a visual indication 603 showing the location of the major duodenal papilla 602 in the image.
- the visual indications 604, 605, 606 indicating the estimated trajectory and landing zone of the catheter are not aligned with the major duodenal papilla 602. This may be signalled to the user by displaying the visual indication 603 with a first colour, e.g. a yellow colour.
- when the estimated trajectory and landing zone are aligned with the major duodenal papilla 602, the visual indication 603 may be displayed with a second colour, e.g. a green colour.
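The colour signalling can be sketched as a simple intersection test. Approximating the landing zone and the landmark as circles, and the specific BGR colour values, are assumptions made here for illustration:

```python
import numpy as np

def alignment_colour(zone_xy, zone_r, landmark_xy, landmark_r):
    """Return the indication colour (BGR): green when the estimated landing
    zone intersects the located landmark (e.g. the major duodenal papilla),
    yellow when it does not. Both regions are approximated as circles."""
    dist = float(np.hypot(zone_xy[0] - landmark_xy[0],
                          zone_xy[1] - landmark_xy[1]))
    if dist <= zone_r + landmark_r:
        return (0, 255, 0)    # green: aligned
    return (0, 255, 255)      # yellow: not yet aligned
```

Re-evaluating this test on every frame lets the indication switch colour as the user adjusts the guiding element.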
- Fig. 6b shows an image after the major duodenal papilla 602 has been catheterized with the catheter 601.
- Fig. 7 shows a flowchart of a method of training a machine learning data architecture for estimating a landing zone of a medical device extendable from a tip of an endoscope according to an embodiment of the disclosure.
- a training data set comprising a plurality of images captured by an image capturing device of an endoscope is provided. The images may show a part of the medical device.
- in step 702, for each image, the position of the resulting landing zone of the medical device is provided.
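As a toy stand-in for the training method of Fig. 7 (pairing each training image with its observed landing zone and fitting a model), the sketch below trains a linear least-squares regressor from flattened pixels to landing zone coordinates. A real implementation would use the machine learning data architecture described above, and the synthetic data is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def train_landing_zone_regressor(images, zones):
    """Pair each training image with its observed landing zone (x, y) and
    fit a linear least-squares regressor from flattened pixels to the
    coordinates."""
    X = images.reshape(len(images), -1).astype(float)
    X = np.hstack([X, np.ones((len(X), 1))])          # bias column
    W, *_ = np.linalg.lstsq(X, np.asarray(zones, dtype=float), rcond=None)
    return W

def predict_zone(W, image):
    x = np.append(image.reshape(-1).astype(float), 1.0)
    return x @ W

# synthetic data: the landing zone x-coordinate follows the bright column
images, zones = [], []
for _ in range(30):
    col = int(rng.integers(2, 14))
    img = np.zeros((16, 16))
    img[:, col] = 1.0
    images.append(img)
    zones.append((float(col), 8.0))
images = np.array(images)
W = train_landing_zone_regressor(images, zones)
pred = predict_zone(W, images[0])
```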
- Fig. 8 shows a flowchart of a method of performing a medical procedure using an endoscope system according to an embodiment of the disclosure.
- the endoscope is advanced through a body to a position near a point of interest.
- a medical device is extended from the endoscope to a treatment position based on a stream of visual images provided with a visual indication, indicating an estimated landing zone of the medical device.
- a medical procedure is performed at the treatment position, such as removing a polyp, catheterizing the major duodenal papilla, or cutting a larger opening in the major duodenal papilla.
Abstract
provide the stream of images with a visual indication, indicating the estimated landing zone.
Description
- The present disclosure relates to an image processing device for estimating a landing zone of a medical device extendable from a tip of an endoscope, a display unit comprising an image processing device, an endoscope system comprising an endoscope and an image processing device, and a method of performing a medical procedure.
- Endoscopes are widely used in hospitals for visually examining body cavities and obtaining samples of tissue identified as potentially pathological. An endoscope typically comprises an image capturing unit arranged at the distal end of the endoscope either looking forward or to the side. An endoscope is further typically provided with a working channel allowing a medical device such as a gripping device, a suction device, or a catheter to be introduced.
- It may, however, for a number of medical procedures be difficult to ensure that the medical device reaches its intended destination.
- Endoscopic retrograde cholangiopancreatography, ERCP, is an example of such a medical procedure. In ERCP the major duodenal papilla is catheterized using a catheter advanced from the tip of the duodenoscope. The duodenoscope is provided with a guiding element in the form of an elevator element for guiding the catheter in a particular direction. By controlling the elevator element and positioning the duodenoscope correctly, the catheter may be introduced into the major duodenal papilla. This may however be a challenging and time-consuming process.
- Thus, it remains a problem to provide an improved device/system for performing endoscopic procedures.
- According to a first aspect, the present disclosure relates to an image processing device for estimating a landing zone of a medical device extendable from a tip of an endoscope, the endoscope comprising one or more sensors including an image capturing device, the image processing device comprising a processing unit operationally connectable to the one or more sensors of the endoscope, wherein the processing unit is configured to:
- obtain a stream of images captured by the image capturing device of the endoscope;
- process first sensor data recorded by the one or more sensors to estimate the landing zone of the medical device; and
- provide the stream of images with a visual indication, indicating the estimated landing zone.
- Consequently, the operator of the endoscope may directly be provided with visual aid for assisting with performing a successful procedure.
- The image processing device may be built into a display unit or it may be a standalone device operationally connectable to both the one or more sensors of the endoscope and a display unit. The image capturing device may be the only sensor of the endoscope, i.e. the one or more sensors may consist of the image capturing device. The landing zone may be the position the medical device will reach when extended a predetermined length from the endoscope.
- The processing unit of the image processing device may be any processing unit, such as a central processing unit (CPU), a graphics processing unit (GPU), a microcontroller unit (MCU), a field-programmable gate array (FPGA), or any combination thereof. The processing unit may comprise one or more physical processors and/or may be a combination of a plurality of individual processing units.
- In some embodiments, the processing unit is further configured to obtain an identification of the type of medical instrument, and wherein the estimated landing zone is further dependent on the identification of the type of medical instrument.
- The processing unit may obtain the identification via user input and/or automatically, e.g. by processing sensor data from the one or more sensors.
- The visual indication may be a visual element overlaid on the stream of images.
- The visual element may comprise parts that are either fully or partly transparent. Thus, the visual content of the stream of images may be covered by the visual element only to a small degree.
- The visual element may be arranged at the estimated landing zone. Alternatively, the visual element may be arranged next to the estimated landing zone. Arranging the visual element next to the estimated landing zone may be beneficial as the landing zone typically is at a point of interest that optimally should be freely visible to the user.
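As a minimal illustration of such a partly transparent overlay, the sketch below alpha-blends a visual element onto a frame with NumPy. The function name and the alpha value are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def blend_overlay(frame, element, alpha=0.4):
    """Alpha-blend a visual element onto a frame so that the underlying
    image content remains partly visible through the element."""
    mask = element.any(axis=-1, keepdims=True)            # element footprint
    blended = frame.astype(float) * (1 - alpha) + element.astype(float) * alpha
    return np.where(mask, blended, frame).astype(frame.dtype)
```

Pixels outside the element's footprint are returned unchanged, so the overlay never obscures the rest of the endoscope image.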
- In some embodiments, the processing unit is further configured to estimate at least a part of the trajectory of the medical device by processing the first sensor data, and wherein the visual indication further indicates the estimated trajectory.
- By further estimating and indicating the trajectory of the medical device, procedures requiring a precise orientation between the medical device and the point of interest may be performed more easily and safely. ERCP is an example of such a medical procedure. In ERCP, the major duodenal papilla is catheterized using a catheter advanced from the tip of a duodenoscope, where the orientation between the catheter and the major duodenal papilla is crucial for the success of the procedure. Removal of polyps may also require a precise orientation between the polyp and a cutting tool.
- In some embodiments, the processing unit is further configured to:
- obtain an identification of an anatomic landmark in the stream of images;
- determine the location of the anatomic landmark in the stream of images;
- determine if the estimated landing zone intersects with the determined location of the anatomic landmark.
- Consequently, a user may be provided with further assistance.
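One possible intersection test is sketched below, assuming the landing zone is represented as a circle and the landmark location as an axis-aligned bounding box; both representations are illustrative assumptions, as the disclosure does not prescribe a geometry.

```python
def intersects(zone_centre, zone_radius, landmark_bbox):
    """True if the circular landing zone overlaps the landmark's
    axis-aligned bounding box (x_min, y_min, x_max, y_max)."""
    cx, cy = zone_centre
    x0, y0, x1, y1 = landmark_bbox
    # Clamp the circle centre to the box to find the nearest box point.
    nx = min(max(cx, x0), x1)
    ny = min(max(cy, y0), y1)
    return (cx - nx) ** 2 + (cy - ny) ** 2 <= zone_radius ** 2
```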
- In some embodiments, the processing unit is further configured to:
obtain the identification of the anatomic landmark by processing one or more images of the stream of images. - Consequently, the system may be at least partly automated.
- In some embodiments, the processing unit is configured to obtain the identification of the anatomic landmark by processing the stream of images using a machine learning data architecture trained to identify the anatomic landmark in endoscope images.
- The machine learning data architecture may be a supervised machine learning architecture, trained by being provided with a training data set of images from a large number of patients, where a first subset of images of the training data set includes the anatomic landmark and a second subset of images does not include the anatomic landmark.
- In some embodiments, the machine learning data architecture is an artificial neural network such as a deep structured learning architecture.
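By way of illustration only, the supervised setup described above can be sketched as follows. A deployed system would use a deep architecture trained on endoscope images; the logistic-regression trainer below is a deliberately simplified stand-in showing the two-subset (landmark / no landmark) training principle, and all names are assumptions.

```python
import numpy as np

def _features(images):
    X = images.reshape(len(images), -1).astype(float)
    return (X - X.mean()) / (X.std() + 1e-8)     # simple global normalisation

def train_landmark_classifier(images, labels, epochs=200, lr=0.1):
    """Gradient-descent logistic regression: a simplified stand-in for a
    deep architecture, trained on a first subset containing the landmark
    (label 1) and a second subset without it (label 0)."""
    X, y = _features(images), np.asarray(labels, dtype=float)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted landmark probability
        grad = p - y                             # cross-entropy gradient
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def predict(images, w, b):
    X = _features(images)
    return 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
```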
- Additionally / alternatively, in some embodiments, the processing unit is further configured to:
obtain the identification of the anatomic landmark through a user input. - As an example, the processing unit may be operationally connected to a touchscreen displaying the stream of images live to the user, where the user may manually identify the anatomic landmark, e.g. by clicking on it on the touchscreen. The processing unit may then use motion tracking techniques in order to keep track of the anatomic landmark as the endoscope moves around.
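A minimal sketch of such click-initialised tracking is given below. The sum-of-squared-differences search is a simple stand-in for the motion tracking techniques mentioned above (a real system might use an off-the-shelf tracker), and all function names are illustrative.

```python
import numpy as np

def init_tracker(frame, click_xy, half=8):
    """Cut a template patch around the user's click on the touchscreen."""
    x, y = click_xy
    return frame[y - half:y + half, x - half:x + half].astype(float).copy()

def track(frame, template, prev_xy, search=10):
    """Re-locate the landmark by sum-of-squared-differences search around
    its previous position -- a simple stand-in for a full motion tracker."""
    px, py = prev_xy
    half = template.shape[0] // 2
    best_score, best_xy = None, prev_xy
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = px + dx, py + dy
            if y - half < 0 or x - half < 0:          # stay inside the frame
                continue
            patch = frame[y - half:y + half, x - half:x + half]
            if patch.shape != template.shape:
                continue
            score = float(np.sum((patch.astype(float) - template) ** 2))
            if best_score is None or score < best_score:
                best_score, best_xy = score, (x, y)
    return best_xy
```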
- In some embodiments, the processing unit is further configured to:
provide the stream of images with a visual indication, indicating that the estimated landing zone intersects with the determined location of the anatomic landmark. - Consequently, the user may be provided with further assistance when performing a medical procedure. This may make the medical procedure easier and safer.
- In some embodiments, the first sensor data is one or more images of the stream of images.
- Consequently, by utilizing one or more images from the stream of images for estimating the landing zone, the endoscope does not need to be provided with further sensors.
- In some embodiments, the one or more images of the stream of images are processed using a machine learning data architecture trained to estimate the landing zone of the medical device.
- The machine learning data architecture may be a supervised machine learning architecture.
- In some embodiments, the machine learning data architecture is an artificial neural network such as a deep structured learning architecture.
- In some embodiments, the endoscope further comprises a guiding element configured to guide the medical device in a particular direction, and wherein the one or more sensors further comprises a guiding element sensor for detecting the position and / or orientation of the guiding element, and wherein the first sensor data is recorded by the guiding element sensor.
- In some embodiments, the endoscope is a duodenoscope and the guiding element is an elevator element configured to elevate a catheter.
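As an illustrative sketch of using a guiding-element sensor as the first sensor data, an elevator-angle reading could be mapped to an image-space landing zone through a calibration table. The angles and coordinates below are invented for illustration only; a real device would use its own calibration.

```python
import numpy as np

# Hypothetical calibration table: elevator angle (degrees) versus the
# image-space landing-zone coordinates observed at that angle.
CAL_ANGLES = np.array([0.0, 15.0, 30.0, 45.0])
CAL_X = np.array([160.0, 150.0, 138.0, 124.0])
CAL_Y = np.array([200.0, 170.0, 138.0, 104.0])

def landing_zone_from_elevator(angle):
    """Interpolate the landing zone from a guiding-element sensor reading."""
    return (float(np.interp(angle, CAL_ANGLES, CAL_X)),
            float(np.interp(angle, CAL_ANGLES, CAL_Y)))
```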
- According to a second aspect the disclosure relates to a display unit for displaying images obtained by an image capturing device of an endoscope, wherein the display unit comprises an image processing device as disclosed in relation to the first aspect of the disclosure.
- According to a third aspect the disclosure relates to an endoscope system comprising an endoscope and an image processing device as disclosed in relation to the first aspect of the invention, wherein the endoscope has an image capturing device and the processing unit of the image processing device is operationally connectable to the image capturing device of the endoscope.
- The endoscope may comprise a working channel for allowing a medical device to extend from its tip. The endoscope may further comprise a guiding element for guiding the medical device in a particular direction. The guiding element may be movable from a first position to a second position. The guiding element may be provided with or without a guiding element sensor. The guiding element may be controllable from the handle of the endoscope. The guiding element may be configured to guide the medical device in a particular direction relative to the tip of the endoscope. Thus, dependent on the position of the guiding element, the landing zone of the medical device will be arranged at different parts of the endoscope image.
- The endoscope may be a duodenoscope, in which case the guiding element may be an elevator element configured to elevate a catheter.
- According to a fourth aspect the disclosure relates to a method of training a machine learning data architecture for estimating a landing zone of a medical device extendable from a tip of an endoscope, comprising the steps of:
- providing a training data set comprising a plurality of images captured by an image capturing device of an endoscope,
- providing for each image the position of the resulting landing zone of the medical device.
- In some embodiments, the endoscope comprises a guiding element for guiding the medical device in a particular direction, wherein the guiding element is movable from a first position to a second position, and wherein the guiding element is arranged in different positions in the plurality of images.
- In some embodiments, a part of the medical device can be seen in at least some of the plurality of images.
- In some embodiments, the plurality of images includes images of different medical devices.
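The training method of the fourth aspect can be sketched as follows. The linear least-squares regressor is a deliberately simplified stand-in for the machine learning data architecture, showing only the mapping from training images to their provided landing-zone positions; all names are assumptions.

```python
import numpy as np

def train_landing_zone_regressor(images, zones):
    """Fit a least-squares linear map from flattened images to (x, y)
    landing-zone positions -- a simplified stand-in for the machine
    learning data architecture of the disclosure."""
    X = images.reshape(len(images), -1).astype(float)
    X = np.hstack([X, np.ones((len(X), 1))])       # bias column
    W, *_ = np.linalg.lstsq(X, np.asarray(zones, dtype=float), rcond=None)
    return W

def predict_zone(image, W):
    """Predict the (x, y) landing zone for a single image."""
    x = np.append(image.reshape(-1).astype(float), 1.0)
    return x @ W
```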
- According to a fifth aspect the disclosure relates to a method of performing a medical procedure using an endoscope system as disclosed in relation to the third aspect of the disclosure, comprising the steps of:
- advancing the endoscope through a body to a position near a point of interest,
- extending a medical device from the endoscope to a treatment position based on a stream of visual images provided with a visual indication, indicating an estimated landing zone of the medical device,
- performing a medical procedure at the treatment position, such as removing a polyp, catheterizing the major duodenal papilla or cutting a larger opening in the major duodenal papilla.
- The different aspects of the present disclosure can be implemented in different ways including image processing devices, display units, endoscope systems, methods of training a machine learning data architecture, and methods of performing a medical procedure described above and in the following, each yielding one or more of the benefits and advantages described in connection with at least one of the aspects described above, and each having one or more preferred embodiments corresponding to the preferred embodiments described in connection with at least one of the aspects described above and/or disclosed in the dependent claims. Furthermore, it will be appreciated that embodiments described in connection with one of the aspects described herein may equally be applied to the other aspects.
- The above and/or additional objects, features and advantages of the present disclosure, will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present disclosure, with reference to the appended drawings, wherein:
- Fig. 1 shows an example of an endoscope.
- Fig. 2 shows an example of a display unit that can be connected to the endoscope shown in Fig. 1.
- Fig. 3 shows a schematic drawing of an endoscope system according to an embodiment of the disclosure.
- Fig. 4 shows a schematic drawing of an endoscope system according to an embodiment of the disclosure.
- Figs 5a-d schematically show images captured by an image capturing device of an endoscope according to an embodiment of the disclosure.
- Figs 6a-b show images captured by an image capturing device of an endoscope according to an embodiment of the disclosure.
- Fig. 7 shows a flowchart of a method of training a machine learning data architecture for estimating a landing zone of a medical device extendable from a tip of an endoscope according to an embodiment of the disclosure.
- Fig. 8 shows a flowchart of a method of performing a medical procedure using an endoscope system according to an embodiment of the disclosure.

Detailed description

- In the following description, reference is made to the accompanying figures, which show by way of illustration how the embodiments of the present disclosure may be practiced.
-
Fig. 1 illustrates an example of an endoscope 100. This endoscope may be adapted for single-use. The endoscope 100 is provided with a handle 102 attached to an insertion tube 104 provided with a bending section 106. The insertion tube 104 as well as the bending section 106 may be provided with one or several working channels such that instruments, such as a gripping device or a catheter, may extend from the tip and be inserted into a human body via the endoscope. One or several exit holes of the one or several channels may be provided in a tip part 108 of the endoscope 100. In addition to the exit holes, a camera sensor, such as a CMOS sensor or any other image capturing device, as well as one or several light sources, such as light emitting diodes (LEDs), fiber, or any other light emitting devices, may be placed in the tip part 108. By having the camera sensor, the light sources, and a monitor 200, illustrated in Fig. 2, configured to display images based on image data captured by the camera sensor, an operator is able to see and analyze the inside of the human body in order to, for instance, localize a position for taking a sample. In addition, the operator will be able to control the instrument in a precise manner due to the provided visual feedback. Further, since some diseases or health issues may result in a shift in natural colors or other visual symptoms, the operator is provided with valuable input for making a diagnosis based on the image data provided via the image capturing device and the monitor.
- The endoscope may further comprise a guiding element 121 for guiding the medical device in a particular direction (only schematically shown). The guiding element 121 may be movable from a first position to a second position. The guiding element may be controllable from the handle of the endoscope via an actuation element 120 (only schematically shown). The guiding element may be configured to guide the medical device in a particular direction relative to the tip of the endoscope. Thus, dependent on the position of the guiding element 121, the landing zone of the medical device will be arranged at different parts of the endoscope image.
- The endoscope may further optionally comprise a guiding element sensor 122. The guiding element sensor 122 may detect the position of the guiding element, which may then be used to estimate the landing zone of a medical device extendable from the tip of the endoscope. However, in some embodiments the endoscope does not comprise a guiding element sensor 122, and the landing zone of the medical device is estimated only using image data recorded by the image capturing device.
- In order to make it possible for the operator to direct the camera sensor such that different fields of view can be achieved, the endoscope has a bending section 106 that can be bent in different directions with respect to the insertion tube 104. The bending section 106 may be controlled by the operator using a knob 110 placed on the handle 102. The handle 102 illustrated in Fig. 1 is designed such that the knob 110 is controlled by a thumb of the operator, but other designs are also possible. In order to control a gripping device or other device provided via a working channel, a push button 112 may be used. The handle 102 illustrated in Fig. 1 is designed such that an index finger of the operator is used for controlling the gripping device, but other designs are also possible.
-
Fig. 3 shows a schematic drawing of an endoscope system 301 according to an embodiment of the disclosure. The endoscope system 301 comprises an endoscope 302 and an image processing device 304. The endoscope 302 has a working channel allowing a medical instrument to extend from the tip of the endoscope 302. The image processing device comprises a processing unit 307. The endoscope 302 comprises one or more sensors including an image capturing device 303. The processing unit 307 of the image processing device 304 is operationally connectable to the image capturing device 303 of the endoscope 302. The processing unit 307 is configured to: obtain a stream of images captured by the image capturing device of the endoscope; process first sensor data recorded by the one or more sensors to estimate the landing zone of the medical device; and provide the stream of images with a visual indication, indicating the estimated landing zone. In this embodiment, the image processing device 304 is integrated in a display unit 305. Consequently, the user may be assisted in real time when performing complex medical procedures such as catheterization of the major duodenal papilla. The landing zone may be the position the medical device will reach when extended a predetermined length from the endoscope.
- The position of the landing zone may depend on the type of medical instrument and/or the orientation of the medical instrument. As an example, if the medical instrument, e.g. a catheter, is provided with a tip having a pre-bend, then the landing zone will depend on the orientation of the medical instrument within the working channel. The one or more sensors of the endoscope may include a sensor configured to detect the type of medical instrument in the working channel and/or the orientation of the medical instrument. However, the processing unit may also be configured to estimate the landing zone by analyzing one or more images of the stream of images, e.g. one or more images showing the type of medical instrument and/or the orientation of the medical instrument, e.g. after the medical instrument has propagated a short distance out of the working channel.
- The one or more images of the stream of images may be processed using a machine learning data architecture trained to estimate the landing zone of the medical device. The machine learning data architecture may be trained by being provided with a training data set comprising a large number of images showing medical instruments extended a short distance from the tip of the endoscope, together with data of the resulting landing zones.
- Additionally/alternatively, the position of the landing zone may depend on the position of a guiding element configured to guide the medical device in a particular direction. As an example, if the endoscope is a duodenoscope, the guiding element may be an elevator element configured to elevate a catheter, enabling the catheter to catheterize the major duodenal papilla. The one or more sensors of the endoscope may include a guiding element sensor configured to detect the position and/or orientation of the guiding element.
- However, the processing unit may also be configured to estimate the landing zone by analyzing one or more images of the stream of images, e.g. one or more images showing the medical instrument and/or the guiding element, e.g. after the medical instrument has propagated a short distance out of the working channel.
- The one or more images of the stream of images may be processed using a machine learning data architecture trained to estimate the landing zone of the medical device. The machine learning data architecture may be trained by being provided with a training data set comprising a large number of images showing medical instruments extended a short distance from the tip of the endoscope, where the guiding element is positioned and/or oriented differently across the images, together with data of the resulting landing zones. If the position/orientation of the guiding element is visible to the image capturing device, the images of the training data set may not need to include a medical instrument.
-
Figs 5a-d schematically show images captured by an image capturing device of an endoscope according to an embodiment of the disclosure. Fig. 5a shows a traditional, unaltered image captured by an image capturing device of an endoscope. In the image, a medical device 501, e.g. a cutting tool or a catheter, and an area of interest 502 can be seen. The area of interest may be a polyp that needs to be removed using a cutting tool or an opening that needs to be catheterized.
- Fig. 5b shows an image captured by an image capturing device of an endoscope provided with a visual indication 503, indicating the estimated landing zone of the medical device 501, e.g. the position the medical device 501 will reach when extended a predetermined length from the endoscope. In this embodiment the visual indication 503 marks the center of the landing zone. Thus, since the visual indication 503 is not positioned at the area of interest 502, the user will have to re-arrange the endoscope in order to ensure that the landing zone will be positioned at the area of interest 502. This may be done by moving the entire endoscope. Additionally/alternatively, if the endoscope comprises a guiding element configured to guide the medical device 501 in a particular direction, the user may move the guiding element so that the medical device is guided in a different direction.
- Fig. 5c shows an image captured by an image capturing device of an endoscope provided with a visual indication 503, indicating the estimated landing zone of the medical device 501 after a guiding element of the endoscope has been moved. As a result, the medical device 501 is now arranged at a different angle in the image, and the visual indication 503 has moved and is now positioned at the area of interest 502. The user can now safely extend the medical device 501 to the area of interest 502.
- Fig. 5d shows an image captured by an image capturing device of an endoscope provided with a visual indication 503 indicating the estimated landing zone of the medical device 501. In this image the medical device 501 has been extended a distance and has reached the estimated landing zone 503. By providing the images with a visual indication 503 indicating the estimated landing zone, procedures may be conducted faster and more safely.
-
Figs 6a-b show images captured by an image capturing device of an endoscope according to an embodiment of the disclosure. In this embodiment the endoscope is a duodenoscope and the medical doctor is catheterizing the major duodenal papilla as part of an ERCP procedure.
- In Fig. 6a, a catheter 601 and the major duodenal papilla 602 are shown. The catheter 601 extends a distance from the duodenoscope. The image has further been provided with a visual indication 604, 605, 606 indicating an estimated trajectory and landing zone of the catheter 601. The image has further been provided with a visual indication 603 showing the location of the major duodenal papilla 602 in the image. In Fig. 6a, the visual indication 604, 605, 606 indicating the estimated trajectory and landing zone of the catheter is not aligned with the major duodenal papilla 602. This may be signalled to the user by displaying the visual indication 603 with a first colour, e.g. a yellow colour. If the user then moves the duodenoscope and/or an elevator element of the duodenoscope so that the estimated trajectory and landing zone of the catheter is aligned with the major duodenal papilla 602, the visual indication 603 may be displayed with a second colour, e.g. a green colour.
-
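The colour signalling described above can be sketched in a few lines. The specific colours, the BGR convention, and the bounding-box alignment test are illustrative assumptions, not part of the disclosure.

```python
def indication_colour(zone_xy, landmark_bbox):
    """Return the overlay colour (BGR): yellow while the estimated landing
    zone misses the landmark, green once it falls inside the landmark's
    bounding box.  Colours and the box test are illustrative choices."""
    x, y = zone_xy
    x0, y0, x1, y1 = landmark_bbox
    aligned = x0 <= x <= x1 and y0 <= y <= y1
    return (0, 255, 0) if aligned else (0, 255, 255)
```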
Fig. 6b shows an image after the major duodenal papilla 602 has been catheterized with the catheter 601.
- Fig. 7 shows a flowchart of a method of training a machine learning data architecture for estimating a landing zone of a medical device extendable from a tip of an endoscope according to an embodiment of the disclosure. In step 701, a training data set comprising a plurality of images captured by an image capturing device of an endoscope is provided. The images may show a part of the medical device. Next, in step 702, for each image the position of the resulting landing zone of the medical device is provided.
- Fig. 8 shows a flowchart of a method of performing a medical procedure using an endoscope system according to an embodiment of the disclosure. In step 801, the endoscope is advanced through a body to a position near a point of interest. Then, in step 802, a medical device is extended from the endoscope to a treatment position based on a stream of visual images provided with a visual indication, indicating an estimated landing zone of the medical device. Finally, in step 803, a medical procedure is performed at the treatment position, such as removing a polyp, catheterizing the major duodenal papilla or cutting a larger opening in the major duodenal papilla.
- Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilised and structural and functional modifications may be made without departing from the scope of the present invention.
- In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
- It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
Claims (16)
- An image processing device for estimating a landing zone of a medical device extendable from a tip of an endoscope, the endoscope comprising one or more sensors including an image capturing device, the image processing device comprising a processing unit operationally connectable to the one or more sensors of the endoscope, wherein the processing unit is configured to: obtain a stream of images captured by the image capturing device of the endoscope; process first sensor data recorded by the one or more sensors to estimate the landing zone of the medical device; and provide the stream of images with a visual indication, indicating the estimated landing zone.
- An image processing device according to claim 1, wherein the processing unit is further configured to estimate a part of the trajectory by processing the first sensor data, and wherein the visual indication further indicates the estimated trajectory.
- An image processing device according to claim 1 or 2, wherein the processing unit is further configured to: obtain an identification of an anatomic landmark in the stream of images; determine the location of the anatomic landmark in the stream of images; and determine if the estimated landing zone intersects with the determined location of the anatomic landmark.
- An image processing device according to claim 3, wherein the processing unit is further configured to:
obtain the identification of the anatomic landmark by processing one or more images of the stream of images. - An image processing device according to claim 4, wherein the processing unit is configured to obtain the identification of the anatomic landmark by processing the stream of images using a machine learning data architecture trained to identify the anatomic landmark in endoscope images.
- An image processing device according to claim 3, wherein the processing unit is further configured to:
obtain the identification of the anatomic landmark through a user input. - An image processing device according to any one of claims 3 to 6, wherein the processing unit is further configured to:
provide the stream of images with a visual indication, indicating that the estimated landing zone intersects with the determined location of the anatomic landmark. - An image processing device according to any one of claims 1 to 7, wherein the first sensor data is one or more images of the stream of images.
- An image processing device according to claim 8, wherein the one or more images of the stream of images are processed using a machine learning data architecture trained to estimate the landing zone of the medical device.
- An image processing device according to any one of claims 1 to 9, wherein the endoscope further comprises a guiding element configured to guide the medical device in a particular direction, and wherein the one or more sensors further comprises a guiding element sensor for detecting the position and / or orientation of the guiding element, and wherein the first sensor data is recorded by the guiding element sensor.
- A display unit for displaying images obtained by an image capturing device of an endoscope, wherein the display unit comprises an image processing device according to any one of claims 1 to 10.
- An endoscope system comprising an endoscope and an image processing device according to any one of claims 1 to 10, wherein the endoscope has an image capturing device and the processing unit of the image processing device is operationally connectable to the image capturing device of the endoscope.
- An endoscope system according to claim 12, wherein the endoscope is a duodenoscope and wherein the guiding element is an elevator element configured to elevate a catheter.
- A method of training a machine learning data architecture for estimating a landing zone of a medical device extendable from a tip of an endoscope, comprising the steps of: providing a training data set comprising a plurality of images captured by an image capturing device of an endoscope, and providing for each image the position of the resulting landing zone of the medical device.
- A method according to claim 14, wherein the endoscope comprises a guiding element for guiding the medical device in a particular direction, wherein the guiding element is movable from a first position to a second position, and wherein the guiding element is arranged in different positions in the plurality of images.
- A method of performing a medical procedure using an endoscope system according to claim 12 or 13, comprising the steps of: advancing the endoscope through a body to a position near a point of interest, extending a medical device from the endoscope to a treatment position based on a stream of visual images provided with a visual indication, indicating an estimated landing zone of the medical device, and performing a medical procedure at the treatment position, such as removing a polyp, catheterizing the major duodenal papilla or cutting a larger opening in the major duodenal papilla.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20184995.7A EP3937126A1 (en) | 2020-07-09 | 2020-07-09 | Endoscope image processing device |
CN202180047864.7A CN115803776A (en) | 2020-07-09 | 2021-07-06 | Endoscope image processing apparatus |
US18/014,471 US20230255461A1 (en) | 2020-07-09 | 2021-07-06 | Endoscope image processing device |
PCT/EP2021/068628 WO2022008497A1 (en) | 2020-07-09 | 2021-07-06 | Endoscope image processing device |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3937126A1 true EP3937126A1 (en) | 2022-01-12 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160309992A1 (en) * | 2015-04-27 | 2016-10-27 | Endochoice, Inc. | Endoscope with Integrated Measurement of Distance to Objects of Interest |
US20180296281A1 (en) * | 2017-04-12 | 2018-10-18 | Bio-Medical Engineering (HK) Limited | Automated steering systems and methods for a robotic endoscope |
WO2018195216A1 (en) * | 2017-04-18 | 2018-10-25 | Intuitive Surgical Operations, Inc. | Graphical user interface for monitoring an image-guided procedure |
US20200054400A1 (en) * | 2017-04-26 | 2020-02-20 | Olympus Corporation | Image processing apparatus, operating method of image processing apparatus, and computer-readable recording medium |
Also Published As
Publication number | Publication date |
---|---|
CN115803776A (en) | 2023-03-14 |
US20230255461A1 (en) | 2023-08-17 |
WO2022008497A1 (en) | 2022-01-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102598706B1 (en) | | Flexible biopsy needle system |
US6190330B1 (en) | | Endoscopic location and vacuum assembly and method |
US20070015967A1 (en) | | Autosteering vision endoscope |
US7931588B2 (en) | | System for assessment of colonoscope manipulation |
JP6749020B2 (en) | | Endoscope navigation device |
KR20170043623A (en) | | Systems and methods for display of pathological data in an image guided procedure |
EP3136943A1 (en) | | System and method of scanning a body cavity using a multiple viewing elements endoscope |
CN106794012B (en) | | System and method for dynamic trajectory control |
KR20140033128A (en) | | Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery |
EP3876186A1 (en) | | Method for enhancing the visibility of blood vessels in color images and visualization systems implementing the method |
JP7292376B2 (en) | | Control device, trained model, and method of operation of endoscope movement support system |
JP2009207793A (en) | | Endoscope system |
EP3937126A1 (en) | | Endoscope image processing device |
JP7323647B2 (en) | | Endoscopy support device, operating method and program for endoscopy support device |
CN111093463A (en) | | Endoscope with a detachable handle |
US20190231167A1 (en) | | System and method for guiding and tracking a region of interest using an endoscope |
JP7300514B2 (en) | | Endoscope insertion control device, endoscope operation method and endoscope insertion control program |
EP3653122A1 (en) | | Catheter with irrigator and/or aspirator and with fiber-optic brain-clot analyzer |
US20230122179A1 (en) | | Procedure guidance for safety |
KR102236146B1 (en) | | Endoscope cap and Endoscope including the same |
EP4190270A1 (en) | | Endoscope image processing device |
CN115668388A (en) | | Estimating endoscope position in a model of a human airway |
EP4190271A1 (en) | | Endoscope image processing device |
EP4191598A1 (en) | | Endoscope image processing device |
US20150320407A1 (en) | | Sampling apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
| AK | Designated contracting states | Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| B565 | Issuance of search results under rule 164(2) epc | Effective date: 20201222 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20220706 |
| RBV | Designated contracting states (corrected) | Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |