WO2021061336A1 - System, device and method for turbidity analysis - Google Patents

System, device and method for turbidity analysis

Info

Publication number
WO2021061336A1
WO2021061336A1 (PCT/US2020/048229)
Authority
WO
WIPO (PCT)
Prior art keywords
image
fluid delivery
endoscopic system
metrics
metric
Application number
PCT/US2020/048229
Other languages
French (fr)
Inventor
Niraj Prasad RAUNIYAR
Robert J. Riker
Timothy Paul HARRAH
Original Assignee
Boston Scientific Scimed, Inc.
Application filed by Boston Scientific Scimed, Inc. filed Critical Boston Scientific Scimed, Inc.
Priority to KR1020227009551A, published as KR20220054340A
Priority to CN202080066581.2A, published as CN114531846A
Priority to AU2020354896A, published as AU2020354896B2
Priority to EP20768788.0A, published as EP4033958A1
Priority to CA3147729A, published as CA3147729A1
Priority to JP2022515015A, published as JP7309050B2
Publication of WO2021061336A1
Priority to JP2023109724A, published as JP2023126283A
Priority to AU2023241346A, published as AU2023241346A1

Classifications

    • A61B 1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/000094: Electronic signal processing of image signals during a use of endoscope, extracting biological structures
    • A61B 1/000096: Electronic signal processing of image signals during a use of endoscope, using artificial intelligence
    • A61B 1/05: Instruments for performing medical examinations of the interior of cavities or tubes of the body, combined with photographic or television appliances, characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B 1/12: Instruments for performing medical examinations of the interior of cavities or tubes of the body, with cooling or rinsing arrangements
    • A61B 1/307: Instruments for performing medical examinations of the interior of cavities or tubes of the body, for the urinary organs, e.g. urethroscopes, cystoscopes
    • A61M 3/0204: Physical characteristics of the irrigation fluid, e.g. conductivity or turbidity
    • A61M 2205/331: Optical measuring means used as turbidity change detectors, e.g. for priming-blood or plasma-hemoglubine-interface detection
    • A61M 2205/3334: Measuring or controlling the flow rate
    • G06N 20/00: Machine learning
    • G06T 3/4053: Super resolution, i.e. output image resolution higher than sensor resolution
    • G06T 5/00: Image enhancement or restoration
    • G06T 7/0012: Biomedical image inspection
    • G06T 7/0016: Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T 2207/10024: Color image
    • G06T 2207/10068: Endoscopic image
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/30084: Kidney; Renal
    • G06T 2207/30168: Image quality inspection
    • G06V 20/20: Scene-specific elements in augmented reality scenes
    • G06V 40/10: Human or animal bodies; body parts, e.g. hands
    • G06V 2201/03: Recognition of patterns in medical or anatomical images

Definitions

  • the present disclosure relates to a system, a device and a method for performing an endoscopic procedure and, in particular, a turbidity analysis of an endoscopic imaging environment.
  • An endoscopic imager may be used during a variety of medical interventions.
  • the view of the anatomy provided by the imager is limited when the imaging environment is cloudy, or turbid.
  • Turbidity may be caused by blood, urine or other particles.
  • a turbid imaging environment may be managed by a fluid management system that circulates fluid in the imaged cavity.
  • the present disclosure relates to an endoscopic system which includes an endoscopic imager configured to capture image frames of a target site within a living body; and a processor.
  • the processor is configured to: determine one or more image metrics for each one of a plurality of image frames captured over a time span; analyze changes in the image metrics over the time span; and determine a turbidity metric for the target site based on the analyzed changes in the image metrics.
  • the image metrics are image entropy metrics including a red entropy metric and a cyan entropy metric.
  • the processor is further configured to: estimate a blood content in a current image frame; and alter the current image frame to mitigate a visual effect of the blood content and enhance a remainder of the current image frame.
  • the processor is further configured to: identify and classify a particle in a current image frame.
  • the endoscopic system further includes a display configured to annotate the current image frame with the turbidity metric.
  • the display is further configured to: bracket the identified particle; and annotate the current image frame with the particle classification.
  • the identified particle is a kidney stone and the classification relates to a size of the kidney stone.
  • the present disclosure also relates to an endoscopic system which includes an endoscopic imager configured to capture image frames of a target site within a living body; a fluid delivery mechanism providing irrigation fluid to the target site for clarifying a field of view of the endoscopic imager; and a processor configured to: determine a turbidity metric for at least one image from the imager; determine a fluid delivery adjustment for the irrigation fluid based on the turbidity metric; and control the fluid delivery mechanism to adjust a fluid delivery provided by the fluid delivery mechanism based on the determined fluid delivery adjustment.
  • the processor is further configured to: determine a type of interventional activity based on a feature detection identifying an intervention instrument.
  • the processor is further configured to: determine a phase of the interventional activity; and adjust the fluid delivery provided by the fluid delivery mechanism based on the phase of the interventional activity.
  • the processor is further configured to: identify and classify a particle in a current image frame; and adjust the fluid delivery provided by the fluid delivery mechanism based on the particle classification.
  • the particle is a kidney stone and the particle classification relates to a size of the kidney stone.
  • the processor is further configured to: determine a blood metric for the at least one image; and adjust the fluid delivery provided by the fluid delivery mechanism based on the blood metric.
  • the processor is further configured to: determine image entropy metrics for the at least one image.
  • the turbidity metric and the blood metric are determined based in part on the image entropy metrics.
  • the present disclosure relates to a method which includes determining one or more image metrics for each one of a plurality of image frames of a target site within a living body captured over a time span; analyzing changes in the image metrics over the time span; and determining a turbidity metric for the target site based on the analyzed changes in the image metrics.
  • the image metrics are image entropy metrics including a red entropy metric and a cyan entropy metric.
  • the method further includes estimating a blood content in a current image frame; and altering the current image frame to mitigate a visual effect of the blood content and enhance a remainder of the current image frame.
  • the method further includes identifying and classifying a particle in a current image frame.
  • the method further includes annotating the current image frame with the turbidity metric.
  • FIG. 1 shows a system for performing an endoscopic procedure according to various exemplary embodiments of the present disclosure.
  • FIG. 2 shows a method for managing irrigation fluid flow in a closed-loop feedback system according to various exemplary embodiments of the present invention.
  • FIG. 3 shows a method for enhancing visibility through a liquid medium clouded by blood according to a first exemplary embodiment.
  • FIG. 4 shows a method for enhancing visibility through a liquid medium clouded by blood according to a second exemplary embodiment.
  • FIG. 5 shows a flowchart describing a combined embodiment of the methods of Figs. 2-4.
  • Typical urological procedures utilize an imaging device (e.g., a ureteroscope or other endoscopic imager), a mechanism to provide fluid (for clearing the field of view and/or distending the body cavity) and a treatment mechanism (e.g., the Boston Scientific LithoVue™ device, and/or a source of laser or RF energy, etc.).
  • the improvements described herein include, e.g., methods for analyzing endoscopic images, determining turbidity, and providing the information to a fluid management system to manage clarity in the field of view.
  • Other improvements include algorithmically subtracting blood features from an endoscopic image and amplifying blood-obscured image features, particularly for urological procedures.
  • Some common urological procedures include kidney stone management (e.g., lithotripsy), BPH (i.e., benign prostatic hyperplasia) procedures (e.g., GreenLight™ laser surgery), prostatectomy, bladder tumor resection, uterine fibroids management, diagnostics, etc., although those skilled in the art will understand that the devices and techniques for improving images may be used in a wide variety of procedures (i.e., non-urological procedures as well) in which turbidity is an issue.
  • Fig. 1 shows a system 100 for performing an endoscopic procedure according to various exemplary embodiments of the present disclosure.
  • the system 100 includes an endoscope 102 with an imager 104 for acquiring image frames of an anatomical site within a living body during the endoscopic procedure and a fluid delivery mechanism 106 for providing a fluid (e.g., saline) to the anatomy to clear blood and debris that may impair the view of the imager 104.
  • the fluid delivery mechanism 106 also provides suction to simultaneously remove fluid from the anatomy. In this way, the anatomy is continuously refreshed with substantially transparent fluid such that clearer images may be generated.
  • the system 100 may further include a treatment device 108, selected depending on the nature of the endoscopic procedure.
  • the treatment device 108 may be run through the endoscope 102 or may be external to the endoscope 102.
  • the treatment device 108 may be, e.g., a laser or a shockwave generator for breaking up kidney stones or a resectoscope for removing prostate tissue.
  • When the endoscopic procedure is for diagnostic purposes, i.e., for examining the anatomy and not for treating a condition, there may be no treatment device used.
  • Although the exemplary embodiments are described with respect to urological imaging, the exemplary embodiments are not limited thereto. Certain embodiments may be applicable as well to a wide range of procedures such as, for example, endoscopic procedures in the digestive system, etc., including endoscopic procedures that do not include a fluid delivery mechanism.
  • the system 100 includes a computer 110 processing image frames provided by the imager 104 and providing the processed images to a display 112.
  • the computer 110 and the display 112 are provided, in this embodiment, at an integrated station such as an endoscopic console.
  • Other features for performing the urological procedure may be implemented at the endoscopic console, including, e.g., actuators controlling a flow rate, pressure or manner of dispensation for the fluid delivered through the fluid delivery mechanism 106.
  • the exemplary embodiments describe algorithmic processes for altering and enhancing the displayed images, generally on a continuous basis or in any other desired manner.
  • image metrics determined from captured image frames are used to estimate a degree to which a video sequence is being occluded by a turbid imaging environment.
  • the image metrics include image entropy metrics.
  • Image entropy may be generally defined as a measure of information content in an image, which may be approximated by evaluating a frequency of intensity values in an image.
  • High entropy may reflect, e.g., a high amount of anatomic detail associated with a clear image, or it may reflect, e.g., a swirling cloud of particles obscuring the anatomy, in which case it is associated with a non-clear image.
  • A time course of entropy measurements may help differentiate between, e.g., a high-entropy clear image and a high-entropy non-clear image, since entropy is more variable over time in non-clear images.
  • low entropy may reflect a loss of contrast and/or the obscuring of detail. Low entropy may also result from a very plain scene (e.g., a blank white wall).
  • An amount of image entropy may be measured with image entropy metrics including, e.g., a total entropy in an image or a ratio of red entropy to cyan entropy in an image.
  • the ratio of red entropy to cyan entropy may highlight the contribution of blood (represented by red entropy) versus other fluids (represented by cyan entropy).
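The disclosure does not specify an exact entropy computation. The following is a minimal Python sketch (NumPy/OpenCV) of per-frame entropy metrics; the Shannon-entropy-from-histogram form, the 8-bit input assumption, and the approximation of a cyan channel as the mean of the blue and green channels are illustrative choices, not definitions from the patent.

```python
import cv2
import numpy as np

def channel_entropy(channel: np.ndarray) -> float:
    """Shannon entropy (bits) of an 8-bit channel, estimated from its
    intensity histogram (the 'frequency of intensity values' above)."""
    hist, _ = np.histogram(channel, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def entropy_metrics(frame_bgr: np.ndarray) -> dict:
    """Total entropy plus red and cyan entropies for one BGR frame."""
    b, g, r = cv2.split(frame_bgr)
    # Cyan approximated as the mean of blue and green (assumption).
    cyan = ((b.astype(np.uint16) + g.astype(np.uint16)) // 2).astype(np.uint8)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    red_h, cyan_h = channel_entropy(r), channel_entropy(cyan)
    return {
        "total_entropy": channel_entropy(gray),
        "red_entropy": red_h,
        "cyan_entropy": cyan_h,
        "red_cyan_ratio": red_h / max(cyan_h, 1e-6),
    }
```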
  • the image entropy metrics may further encompass entropy fluctuations in specific temporal frequency bands in a sequence of images. As described above, the entropy fluctuations are measured over time to differentiate between clear and non-clear images. Specifically, a rapid change in a high entropy measure of an observed image may reflect chaotic entropy associated with a swirling cloud of particles, thus signifying that the observed image is a non-clear image.
  • An optical flow analysis is utilized to characterize a changing scene between images.
  • the optical flow analysis generally identifies neighborhoods in a given image that may correspond to neighborhoods in a prior image and identify differences in their positions.
  • One such algorithm for estimating such positional displacement is the Farneback algorithm.
  • Such an analysis provides information about stationary and moving objects in the field, as well as systematic field motion, i.e., pan, rotation, advancement and retraction of the camera.
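A sketch of this analysis using OpenCV's dense Farneback implementation; treating the median flow vector as the systematic field motion (pan) and the residual as object motion is an illustrative decomposition, not one mandated by the disclosure.

```python
import cv2
import numpy as np

def flow_metrics(prev_gray: np.ndarray, next_gray: np.ndarray) -> dict:
    """Dense Farneback optical flow between two consecutive grayscale frames."""
    # Positional args after the frames: flow, pyr_scale, levels, winsize,
    # iterations, poly_n, poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Median flow approximates systematic field motion (camera pan);
    # the residual after removing it reflects independently moving objects.
    pan = np.median(flow.reshape(-1, 2), axis=0)
    residual_mag = np.linalg.norm(flow - pan, axis=2)
    return {
        "pan_x": float(pan[0]),
        "pan_y": float(pan[1]),
        "mean_object_motion": float(residual_mag.mean()),
    }
```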
  • Machine learning systems may also be employed to characterize and classify video segments according to a level of occlusion and scene contents.
  • machine learning methods such as neural networks, convolutional neural networks, optimizers, linear regression and/or logistic regression classifiers may be used to discover novel image analyses and/or combinations of the above-described entropy metrics to characterize and classify the video segments.
  • the machine learning methods may generate scalar estimates of turbidity, as well as blood and particle-field probability.
  • other metrics may be used for classifying the video segments such as, for instance, spatial and temporal frequency decompositions.
  • a spatial frequency analysis may effectively measure a sharpness of an image to provide direct insight into image clarity.
  • An image with a turbid field of particles may present relatively high spatial frequencies, similar to those of an image with high anatomic detail. Distinguishing between the two cases may be done with a temporal frequency analysis. As described above, the temporal frequency analysis refers to tracking changes of the metrics over time.
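A sketch of both decompositions; the radial cutoff, the band edges, and the use of FFT energy fractions are illustrative parameter choices rather than values from the disclosure.

```python
import numpy as np

def high_frequency_fraction(gray: np.ndarray, cutoff: float = 0.25) -> float:
    """Spatial frequency analysis: fraction of spectral energy above a
    normalized radial frequency cutoff (a crude sharpness measure)."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray.astype(np.float64)))) ** 2
    h, w = gray.shape
    yy, xx = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    radius = np.sqrt((yy / (h / 2)) ** 2 + (xx / (w / 2)) ** 2)
    return float(spectrum[radius > cutoff].sum() / spectrum.sum())

def temporal_band_energy(metric_series, fps: float,
                         lo_hz: float, hi_hz: float) -> float:
    """Temporal frequency analysis: energy of a per-frame metric
    (e.g., entropy) within a frequency band, tracked over time."""
    series = np.asarray(metric_series, dtype=np.float64)
    series -= series.mean()
    spectrum = np.abs(np.fft.rfft(series)) ** 2
    freqs = np.fft.rfftfreq(series.size, d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return float(spectrum[band].sum())
```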
  • the machine learning systems may optimize the non-linear combiners and various assessments including a particle/stone assessment, a blood assessment, and a clarity assessment, to be described in detail below.
  • Various metrics may be derived through a linear combination of pixel values, intensity histogram values, non-linear functions (e.g., logarithms) of the above, and time-course sequences of these values.
  • the machine learning systems may use algorithms to find alternative combinations of the above values that do not directly correspond to spatial frequency or entropy, but more effectively correspond to an observer’s impression of turbidity.
  • the machine learning systems operate on the feature space that they are presented with (i.e., the values presented to the systems determine the classification).
  • video machine learning systems have video pixels input directly into them.
  • the image metrics (e.g., spatial frequency spectra) are computed from these values in common machine learning models.
  • the feature space may be augmented by feeding the system analyses that are helpful but not well-suited for computation within the machine learning systems. Entropy, for example, is believed to be useful for this calculation but is not readily computable on many common video processing neural network architectures. As a result, the machine learning systems may perform better when provided with, in addition to the raw pixels, other analyses.
  • Mathematical variations of the metrics may be identified by the system as superior. Thus, careful enrichment of the feature space may improve performance dramatically.
  • No particular metric directly derivable from image data does a suitable job, by itself, of estimating turbidity in a clinical setting with a constantly changing image background.
  • the exemplary embodiments describe methods for, e.g., informing and tuning a neural network with entropy-related metrics to assess a turbidity level from image data.
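As one concrete reading of this idea, a sketch that enriches the feature space with entropy- and frequency-based metrics plus log-histogram values, then fits a simple classifier. The feature composition, the use of scikit-learn's LogisticRegression, and the existence of operator-labeled clear/turbid frames are all assumptions; the functions reused here are the sketches given earlier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def frame_features(frame_bgr: np.ndarray, gray: np.ndarray) -> np.ndarray:
    """Enriched feature vector: metrics a network cannot easily derive from
    raw pixels, per the discussion above (illustrative composition)."""
    m = entropy_metrics(frame_bgr)                 # sketch given earlier
    hist, _ = np.histogram(gray, bins=64, range=(0, 256))
    log_hist = np.log1p(hist.astype(np.float64))   # log intensity histogram
    return np.concatenate([
        [m["total_entropy"], m["red_entropy"], m["cyan_entropy"],
         m["red_cyan_ratio"], high_frequency_fraction(gray)],
        log_hist,
    ])

# Hypothetical training loop, assuming frames labeled clear (0) / turbid (1):
#   X = np.stack([frame_features(f, g) for f, g in labeled_frames])
#   clf = LogisticRegression(max_iter=1000).fit(X, labels)
#   turbidity_estimate = clf.predict_proba(X_new)[:, 1]
```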
  • the aforementioned metrics and analyses may be input into an appropriately designed machine learning algorithm that performs a plurality of assessments that may be output to a display to annotate a displayed image frame and/or may inform further analyses.
  • a particle or stone assessment is performed.
  • the particle/stone assessment determines properties for particles suspended in the fluid. Feature detection may be used to determine the nature of the particles. For instance, a kidney stone may be identified as such and metrics such as a size of the stone may be determined, as well as a depth estimation (i.e., a distance from the imager). Motion of the stone may also be estimated, based in part on the aforementioned optical flow analysis. Other particles may include, e.g., proteins, tissue, a protrusion, a thrombus, etc.
  • the particle/stone assessment identifies the particle, segments it in the image frame, and sizes it.
  • the particle identification and classification may be used to annotate the current image frame. For instance, the particle may be bracketed and associated metrics displayed to inform the operating physician and allow for quicker decision making.
  • the particle/stone analysis may also be used to manage the delivery of irrigating fluid via the fluid delivery mechanism 106.
  • For example, whereas typically a turbid field may suggest a need to increase fluid flow, an identification of a mobile stone may suggest a reduction in the fluid flow to maintain the position of the stone in the imager field of view.
  • the particle/stone assessment may impact the use of the treatment device 108, depending on the nature of the treatment, either directly or indirectly. For example, in a lithotripsy procedure, the classification of a stone as, e.g., too large to retrieve, too large to pass naturally, or small enough to pass naturally, may automatically (or under physician guidance) drive or influence the operation of the system during the remainder of the intervention.
  • a clarity assessment is performed. The clarity assessment determines a total measure of turbidity in the image.
  • As discussed previously, the machine learning processes, utilizing, e.g., image entropy metrics as an input, may produce a novel turbidity measure.
  • the turbidity measure may be, e.g., a scalar estimate of turbidity or some other metric developed by the machine learning algorithms.
  • the clarity assessment may be used to annotate a currently displayed image and may also be used to manage the delivery of irrigating fluid.
  • a blood assessment is performed.
  • the blood assessment estimates a total measure of blood content in the cavity in, for instance, parts-per-million (PPM), although other metrics may be used.
  • the blood assessment may be used to annotate a currently displayed image and may also be used to manage the delivery of irrigating fluid.
  • the blood assessment may inform, e.g., a thrombus analysis, laser settings for, e.g., a BPH procedure, etc.
  • A fourth assessment, in addition to the particle assessment, clarity assessment, and blood assessment discussed above, may be used to manage fluid circulation in the imaged cavity in vivo.
  • the fourth assessment relates to the type of intervention being performed during the endoscopic procedure.
  • the intervention may be kidney stone management, prostatectomy, uterine fibroids management, etc.
  • a feature detector may be applied to a current image frame to determine the type of interventional activity underway. For instance, during a laser prostatectomy, the laser device will be within the FOV of the imager and may be identifiable as a laser device by a feature detector.
  • a stone retrieval basket device may be identified by a feature detector, etc. Once identified, the phase of the intervention is assessed. For example, the system could measure the extent to which a stone has been pulverized or otherwise reduced in size and/or could determine when a stone has been received within a basket for retrieval, etc.
  • the interventional assessment may be used in combination with, e.g., at least the particle/stone assessment during a laser lithotripsy procedure, to identify phases of the intervention (e.g., no stones in field, stones in field, lasering in progress) that may influence desired flow characteristics.
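A sketch of phase assessment as a simple mapping from per-frame detections, using the three example phases named above; the boolean detector outputs are assumed to come from the feature detectors just described.

```python
from enum import Enum

class Phase(Enum):
    NO_STONES_IN_FIELD = "no stones in field"
    STONES_IN_FIELD = "stones in field"
    LASERING_IN_PROGRESS = "lasering in progress"

def assess_phase(stones_detected: bool, laser_in_view: bool,
                 laser_firing: bool) -> Phase:
    """Map feature-detector outputs (assumed upstream) to an intervention
    phase that can influence the desired flow characteristics."""
    if laser_in_view and laser_firing:
        return Phase.LASERING_IN_PROGRESS
    if stones_detected:
        return Phase.STONES_IN_FIELD
    return Phase.NO_STONES_IN_FIELD
```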
  • the type of intervention may affect the optimal flow rate of the irrigation fluid.
  • a laser prostatectomy may demand a higher flow rate than, e.g., a kidney stone procedure due to the heating of the tissue.
  • the heating of the tissue may cause a complementary heating of the fluid in the cavity.
  • a more rapid inflow and outflow of fluid may act to maintain a more constant temperature in the cavity and prevent damage to healthy tissue.
  • Fig. 2 shows a method 200 for managing irrigation fluid delivery in a closed-loop feedback system according to various exemplary embodiments of the present invention.
  • the method 200 may employ some or all of the metrics described above. Certain of the described metrics may be derived from a single image frame, while others may be derived from a sequence of image frames.
  • the closed-loop adjustment of the fluid delivery may be performed based on multiple image frames.
  • Considering the rapid frame rate of imagers typically used in the art, the fluid delivery adjustments are applied on a rapid enough basis that the adjustments will appear to be substantially continuous from the perspective of the operating physician.
  • a sequence of image frames is captured by the imager 104.
  • the total number of images necessary to perform a given calculation may vary, so the number of images in the sequence may vary accordingly.
  • some or all of the above-described metrics are determined from one or more image frames in the sequence.
  • image entropy metrics may be derived from each of the images as the images are captured.
  • pixel metrics are determined directly from the images.
  • the various metrics and analyses are processed with a machine-learning combiner algorithm/function to characterize and classify the images or video segments according to a level of occlusion and scene contents.
  • the machine-learning algorithm may combine the metrics in various ways and self-adjust a weighting or use of a metric depending on the analyses discovered. It is noted that all of the aforementioned metrics/analyses, with the exception of the therapy feature detection and interventional phase assessment, are input to the machine-learning combiner algorithm; the therapy feature/phase detection/assessment are determined directly from the image frames and do not inform the particle assessment, blood assessment or clarity assessment.
  • a flow management algorithm processes the particle assessment, blood assessment, clarity assessment and intervention phase assessment to determine whether a fluid delivery adjustment of the irrigation fluid is warranted, and if so, an adjustment value.
  • the adjustment value is fed back to the processor and the flow provided by the fluid delivery mechanism is adjusted.
  • the method steps 205-225 are performed on a substantially continuous basis, providing a closed-loop feedback system for managing fluid flow during an endoscopic intervention.
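A minimal sketch of one iteration of the feedback loop (steps 220-225); the proportional control law, gains, setpoint, and flow limits are illustrative assumptions, not values from the disclosure.

```python
def flow_adjustment(turbidity: float, blood_ppm: float, mobile_stone: bool,
                    current_flow_ml_min: float) -> float:
    """Return the new irrigation flow rate for one control iteration."""
    TARGET_TURBIDITY = 0.2   # desired scalar turbidity level (assumed)
    K_TURBIDITY = 30.0       # ml/min per unit of turbidity error (assumed)
    K_BLOOD = 0.01           # ml/min per PPM of blood content (assumed)
    adjustment = (K_TURBIDITY * (turbidity - TARGET_TURBIDITY)
                  + K_BLOOD * blood_ppm)
    if mobile_stone:
        # A mobile stone argues for reducing, not increasing, flow so the
        # stone stays in the imager field of view.
        adjustment = min(adjustment, 0.0)
    new_flow = current_flow_ml_min + adjustment
    return max(10.0, min(150.0, new_flow))  # assumed hardware limits
```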
  • Certain of the above-described metrics may be used to algorithmically provide image enhancements to compensate for a bloody field of view in an endoscopic intervention. Turbidity caused by blood may be managed by algorithmically subtracting blood features from images and amplifying blood-obscured features. Methods 300 and 400, described below, may be applied alone or in combination.
  • Fig. 3 shows a method 300 for enhancing visibility through a liquid medium clouded by blood according to a first exemplary embodiment.
  • the method 300 includes certain steps of the method 200 for managing irrigation fluid delivery (or steps similar to those of the method 200) and may be used alone or in combination therewith.
  • a sequence of images is captured by the imager 104 according to step 205 of method 200.
  • each of the images is divided into fixed size regions. Each region may be, e.g., 20x20 pixels.
  • the metrics described above are calculated and processed, with the aforementioned machine-learning combiner algorithms, for each of the regions, similar to steps 210-215 of method 200.
  • each of the regions is assessed for blood content and a “blood map” is generated to characterize each of the regions according to a presence of blood.
  • the metrics informing the machine-learning combiner algorithms for the blood assessment are determined and processed to classify each of the regions. It is not necessary to perform the particle/stone assessment or the clarity assessment, or to determine any intervention device features for the method 300. However, in another embodiment, where, e.g., a turbid field is caused by particles other than blood, the clarity assessment may be performed.
  • the blood map values are multiplied by regional frame-to-frame variations of each metric associated with blood detection in the combiner, as adjusted for optical flow. In this way, the individual image sub-regions may be utilized to localize the areas occluded by blood and selectively or proportionally apply image enhancements to mitigate the loss of image detail caused by the blood.
  • the values resulting from step 320 are used to scale a convolution kernel or color transform calculated to be complementary to each metric, which is then applied in its region with a sigmoid reduction at the region boundary.
  • Entropy-related kernels will use either low-pass filter kernels (for positive entropy correlation) or high-pass filter kernels (for negative entropy correlation) applied to the associated color channels.
  • the image frames are recomposed from a spatial frequency representation for each of the regions, consisting of a phase and an altered magnitude, to deemphasize the bloody foreground and enhance the background of interest.
  • the visual effect of the blood may be mitigated across the image to varying degrees depending on the scaling of the convolution kernel.
  • the enhancement may be applied to the magnitude components of the spatial frequencies, which may then be recombined with their phase components and transformed back into a conventional image.
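A simplified end-to-end sketch of the regional approach: score fixed 20x20-pixel regions for blood with a red/cyan ratio, soft-threshold the resulting blood map with a sigmoid, and blend in a contrast-enhanced version of the frame proportionally. The blood score, the sigmoid parameters, and CLAHE as the enhancement are simplified stand-ins for the full metric-driven kernel and color-transform machinery described above.

```python
import cv2
import numpy as np

def enhance_bloody_frame(frame_bgr: np.ndarray, region: int = 20) -> np.ndarray:
    b, g, r = cv2.split(frame_bgr.astype(np.float32))
    blood_score = r / ((b + g) / 2.0 + 1.0)   # high where red dominates cyan
    h, w = blood_score.shape
    blood_map = np.zeros_like(blood_score)
    for y in range(0, h, region):             # regional "blood map"
        for x in range(0, w, region):
            patch = blood_score[y:y + region, x:x + region]
            blood_map[y:y + region, x:x + region] = patch.mean()
    # Sigmoid soft-threshold (center and steepness assumed) for smooth
    # transitions at region boundaries.
    mask = 1.0 / (1.0 + np.exp(-4.0 * (blood_map - 1.5)))
    # Contrast-enhanced version of the frame (CLAHE on the L channel).
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    l_chan, a_chan, b_chan = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = cv2.cvtColor(cv2.merge([clahe.apply(l_chan), a_chan, b_chan]),
                            cv2.COLOR_LAB2BGR)
    # Blend proportionally to the inferred blood content per region.
    mask3 = np.dstack([mask] * 3)
    out = (frame_bgr.astype(np.float32) * (1.0 - mask3)
           + enhanced.astype(np.float32) * mask3)
    return np.clip(out, 0, 255).astype(np.uint8)
```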
  • FIG. 4 shows a method 400 for enhancing visibility through a liquid medium clouded by blood according to a second exemplary embodiment.
  • a current image frame is correlated to a super-resolution image map using a deformable non-linear transform.
  • the super-resolution image map may be generally defined as an improved resolution image generated from a fusion of lower-resolution images.
  • a current image may be fused with the super-resolution image map to enhance the resolution of the current image.
  • a “blood mask” is created based on the aforementioned metrics associated with blood detection, the blood mask reflecting the spatial distribution of inferred blood turbidity in a given image frame.
  • the current image frame is fused with the deformed super-resolution map to generate an enhanced frame that is a weighted average of the current frame and the super-resolution map.
  • the pixels are driven toward those of the original image frame when the blood mask value is low, and the pixels are driven toward those of the super-resolution map when the mask value is high.
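The fusion step reduces to a per-pixel weighted average. A sketch, assuming the super-resolution map has already been registered to the current frame by the deformable transform and that the blood mask is normalized to [0, 1]:

```python
import numpy as np

def fuse_with_super_resolution(frame: np.ndarray, sr_map: np.ndarray,
                               blood_mask: np.ndarray) -> np.ndarray:
    """Weighted-average fusion: pixels follow the original frame where the
    blood mask is low and the super-resolution map where it is high."""
    mask = blood_mask.astype(np.float32)[..., None]  # broadcast over channels
    fused = (frame.astype(np.float32) * (1.0 - mask)
             + sr_map.astype(np.float32) * mask)
    return np.clip(fused, 0, 255).astype(np.uint8)
```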
  • Fig. 5 shows a flowchart 500 describing a combined embodiment comprising elements of methods 200, 300 and 400. It will be understood by those skilled in the art that the various metrics and analyses described previously may be used alone or in combination in a variety of ways.
  • an image is extracted from an endoscopic imaging feed.
  • certain of the metrics/analyses, such as, e.g., the optical flow analysis, require a number of images to determine.
  • the image is separated into its red and cyan components for calculating various metrics. For example, in 515 the red and cyan entropy values are calculated. Metrics such as, e.g., a red-to-cyan entropy ratio may be calculated, or the red and cyan values may be used directly.
  • the spatial frequency analysis is performed for the red and cyan spectra components.
  • the red/cyan entropy values and the spatial frequency analysis inform a temporal frequency analysis. It will be understood by those skilled in the art that the temporal frequency analysis requires a sequence of images for analyzing differences therebetween. For example, entropy fluctuations are measured over time.
  • a color balance is performed where the red and cyan components are compared.
  • the pixel change rate analysis is performed where pixel values between images are compared.
  • the optical flow analysis is performed for correlating successive images and deriving information such as, e.g., object or tissue motion over time.
  • the aforementioned metrics/analyses are combined by a machine learning system in various ways to generate the particle information (particle/stone assessment 550), blood content information (blood assessment 555), and turbidity information (clarity assessment 560).
  • An interventional assessment is also performed where an interventional feature in the image is identified (therapy device feature detection 565) and a phase of the intervention is assessed (therapy phase assessment 570).
  • the particle, blood, clarity and interventional analyses are used to control a fluid flow 575 of the irrigation system.
  • the fluid flow management 575 may include adjusting a flow rate, a pressure, or a manner of dispensing the fluid.
  • the current image may be enhanced by removing or mitigating a blood component of the image according to methods 300 or 400.
  • the current image may be annotated according to method 200.
  • the current image may further be annotated with blood or turbidity metrics derived in the blood/clarity assessments.
  • the enhanced/annotated image is displayed. As further images are captured the aforementioned steps/metrics/analysis are performed or determined.
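Tying the pieces together, a per-frame driver in the spirit of flowchart 500; it wires up the earlier sketches (entropy, optical flow, sharpness, classifier, flow control), and the structure of `state` (trained classifier, flow setting, metric history) is an assumption of this sketch.

```python
import cv2

def process_frame(frame_bgr, prev_gray, state):
    """One pass through a flowchart-500-style pipeline (sketch)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Per-frame metrics (e.g., step 515: red/cyan entropy) and flow analysis.
    metrics = entropy_metrics(frame_bgr)
    metrics.update(flow_metrics(prev_gray, gray))
    metrics["sharpness"] = high_frequency_fraction(gray)
    state["history"].append(metrics)          # feeds the temporal analyses

    # Machine-learning assessments (clarity 560 shown; particle 550 and
    # blood 555 assessments would plug in the same way).
    turbidity = float(state["clf"].predict_proba(
        [frame_features(frame_bgr, gray)])[0, 1])

    # Fluid flow management 575 (closed loop).
    state["flow_ml_min"] = flow_adjustment(
        turbidity, blood_ppm=0.0, mobile_stone=False,  # placeholder inputs
        current_flow_ml_min=state["flow_ml_min"])

    # Enhancement/annotation and display.
    display = frame_bgr.copy()
    cv2.putText(display, f"turbidity: {turbidity:.2f}", (10, 20),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)
    return display, gray
```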
  • a computer-readable medium comprises instructions which, when executed by a computer, cause the computer to perform the various image processing steps/analyses discussed above, e.g., determining the turbidity metric.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Urology & Nephrology (AREA)
  • Anesthesiology (AREA)
  • Hematology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Endoscopes (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

An endoscopic system includes an endoscopic imager configured to capture images of a target site within a living body and a processor configured to determine one or more image metrics for each one of a plurality of image frames captured over a time span, analyze changes in the image metrics over the time span, and determine a turbidity metric for the target site based on the analyzed changes in the image metrics.

Description

System, Device and Method for Turbidity Analysis
Priority Claim
[0001] The present disclosure claims priority to U.S. Provisional Patent Application Serial No. 62/904,882 filed September 24, 2019; the disclosure of which is incorporated herein by reference.
Field
[0002] The present disclosure relates to a system, a device and a method for performing an endoscopic procedure and, in particular, a turbidity analysis of an endoscopic imaging environment.
Background
[0003] An endoscopic imager may be used during a variety of medical interventions. The view of the anatomy provided by the imager is limited when the imaging environment is cloudy, or turbid. Turbidity may be caused by blood, urine or other particles. In some endoscopic procedures (e.g., ureteroscopic procedures), a turbid imaging environment may be managed by a fluid management system that circulates fluid in the imaged cavity.
Summary
[0004] The present disclosure relates to an endoscopic system which includes an endoscopic imager configured to capture image frames of a target site within a living body; and a processor. The processor is configured to: determine one or more image metrics for each one of a plurality of image frames captured over a time span; analyze changes in the image metrics over the time span; and determine a turbidity metric for the target site based on the analyzed changes in the image metrics.
[0005] In an embodiment, the image metrics are image entropy metrics including a red entropy metric and a cyan entropy metric.
[0006] In an embodiment, the processor is further configured to: estimate a blood content in a current image frame; and alter the current image frame to mitigate a visual effect of the blood content and enhance a remainder of the current image frame.
[0007] In an embodiment, the processor is further configured to: identify and classify a particle in a current image frame.
[0008] In an embodiment, the endoscopic system further includes a display configured to annotate the current image frame with the turbidity metric.
[0009] In an embodiment, the display is further configured to: bracket the identified particle; and annotate the current image frame with the particle classification.
[0010] In an embodiment, the identified particle is a kidney stone and the classification relates to a size of the kidney stone.
[0011] The present disclosure also relates to an endoscopic system which includes an endoscopic imager configured to capture image frames of a target site within a living body; a fluid delivery mechanism providing irrigation fluid to the target site for clarifying a field of view of the endoscopic imager; and a processor configured to: determine a turbidity metric for at least one image from the imager; determine a fluid delivery adjustment for the irrigation fluid based on the turbidity metric; and control the fluid delivery mechanism to adjust a fluid delivery provided by the fluid delivery mechanism based on the determined fluid delivery adjustment.
[0012] In an embodiment, the processor is further configured to: determine a type of interventional activity based on a feature detection identifying an intervention instrument.
[0013] In an embodiment, the processor is further configured to: determine a phase of the interventional activity; and adjust the fluid delivery provided by the fluid delivery mechanism based on the phase of the interventional activity.
[0014] In an embodiment, the processor is further configured to: identify and classify a particle in a current image frame; and adjust the fluid delivery provided by the fluid delivery mechanism based on the particle classification.
[0015] In an embodiment, the particle is a kidney stone and the particle classification relates to a size of the kidney stone.
[0016] In an embodiment, the processor is further configured to: determine a blood metric for the at least one image; and adjust the fluid delivery provided by the fluid delivery mechanism based on the blood metric.
[0017] In an embodiment, the processor is further configured to: determine image entropy metrics for the at least one image.
[0018] In an embodiment, the turbidity metric and the blood metric are determined based in part on the image entropy metrics.
[0019] In addition, the present disclosure relates to a method which includes determining one or more image metrics for each one of a plurality of image frames of a target site within a living body captured over a time span; analyzing changes in the image metrics over the time span; and determining a turbidity metric for the target site based on the analyzed changes in the image metrics.
[0020] In an embodiment, the image metrics are image entropy metrics including a red entropy metric and a cyan entropy metric.
[0021] In an embodiment, the method further includes estimating a blood content in a current image frame; and altering the current image frame to mitigate a visual effect of the blood content and enhance a remainder of the current image frame.
[0022] In an embodiment, the method further includes identifying and classifying a particle in a current image frame.
[0023] In an embodiment, the method further includes annotating the current image frame with the turbidity metric.
Brief Description
[0024] Fig. 1 shows a system for performing an endoscopic procedure according to various exemplary embodiments of the present disclosure.
[0025] Fig. 2 shows a method for managing irrigation fluid flow in a closed-loop feedback system according to various exemplary embodiments of the present invention.
[0026] Fig. 3 shows a method for enhancing visibility through a liquid medium clouded by blood according to a first exemplary embodiment.
[0027] Fig. 4 shows a method for enhancing visibility through a liquid medium clouded by blood according to a second exemplary embodiment.
[0028] Fig. 5 shows a flowchart describing a combined embodiment of the methods of Figs. 2-4.
Detailed Description
[0029] The present disclosure may be further understood with reference to the following description and the appended drawings, wherein like elements are referred to with the same reference numerals. The exemplary embodiments describe algorithmic improvements for managing turbidity in an endoscopic imaging environment. Typical urological procedures utilize an imaging device (e.g., a ureteroscope or other endoscopic imager), a mechanism to provide fluid (for clearing the field of view and/or distending the body cavity) and a treatment mechanism (e.g., the Boston Scientific LithoVue™ device, and/or a source of laser or RF energy, etc.).
[0030] The improvements described herein include, e.g., methods for analyzing endoscopic images, determining turbidity, and providing the information to a fluid management system to manage clarity in the field of view. Other improvements include algorithmically subtracting blood features from an endoscopic image and amplifying blood-obscured image features, particularly for urological procedures. Some common urological procedures include kidney stone management (e.g., lithotripsy), BPH (i.e., benign prostatic hyperplasia) procedures (e.g., GreenLight™ laser surgery), prostatectomy, bladder tumor resection, uterine fibroids management, diagnostics, etc., although those skilled in the art will understand that the devices and techniques for improving images may be used in a wide variety of procedures (i.e., non-urological procedures as well) in which turbidity is an issue.
[0031] Fig. 1 shows a system 100 for performing an endoscopic procedure according to various exemplary embodiments of the present disclosure. The system 100 includes an endoscope 102 with an imager 104 for acquiring image frames of an anatomical site within a living body during the endoscopic procedure and a fluid delivery mechanism 106 for providing a fluid (e.g., saline) to the anatomy to clear blood and debris that may impair the view of the imager 104. The fluid delivery mechanism 106 also provides suction to simultaneously remove fluid from the anatomy. In this way, the anatomy is continuously refreshed with substantially transparent fluid such that clearer images may be generated.
[0032] The system 100 may further include a treatment device 108, selected depending on the nature of the endoscopic procedure. The treatment device 108 may be run through the endoscope 102 or may be external to the endoscope 102. For example, the treatment device 108 may be, e.g., a laser or a shockwave generator for breaking up kidney stones or a resectoscope for removing prostate tissue. When the endoscopic procedure is for diagnostic purposes, i.e., for examining the anatomy and not for treating a condition, there may be no treatment device used. Although the exemplary embodiments are described with respect to urological imaging, the exemplary embodiments are not limited thereto. Certain embodiments may be applicable as well to a wide range of procedures such as, for example, endoscopic procedures in the digestive system, etc., including endoscopic procedures that do not include a fluid delivery mechanism.
[0033] The system 100 includes a computer 110 processing image frames provided by the imager 104 and providing the processed images to a display 112. The computer 110 and the display 112 are provided, in this embodiment, at an integrated station such as an endoscopic console. Other features for performing the urological procedure may be implemented at the endoscopic console, including, e.g., actuators controlling a flow rate, pressure or manner of dispensation for the fluid delivered through the fluid delivery mechanism 106. The exemplary embodiments describe algorithmic processes for altering and enhancing the displayed images, generally on a continuous basis or in any other desired manner.
[0034] In one embodiment, image metrics determined from captured image frames are used to estimate a degree to which a video sequence is being occluded by a turbid imaging environment. The image metrics include image entropy metrics. Image entropy may be generally defined as a measure of information content in an image, which may be approximated by evaluating a frequency of intensity values in an image. High entropy may reflect, e.g., a high amount of anatomic detail associated with a clear image, or it may reflect, e.g., a swirling cloud of particles obscuring the anatomy, in which case it is associated with a non-clear image. A time course of entropy measurements may help differentiate between, e.g., a high-entropy clear image and a high-entropy non-clear image. For example, the time course of entropy measurements for a series of images may differentiate between these two types of images through the variability of entropy in non-clear images.
[0035] Alternatively, low entropy may reflect a loss of contrast and/or the obscuring of detail. Low entropy may also result from a very plain scene (e.g., a blank white wall). An amount of image entropy may be measured with image entropy metrics including, e.g., a total entropy in an image or a ratio of red entropy to cyan entropy in an image. In an embodiment, the ratio of red entropy to cyan entropy may highlight the contribution of blood (represented by red entropy) versus other fluids (represented by cyan entropy). In some embodiments, the image entropy metrics may further encompass entropy fluctuations in specific temporal frequency bands in a sequence of images. As described above, the entropy fluctuations are measured over time to differentiate between clear and non-clear images. Specifically, a rapid change in a high entropy measure of an observed image may reflect chaotic entropy associated with a swirling cloud of particles, thus signifying that the observed image is a non-clear image.
[0036] An optical flow analysis is utilized to characterize a changing scene between images.
The optical flow analysis generally identifies neighborhoods in a given image that may correspond to neighborhoods in a prior image and identify differences in their positions. One such algorithm for estimating such positional displacement is the Farneback algorithm. Such an analysis provides information about stationary and moving objects in the field, as well as systematic field motion, i.e., pan, rotation, advancement and retraction of the camera.
[0037] Machine learning systems may also be employed to characterize and classify video segments according to a level of occlusion and scene contents. For example, machine learning methods such as neural networks, convolutional neural networks, optimizers, linear regression and/or logistic regression classifiers may be used to discover novel image analyses and/or combinations of the above-described entropy metrics to characterize and classify the video segments. The machine learning methods may generate scalar estimates of turbidity, as well as blood and particle-field probability.
[0038] In another embodiment, other metrics may be used for classifying the video segments such as, for instance, spatial and temporal frequency decompositions. A spatial frequency analysis may effectively measure a sharpness of an image to provide direct insight into image clarity. An image with a turbid field of particles may present relatively high spatial frequencies, similar to those of an image with high anatomic detail. Distinguishing between the two cases may be done with a temporal frequency analysis. As described above, the temporal frequency analysis refers to tracking changes of the metrics over time.
[0039] The machine learning systems may optimize the non-linear combiners and various assessments including a particle/stone assessment, a blood assessment, and a clarity assessment, to be described in detail below. Various metrics may be derived through a linear combination of pixel values, intensity histogram values, non-linear functions (e.g., logarithms) of the above, and time-course sequences of these values. The machine learning systems may use algorithms to find alternative combinations of the above values that do not directly correspond to spatial frequency or entropy, but more effectively correspond to an observer’s impression of turbidity.
[0040] The machine learning systems operate on the feature space that they are presented with (i.e., the values presented to the systems determine the classification). In an embodiment, video machine learning systems have video pixels input directly into them. The image metrics (e.g., spatial frequency spectra) are computed from these values in common machine learning models. The feature space may be augmented by feeding the system analyses that are helpful but not well-suited for computation within the machine learning systems. Entropy, for example, is believed to be useful for this calculation but is not readily computable on many common video processing neural network architectures. As a result, the machine learning systems may perform better when provided with, in addition to the raw pixels, other analyses. Feeding a neural network with the intensity histogram data, including logarithms thereof, allows the network to incorporate entropy and entropy-like measures, thereby converging more quickly on a better result. Mathematical variations of the metrics may be identified by the system as superior. Thus, careful enrichment of the feature space may improve performance dramatically.
[0041] No particular metric directly derivable from image data does a suitable job, by itself, of estimating turbidity in a clinical setting with a constantly changing image background. The exemplary embodiments describe methods for, e.g., informing and tuning a neural network with entropy-related metrics to assess a turbidity level from image data.
[0042] The aforementioned metrics and analyses may be input into an appropriately designed machine learning algorithm that performs a plurality of assessments that may be output to a display to annotate a displayed image frame and/or may inform further analyses.
[0043] In a first example, a particle or stone assessment is performed. The particle/stone assessment determines properties of particles suspended in the fluid. Feature detection may be used to determine the nature of the particles. For instance, a kidney stone may be identified as such, and metrics such as the size of the stone may be determined, as well as a depth estimation (i.e., a distance from the imager). Motion of the stone may also be estimated, based in part on the aforementioned optical flow analysis. Other particles may include, e.g., proteins, tissue, a protrusion, a thrombus, etc. The particle/stone assessment identifies the particle, segments it in the image frame, and sizes it.
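A minimal sketch of one plausible segmentation and sizing step follows, using Otsu thresholding and contour extraction on an 8-bit grayscale frame; this simple pipeline is an assumption for illustration and does not reproduce the feature detection described above:

    import cv2

    def segment_and_size_particles(gray_frame, min_area_px=25):
        # Threshold, extract external contours, and report bounding boxes
        # with pixel areas. Converting pixel area to a physical size would
        # additionally require the depth estimate (distance from imager).
        _, binary = cv2.threshold(gray_frame, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        particles = []
        for c in contours:
            area = cv2.contourArea(c)
            if area >= min_area_px:
                particles.append({"bbox": cv2.boundingRect(c),
                                  "area_px": area})
        return particles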
[0044] The particle identification and classification may be used to annotate the current image frame. For instance, the particle may be bracketed and associated metrics displayed to inform the operating physician and allow for quicker decision making. The particle/stone analysis may also be used to manage the delivery of irrigating fluid via the fluid delivery mechanism 106.
For example, whereas a turbid field may typically suggest a need to increase fluid flow, an identification of a mobile stone may suggest a reduction in the fluid flow to maintain the position of the stone in the imager field of view, as in the sketch below. Additionally, the particle/stone assessment may impact the use of the treatment device 108, depending on the nature of the treatment, either directly or indirectly. For example, in a lithotripsy procedure, the classification of a stone as, e.g., too large to retrieve, too large to pass naturally, or small enough to pass naturally, may automatically (or under physician guidance) drive or influence the operation of the system during the remainder of the intervention.

[0045] In a second example, a clarity assessment is performed. The clarity assessment determines a total measure of turbidity in the image. As discussed previously, the machine learning processes utilizing, e.g., image entropy metrics as an input may produce a novel turbidity measure. The turbidity measure may be, e.g., a scalar estimate of turbidity or some other metric developed by the machine learning algorithms. The clarity assessment may be used to annotate a currently displayed image and may also be used to manage the delivery of irrigating fluid.
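The trade-off between clearing a turbid field and keeping a mobile stone in view might be expressed as a simple rule, as in the following sketch; the thresholds are invented placeholders, not values from this disclosure:

    def suggest_flow_change(turbidity, stone_detected, stone_speed_px_s,
                            turbidity_high=0.7, speed_high=40.0):
        # Lower flow when a mobile stone risks drifting out of the field
        # of view; otherwise raise flow to clear a turbid field.
        if stone_detected and stone_speed_px_s > speed_high:
            return "decrease"
        if turbidity > turbidity_high:
            return "increase"
        return "hold"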
[0046] In a third example, a blood assessment is performed. The blood assessment estimates a total measure of blood content in the cavity in, for instance, parts-per-million (PPM), although other metrics may be used. The blood assessment may be used to annotate a currently displayed image and may also be used to manage the delivery of irrigating fluid. In addition, the blood assessment may inform, e.g., a thrombus analysis, laser settings for, e.g., a BPH procedure, etc.
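As a crude, hand-built stand-in for the learned blood assessment, the following sketch estimates blood content from the mean red excess over the cyan (green/blue) channels; the normalization and the cyan definition are assumptions:

    import numpy as np

    def red_dominance(frame_bgr):
        # Mean red excess over the cyan channels, clipped to [0, 1].
        # A learned combiner, as described above, would replace this ratio.
        f = frame_bgr.astype(np.float32)
        red = f[..., 2]
        cyan = (f[..., 0] + f[..., 1]) / 2.0
        return float(np.clip((red - cyan).mean() / 255.0, 0.0, 1.0))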
[0047] A fourth assessment, in addition to the particle assessment, clarity assessment, and blood assessment discussed above, may be used to manage fluid circulation in the imaged cavity in vivo. The fourth assessment relates to the type of intervention being performed during the endoscopic procedure. For example, the intervention may be kidney stone management, prostatectomy, uterine fibroids management, etc. A feature detector may be applied to a current image frame to determine the type of interventional activity underway. For instance, during a laser prostatectomy, the laser device will be within the FOV of the imager and may be identifiable as a laser device by a feature detector.
[0048] In another example, a stone retrieval basket device may be identified by a feature detector, etc. Once identified, the phase of the intervention is assessed. For example, the system could measure the extent to which a stone has been pulverized or otherwise reduced in size and/or could determine when a stone has been received within a basket for retrieval, etc. Thus, the interventional assessment may be used in combination with, e.g., at least the particle/stone assessment during a laser lithotripsy procedure, to identify phases of the intervention (e.g., no stones in field, stones in field, lasering in progress) that may influence desired flow characteristics. The type of intervention may affect the optimal flow rate of the fluid of the irrigation system. For instance, a laser prostatectomy may demand a higher flow rate than, e.g., a kidney stone procedure due to the heating of the tissue. The heating of the tissue may cause a complementary heating of the fluid in the cavity. A more rapid inflow and outflow of fluid may act to maintain a more constant temperature in the cavity and prevent damage to healthy tissue.

[0049] Fig. 2 shows a method 200 for managing irrigation fluid delivery in a closed-loop feedback system according to various exemplary embodiments of the present invention. The method 200 may employ some or all of the metrics described above. Certain of the described metrics may be derived from a single image frame, while others may be derived from a sequence of image frames. Thus, the closed-loop adjustment of the fluid delivery may be performed based on multiple image frames. However, considering the rapid frame rate of imagers typically used in the art, the fluid delivery adjustments are applied on a rapid enough basis that the adjustments will appear to be substantially continuous from the perspective of the operating physician.
[0050] In 205, a sequence of image frames is captured by the imager 104. As noted above, the total number of images necessary to perform a given calculation may vary, so the number of images in the sequence may vary accordingly.
[0051] In 210, some or all of the above-described metrics are determined from one or more image frames in the sequence. For example, image entropy metrics may be derived from each of the images as they are captured. In another example, pixel metrics are determined directly from the images.
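A minimal sketch of one such image entropy metric, assuming an 8-bit channel; Shannon entropy of the intensity histogram is a standard formulation consistent with the entropy metrics described above:

    import numpy as np

    def channel_entropy(channel, bins=256):
        # Shannon entropy (in bits) of a channel's intensity histogram;
        # applied to red and cyan channels it yields per-channel entropy.
        hist, _ = np.histogram(channel, bins=bins, range=(0, 255))
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())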
[0052] In 215, the various metrics and analyses are processed with a machine-learning combiner algorithm/function to characterize and classify the images or video segments according to a level of occlusion and scene contents. As described above, the machine-learning algorithm may combine the metrics in various ways and self-adjust a weighting or use of a metric depending on the analyses discovered. It is noted that all of the aforementioned metrics/analyses, with the exception of the therapy feature detection and the interventional phase assessment, are input to the machine-learning combiner algorithm; the therapy feature detection and phase assessment are determined directly from the image frames and do not inform the particle assessment, blood assessment or clarity assessment.
[0053] In 220, a flow management algorithm processes the particle assessment, blood assessment, clarity assessment and intervention phase assessment to determine whether a fluid delivery adjustment of the irrigation fluid is warranted, and if so, an adjustment value.
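One possible shape for such a flow management step is sketched below; the weights, the phase-dependent baseline, and the dictionary keys are illustrative assumptions, not values from this disclosure:

    def flow_adjustment(particle, blood, clarity, phase_baseline_ml_min,
                        current_flow_ml_min):
        # Combine the assessments into a signed flow adjustment (mL/min).
        target = phase_baseline_ml_min            # e.g., higher for laser prostatectomy
        target += 30.0 * clarity["turbidity"]     # cloudier field -> more flow
        target += 20.0 * blood["content"]         # more blood -> more flow
        if particle.get("mobile_stone"):
            target -= 15.0                        # keep the stone in the FOV
        return target - current_flow_ml_min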
[0054] In 225, the adjustment value is fed back to the processor and the flow provided by the fluid delivery mechanism is adjusted.
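Applying the adjustment with damping and hard limits keeps successive per-frame corrections smooth, as in this sketch; the gain and flow limits are illustrative assumptions:

    def update_flow(current_flow, adjustment, gain=0.5,
                    min_flow=0.0, max_flow=150.0):
        # Damped (proportional) application of the fed-back adjustment so
        # that frame-rate corrections appear substantially continuous.
        return max(min_flow, min(max_flow, current_flow + gain * adjustment))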
[0055] The method steps 205-225 are performed on a substantially continuous basis, providing a closed-loop feedback system for managing fluid flow during an endoscopic intervention.

[0056] Certain of the above-described metrics, particularly relating to the blood assessment and the clarity assessment, may be used to algorithmically provide image enhancements to compensate for a bloody field of view in an endoscopic intervention. Turbidity caused by blood may be managed by algorithmically subtracting blood features from images and amplifying blood-obscured features. Methods 300 and 400, described below, may be applied alone or in combination.
[0057] Fig. 3 shows a method 300 for enhancing visibility through a liquid medium clouded by blood according to a first exemplary embodiment. The method 300 includes certain steps of the method 200 for managing irrigation fluid delivery (or steps similar to those of the method 200) and may be used alone or in combination therewith.
[0058] In 305, a sequence of images is captured by the imager 104 according to step 205 of method 200. In 310, each of the images is divided into fixed-size regions. Each region may be, e.g., 20x20 pixels.
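A minimal sketch of the region division in 310, assuming NumPy image arrays; allowing edge tiles to be smaller is an implementation assumption:

    def iter_regions(frame, size=20):
        # Yield (row, col, block) tiles of a fixed pixel size.
        h, w = frame.shape[:2]
        for r in range(0, h, size):
            for c in range(0, w, size):
                yield r, c, frame[r:r + size, c:c + size]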
[0059] In 315, certain of the metrics described above are calculated and processed, with the aforementioned machine-learning combiner algorithms, for each of the regions, similar to steps 210-215 of method 200. In this way, each of the regions is assessed for blood content and a “blood map” is generated to characterize each of the regions according to a presence of blood. Specifically, the metrics informing the machine-learning combiner algorithms for the blood assessment are determined and processed to classify each of the regions. It is not necessary to perform the particle/stone assessment or the clarity assessment, or to determine any intervention device features, for the method 300. However, in another embodiment, where, e.g., a turbid field is caused by particles other than blood, the clarity assessment may be performed.
[0060] In 320, the blood map values are multiplied by regional frame-to-frame variations of each metric associated with blood detection in the combiner, as adjusted for optical flow. In this way, the individual image sub-regions may be utilized to localize the areas occluded by blood and selectively or proportionally apply image enhancements to mitigate the loss of image detail caused by the blood.

[0061] In 325, the values resulting from 320 are used to scale a convolution kernel or color transform calculated to be complementary to each metric, which is then applied in its region with a sigmoid reduction at the region boundary. Entropy-related kernels will use either low-pass filter kernels (for positive entropy correlation) or high-pass filter kernels (for negative entropy correlation) applied to the associated color channels.
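One plausible reading of the per-region kernel application with a sigmoid boundary reduction is sketched below for a single grayscale region; the ramp shape and the blending scheme are assumptions, not the exact construction in 325:

    import numpy as np
    from scipy.ndimage import convolve

    def apply_regional_kernel(region, kernel, strength):
        # Blend a filtered region with the original, weighted by a
        # per-region strength (e.g., the blood-map product from 320),
        # with a sigmoid ramp that fades the effect at region edges.
        filtered = convolve(region.astype(np.float64), kernel, mode="nearest")
        h, w = region.shape
        yy, xx = np.mgrid[0:h, 0:w]
        edge = np.minimum.reduce([yy, h - 1 - yy, xx, w - 1 - xx])
        ramp = 1.0 / (1.0 + np.exp(-(edge - 2.0)))
        weight = np.clip(strength, 0.0, 1.0) * ramp
        return weight * filtered + (1.0 - weight) * region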
[0062] In 330, the image frames are recomposed from a spatial frequency representation for each of the regions, consisting of a phase and an altered magnitude, to deemphasize the bloody foreground and enhance the interesting background. The visual effect of the blood may be mitigated across the image to varying degrees depending on the scaling of the convolution kernel. The enhancement may be applied to the magnitude components of the spatial frequencies, which may then be recombined with their phase components and transformed back into a conventional image.
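The magnitude/phase recomposition may be sketched as follows for one region; the scalar-or-array gain is an assumption about how the altered magnitude is produced:

    import numpy as np

    def rescale_magnitude(region, magnitude_gain):
        # Recompose a region from an altered spectral magnitude and the
        # original phase, then return the real-valued image.
        spectrum = np.fft.fft2(region.astype(np.float64))
        magnitude = np.abs(spectrum) * magnitude_gain
        phase = np.angle(spectrum)
        return np.real(np.fft.ifft2(magnitude * np.exp(1j * phase)))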
[0063] Fig. 4 shows a method 400 for enhancing visibility through a liquid medium clouded by blood according to a second exemplary embodiment.
[0064] In 405, a current image frame is correlated to a super-resolution image map using a deformable non-linear transform. The super-resolution image map may be generally defined as an improved resolution image generated from a fusion of lower-resolution images. A current image may be fused with the super-resolution image map to enhance the resolution of the current image.
[0065] In 410, a “blood mask” is created based on the aforementioned metrics associated with blood detection, the blood mask reflecting the spatial distribution of inferred blood turbidity in a given image frame.
[0066] In 415, the current image frame is fused with the deformed super-resolution map to generate an enhanced frame that is a weighted average of the current frame and the superresolution map. The pixels are driven toward those of the original image frame when the blood mask value is low, and the pixels are driven toward those of the super-resolution map when the mask value is high.
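A minimal sketch of this weighted fusion, assuming the current frame, the deformed super-resolution map, and the blood mask are float arrays of identical shape with the mask normalized to [0, 1]:

    import numpy as np

    def fuse_with_super_resolution(current, super_res, blood_mask):
        # Pixels follow the current frame where the mask is low and the
        # super-resolution map where the mask is high.
        m = np.clip(blood_mask, 0.0, 1.0)
        return (1.0 - m) * current + m * super_res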
[0067] Fig. 5 shows a flowchart 500 describing a combined embodiment comprising elements of methods 200, 300 and 400. It will be understood by those skilled in the art that the various metrics and analyses described previously may be used alone or in combination in a variety of ways.
[0068] In 505, an image is extracted from an endoscopic imaging feed. As discussed previously, certain of the metrics/analyses, such as, e.g., the optical flow analysis, require a number of images to compute. However, it may be assumed for the purposes of the flowchart 500 that sufficient previous images have been captured to perform each of the calculations.
[0069] In 510, the image is separated into its red and cyan components for calculating various metrics. For example, in 515 the red and cyan entropy values are calculated. Metrics such as, e.g., a red-to-cyan entropy ratio may be calculated, or the red and cyan entropy values may be used directly. In 520, the spatial frequency analysis is performed for the red and cyan spectral components. In 525, the red/cyan entropy values and the spatial frequency analysis inform a temporal frequency analysis. It will be understood by those skilled in the art that the temporal frequency analysis requires a sequence of images for analyzing differences therebetween; for example, entropy fluctuations are measured over time. In 530, a color balance is performed where the red and cyan components are compared. In 535, the pixel change rate analysis is performed where pixel values between images are compared. In 540, the optical flow analysis is performed for correlating successive images and deriving information such as, e.g., object or tissue motion over time.
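A sketch of the red/cyan separation and the entropy-ratio metric follows, reusing channel_entropy() from the earlier entropy sketch; defining cyan as the green/blue average of a BGR frame is an assumption for illustration:

    import numpy as np

    def red_cyan_entropy_ratio(frame_bgr, bins=256):
        # Red channel versus the green/blue average, compared by the
        # Shannon entropy of their intensity histograms.
        red = frame_bgr[..., 2]
        cyan = ((frame_bgr[..., 0].astype(np.float32)
                 + frame_bgr[..., 1].astype(np.float32)) / 2.0).astype(np.uint8)
        return channel_entropy(red, bins) / max(channel_entropy(cyan, bins), 1e-6)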
[0070] In 545, the aforementioned metrics/analyses are combined by a machine learning system in various ways to generate the particle information (particle/stone assessment 550), blood content information (blood assessment 555), and turbidity information (clarity assessment 560). An interventional assessment is also performed where an interventional feature in the image is identified (therapy device feature detection 565) and a phase of the intervention is assessed (therapy phase assessment 570). The particle, blood, clarity and interventional analyses are used to control a fluid flow 575 of the irrigation system. As discussed previously, the fluid flow management 575 may include adjusting a flow rate, a pressure, or a manner of dispensing the fluid.
[0071] In 580, the current image may be enhanced by removing or mitigating a blood component of the image according to methods 300 or 400. In 585, the current image may be annotated according to method 200. The current image may further be annotated with blood or turbidity metrics derived in the blood/clarity assessments. In 590, the enhanced/annotated image is displayed. As further images are captured, the aforementioned steps, metrics and analyses are repeated.
[0072] In another embodiment, a computer-readable medium comprises instructions which, when executed by a computer, cause the computer to perform the various image processing steps/analyses discussed above, e.g., determining the turbidity metric.
[0073] It will be appreciated by those skilled in the art that changes may be made to the embodiments described above without departing from the inventive concept thereof. It should further be appreciated that structural features and methods associated with one of the embodiments can be incorporated into other embodiments. It is understood, therefore, that this invention is not limited to the particular embodiment disclosed, but rather modifications are also covered within the scope of the present invention as defined by the appended claims.

Claims
1. An endoscopic system, comprising: an endoscopic imager configured to capture image frames of a target site within a living body; and a processor configured to: determine one or more image metrics for each one of a plurality of image frames captured over a time span; analyze changes in the image metrics over the time span; and determine a turbidity metric for the target site based on the analyzed changes in the image metrics.
2. The endoscopic system of claim 1, wherein the image metrics are image entropy metrics including a red entropy metric and a cyan entropy metric.
3. The endoscopic system of claim 2, wherein the processor is further configured to: estimate a blood content in a current image frame; and alter the current image frame to mitigate a visual effect of the blood content and enhance a remainder of the current image frame.
4. The endoscopic system of any one of claims 1-3, wherein the processor is further configured to: identify and classify a particle in a current image frame.
5. The endoscopic system of claim 4, further comprising: a display configured to annotate the current image frame with the turbidity metric.
6. The endoscopic system of claim 5, wherein the display is further configured to: bracket the identified particle; and annotate the current image frame with the particle classification.
7. The endoscopic system of claim 6, wherein the identified particle is a kidney stone and the classification relates to a size of the kidney stone.
8. An endoscopic system, comprising: an endoscopic imager configured to capture image frames of a target site within a living body; a fluid delivery mechanism providing irrigation fluid to the target site for clarifying a field of view of the endoscopic imager; and a processor configured to: determine a turbidity metric for at least one image from the imager; determine a fluid delivery adjustment for the irrigation fluid based on the turbidity metric; and control the fluid delivery mechanism to adjust a fluid delivery provided by the fluid delivery mechanism based on the determined fluid delivery adjustment.
9. The endoscopic system of claim 8, wherein the processor is further configured to: determine a type of interventional activity based on a feature detection identifying an intervention instrument.
10. The endoscopic system of claim 9, wherein the processor is further configured to: determine a phase of the interventional activity; and adjust the fluid delivery provided by the fluid delivery mechanism based on the phase of the interventional activity.
11. The endoscopic system of any one of claims 8-10, wherein the processor is further configured to: identify and classify a particle in a current image frame; and adjust the fluid delivery provided by the fluid delivery mechanism based on the particle classification.
12. The endoscopic system of claim 11, wherein the particle is a kidney stone and the particle classification relates to a size of the kidney stone.
13. The endoscopic system of any one of claims 8-11, wherein the processor is further configured to: determine a blood metric for the at least one image; and adjust the fluid delivery provided by the fluid delivery mechanism based on the blood metric.
14. The endoscopic system of claim 13, wherein the processor is further configured to: determine image entropy metrics for the at least one image.
15. The endoscopic system of claim 14, wherein the turbidity metric and the blood metric are determined based in part on the image entropy metrics.
PCT/US2020/048229 2019-09-24 2020-08-27 System, device and method for turbidity analysis WO2021061336A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
KR1020227009551A KR20220054340A (en) 2019-09-24 2020-08-27 Systems, devices, and methods for turbidity analysis
CN202080066581.2A CN114531846A (en) 2019-09-24 2020-08-27 System, apparatus and method for turbidity analysis
AU2020354896A AU2020354896B2 (en) 2019-09-24 2020-08-27 System, device and method for turbidity analysis
EP20768788.0A EP4033958A1 (en) 2019-09-24 2020-08-27 System, device and method for turbidity analysis
CA3147729A CA3147729A1 (en) 2019-09-24 2020-08-27 System, device and method for turbidity analysis
JP2022515015A JP7309050B2 (en) 2019-09-24 2020-08-27 System and equipment for turbidity analysis
JP2023109724A JP2023126283A (en) 2019-09-24 2023-07-04 System, device and method for turbidity analysis
AU2023241346A AU2023241346A1 (en) 2019-09-24 2023-10-05 System, device and method for turbidity analysis

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962904882P 2019-09-24 2019-09-24
US62/904,882 2019-09-24

Publications (1)

Publication Number Publication Date
WO2021061336A1 true WO2021061336A1 (en) 2021-04-01

Family

ID=72433043

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/048229 WO2021061336A1 (en) 2019-09-24 2020-08-27 System, device and method for turbidity analysis

Country Status (8)

Country Link
US (1) US20210085165A1 (en)
EP (1) EP4033958A1 (en)
JP (2) JP7309050B2 (en)
KR (1) KR20220054340A (en)
CN (1) CN114531846A (en)
AU (2) AU2020354896B2 (en)
CA (1) CA3147729A1 (en)
WO (1) WO2021061336A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11737434B2 (en) 2021-07-19 2023-08-29 X Development Llc Turbidity determination using computer vision
US20230039326A1 (en) * 2021-08-09 2023-02-09 Easyendo Surgical, Inc. Stone removing apparatus and stone size measuring method
US11881017B2 (en) * 2022-03-24 2024-01-23 X Development Llc Turbidity determination using machine learning
CN115063596A (en) * 2022-06-17 2022-09-16 上海蓝长自动化科技有限公司 Water quality turbidity detection method based on deep regression network
CN116211260B (en) * 2023-05-09 2023-07-21 西南医科大学附属医院 Kidney stone form three-dimensional imaging system and method based on zooming scanning

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100280314A1 (en) * 2007-11-20 2010-11-04 Pieter Brommersma Urological resectoscope comprising holes
US20120057754A1 (en) * 1997-12-23 2012-03-08 Dunton Randy R Image selection based on image content
US20120316421A1 (en) * 2009-07-07 2012-12-13 The Johns Hopkins University System and method for automated disease assessment in capsule endoscopy
US20130243286A1 (en) * 2006-03-13 2013-09-19 Given Imaging Ltd. Cascade analysis for intestinal contraction detection
US20180082104A1 (en) * 2015-03-02 2018-03-22 Siemens Aktiengesellschaft Classification of cellular images and videos
US20180271615A1 (en) * 2017-03-21 2018-09-27 Amit Mahadik Methods and systems to automate surgical interventions
US20190192237A1 (en) * 2016-01-29 2019-06-27 Boston Scientific Scimed, Inc. Medical user interfaces and related methods of use

Also Published As

Publication number Publication date
AU2020354896A1 (en) 2022-03-10
US20210085165A1 (en) 2021-03-25
JP2023126283A (en) 2023-09-07
EP4033958A1 (en) 2022-08-03
AU2020354896B2 (en) 2023-07-06
CA3147729A1 (en) 2021-04-01
JP7309050B2 (en) 2023-07-14
CN114531846A (en) 2022-05-24
AU2023241346A1 (en) 2023-10-26
JP2022547132A (en) 2022-11-10
KR20220054340A (en) 2022-05-02

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20768788; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 3147729; Country of ref document: CA)
ENP Entry into the national phase (Ref document number: 2022515015; Country of ref document: JP; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2020354896; Country of ref document: AU; Date of ref document: 20200827; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 20227009551; Country of ref document: KR; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2020768788; Country of ref document: EP; Effective date: 20220425)