US20230218146A1 - Systems, apparatuses, and methods for endoscopy - Google Patents
Systems, apparatuses, and methods for endoscopy
- Publication number
- US20230218146A1 (application Ser. No. 17/572,332)
- Authority
- US
- United States
- Prior art keywords
- imaging
- cavity
- endoscope
- classification
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
- A61B5/067—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe using accelerometers or gyroscopes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00052—Display arrangement positioned at proximal end of the endoscope body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/307—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the urinary organs, e.g. urethroscopes, cystoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/00234—Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
- A61B2017/00238—Type of minimally invasive operation
- A61B2017/00274—Prostate operation, e.g. prostatectomy, turp, bhp treatment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/306—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/309—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30081—Prostate
Definitions
- the present disclosure relates to systems, apparatuses, and methods utilizing an endoscopic imaging system. More specifically, the present disclosure relates to systems, apparatuses, and methods utilizing an endoscopic system for enabling various endoscopic procedures (medical and non-medical).
- various technologies disclosed herein enable prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions in various mammalian pathology, such as human prostate pathology including but not limited to benign prostatic hyperplasia (BPH) or other human or non-human medical conditions, or non-medical procedures.
- BPH benign prostatic hyperplasia
- LUTS lower urinary tract symptoms, including nocturia, frequency, urgency, hesitancy, incomplete emptying, leakage, and dribbling
- Although BPH is rarely life-threatening, it can lead to numerous clinical conditions including urinary retention, renal insufficiency, recurrent urinary tract infections, incontinence, hematuria, and bladder stones.
- early intervention may sometimes be recommended to improve patient outcomes and quality of life.
- Surgical treatments for BPH range from minimally invasive techniques, such as prostatic urethral lift devices and various ablation methods, through more invasive resection surgeries, to fully invasive prostatectomy.
- These surgeries involve cutting or ablating tissue near delicate structures, such as the bladder and the verumontanum, the latter being critical for male sexual function. They therefore require extensive practice with cystoscopic methods to identify the delicate structures and to estimate the treatment areas proximate to them.
- Transurethral prostatic procedures involve tissue examination of the bladder and urethral mucosa with a specialty endoscope called a cystoscope.
- a physician will expand the urethra and bladder with a clear fluid to visualize the mucosal surface of the bladder and urethra.
- the treatment area, typically the region distal to the bladder neck and proximal to the verumontanum, is then identified. Once the treatment area is identified, the physician can apply treatment to one or more of the lateral, medial, and anterior prostatic lobes.
- TWVT transurethral water vapor therapies
- PUL Prostatic urethral lift
- PUL devices are permanent, implantable fixation devices similar to tacks or anchors that aim to create channels in one or more of the prostatic lobes between the bladder neck and the verumontanum to reduce obstruction and improve flow.
- typically four to five implants are required for an average-sized prostate to achieve an ideal opening, but upwards of ten implants may be necessary for large and/or abnormally shaped prostates.
- the first implant is placed approximately 2 cm distal to the bladder neck, the second implant is placed just anterior to the verumontanum, and additional implants are placed in between to form a continuous channel, typically through the anterior and lateral aspects of the prostate.
- Each implant is housed in a disposable cartridge that must be replaced after the implant is deployed. Thus, the effector handle, cartridge, and cystoscope are removed to replace the cartridge and introduce a new implant.
- an imaging unit for an endoscopic procedure comprises a housing and a display integrated into the housing.
- An imaging coupler is configured for receiving imaging information from an imaging assembly of an endoscope having a field of view (FoV) comprising at least a portion of an end effector and a portion of a region of interest (ROI).
- An imaging processor is configured with instructions to process the received imaging information into pixel values representing an image of a time series and to display the image in real-time on the display, while a motion sensor is configured to detect a motion of the housing during the time series.
- the imaging unit comprises a detection processing unit (DPU) configured with instructions to: classify at least one anatomical feature in each image of the time series based on an artificial intelligence classifier; determine a confidence metric of the classification; determine a motion vector based on the detected motion; and display, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
- DPU detection processing unit
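- As a rough illustration of the concurrent-display behavior just described, the following Python sketch pairs a per-frame classification and confidence with a housing motion estimate. The classifier stub, frame rate, and field names are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Sequence, Tuple

@dataclass
class FrameResult:
    label: str                           # classified anatomical feature
    confidence: float                    # confidence metric in [0, 1]
    motion: Tuple[float, float, float]   # housing motion vector (dx, dy, dz)

def classify(image: Sequence[float]) -> Tuple[str, float]:
    """Placeholder classifier: a real unit would run the AI model here."""
    return ("bladder_neck", 0.93)

def motion_vector(gyro, accel, dt: float) -> Tuple[float, float, float]:
    """Crude motion estimate: treat acceleration as constant over dt."""
    return tuple(0.5 * a * dt * dt for a in accel)

def process_frame(image, gyro, accel, dt=1 / 30) -> FrameResult:
    label, conf = classify(image)
    return FrameResult(label, conf, motion_vector(gyro, accel, dt))

if __name__ == "__main__":
    result = process_frame(image=[0.0], gyro=(0, 0, 0), accel=(0.1, 0.0, 0.0))
    # In the described unit, these values would be drawn over the live image.
    print(result)
```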
- the DPU is configured to display the displacement vector concurrently with the corresponding classification.
- the detection processing unit is configured to display the displacement vector relative to one or more classified anatomical features.
- the detection processing unit is configured to display a plurality of displacement vectors, each one relative to a unique classified anatomical feature.
- the detection processing unit determines the confidence metric based on the comparison.
- the detection processing unit is configured to identify at least one treatment site based on the at least one classified anatomical feature, and display, concurrently with the corresponding image, the at least one identified treatment site and a relative motion vector between the classified anatomical feature and the identified treatment site.
- the region of interest includes at least a prostatic urethra, and the administered therapy includes prostatic treatment.
- the detection processing unit is further configured to classify a prostatic pathology.
- a method for endoscopic imaging includes operatively coupling an imaging coupler of an imaging unit to an observation port of an endoscope.
- Imaging information is received from an imaging assembly of the endoscope.
- the imaging assembly has an FoV comprising at least a portion of an end effector and a portion of a ROI.
- the received imaging information is processed into pixel values representing an image of a time series.
- the images are displayed in real-time on a display integrated into the housing of the imaging unit, and motion of the housing is detected during the capture of the time series. At least one anatomical feature is classified in each image of the time series based on an artificial intelligence classifier.
- the method further includes displaying the displacement vector relative to one or more classified anatomical features.
- the method further includes displaying a plurality of displacement vectors, each one relative to a unique classified anatomical feature.
- the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
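- Since the disclosure names a convolutional neural network but no particular architecture, the following PyTorch sketch shows one plausible shape for such a classifier. The layer sizes and feature labels are illustrative assumptions.

```python
import torch
import torch.nn as nn

FEATURES = ["bladder_neck", "verumontanum", "prostatic_urethra", "other"]

class AnatomyCNN(nn.Module):
    def __init__(self, n_classes: int = len(FEATURES)):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):                    # x: (N, 3, H, W)
        z = self.backbone(x).flatten(1)
        return self.head(z)                  # raw class scores

model = AnatomyCNN().eval()
frame = torch.rand(1, 3, 224, 224)           # one image of the time series
with torch.no_grad():
    probs = model(frame).softmax(dim=1)
conf, idx = probs.max(dim=1)                 # confidence metric + class index
print(FEATURES[idx.item()], float(conf))
```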
- the method further includes identifying at least one treatment site based on the at least one classified anatomical feature; and displaying, concurrently with the corresponding image, the at least one identified treatment site and the determined motion vector.
- the method further includes classifying a prostatic pathology.
- a kit for an endoscopic therapeutic procedure includes an endoscopic imaging unit which comprises a housing and a display integrated into the housing.
- the endoscopic imaging unit includes an imaging coupler configured for receiving imaging information from an imaging assembly of an endoscope having a field of view (FoV) comprising at least a portion of an end effector and a portion of a region of interest (ROI).
- the endoscopic imaging unit includes an imaging processor; a motion sensor configured to detect a motion of the housing during the time series; and a detection processing unit (DPU).
- the kit includes instructions to perform a method for endoscopic imaging.
- the method includes the steps of: operatively coupling the imaging coupler of the imaging unit to an observation port of an endoscope; receiving the imaging information from the imaging assembly; processing the received imaging information into pixel values representing an image of a time series; displaying the image in real-time on the display; detecting motion of the housing during the time series; classifying at least one anatomical feature in each image of the time series based on an artificial intelligence classifier; determining a confidence metric of the classification; determining a motion vector based on the detected motion; and displaying, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
- the step of detecting motion further includes generating a gyroscopic and acceleration signal associated with the motion of the housing, and determining a displacement vector based on at least the gyroscopic signal and the acceleration signal.
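- A minimal dead-reckoning sketch of that step might look like the following, where acceleration is integrated twice for displacement and angular rate once for orientation. Real units would add bias correction and sensor fusion, so treat this as an assumption-laden simplification.

```python
import numpy as np

def displacement(accel: np.ndarray, gyro: np.ndarray, dt: float):
    """accel, gyro: (N, 3) samples over the time series; dt: sample period."""
    velocity = np.cumsum(accel * dt, axis=0)      # first integration
    position = np.cumsum(velocity * dt, axis=0)   # second integration
    orientation = np.cumsum(gyro * dt, axis=0)    # integrated angles (rad)
    return position[-1], orientation[-1]          # net displacement, rotation

accel = np.tile([0.01, 0.0, 0.0], (300, 1))       # 300 samples at 100 Hz
gyro = np.zeros((300, 3))
disp, rot = displacement(accel, gyro, dt=0.01)
print(disp, rot)
```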
- the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
- a method may comprise: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- a method may comprise: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and taking, by the processor, an action based on the classification, the confidence metric, and the motion vector.
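- The "taking an action" step is left open by the claims. A hypothetical gating function over the three computed quantities could look like this, with threshold values chosen purely for illustration.

```python
import math

def choose_action(label: str, confidence: float, motion: tuple) -> str:
    """Decide on an action from classification, confidence, and motion."""
    speed = math.sqrt(sum(c * c for c in motion))
    if confidence < 0.5:
        return "warn: low-confidence classification"
    if label == "verumontanum" and speed > 0.005:   # meters per frame
        return "warn: moving quickly near delicate anatomy"
    return "display: " + label

print(choose_action("verumontanum", 0.92, (0.004, 0.004, 0.0)))
```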
- FIG. 1 is a structural diagram of an embodiment of a portable system
- FIG. 2 is a structural diagram of an embodiment of the portable system disposed in a region of interest
- FIG. 3 is a block diagram of an embodiment of a wireless imaging unit of the portable system
- FIG. 4 is a network diagram of an embodiment of the portable system
- FIGS. 5A-5D are endoscopic views of a region of interest as displayed on the portable system;
- FIGS. 6A-6B are flowcharts of embodiments for training an anatomical model
- FIG. 8 is a structural diagram of a kit for performing an endoscopic procedure.
- a term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
- Although the terms first, second, and others can be used herein to describe various elements, components, regions, layers, or sections, these elements, components, regions, layers, or sections should not necessarily be limited by such terms. Rather, these terms are used to distinguish one element, component, region, layer, or section from another element, component, region, layer, or section. As such, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from this disclosure.
- FIG. 1 shows the main components of a portable system 10 used during an endoscopic procedure, which may be a diagnostic or therapeutic procedure (or another type of procedure whether medical or non-medical).
- An endoscope 12 is inserted into a patient 14 (e.g., a mammal, a human, an animal, a pet, a bird, a fish, a male, a female) to a region of interest (ROI) 16 , such as a tissue, an organ, a body part, or any other in vivo feature, although non-medical uses may employ non-patients or inanimate objects, such as tubes, cavities, tunnels, crevices, bores, channels, or other relevant non-patient or inanimate ROIs.
- ROI region of interest
- the region of interest 16 is illuminated by an external light source 18 which directs incident light along an illumination pathway, such as an optical fiber that extends along a tube of the endoscope 12 to an illumination lens at a distal tip 14 .
- the illuminated region of interest 16 reflects the incident light back to an imaging lens at the distal tip 14 to convey the reflected light along an imaging pathway, such as an optical fiber to an observation port 20 , such as an eyepiece.
- the reflected light is received by a wireless imaging unit (WIU) 22 via the observation port 20.
- the WIU 22 may include a digital imaging sensor that converts the reflected light into imaging data which can then be processed and displayed on a display 24 .
- the endoscope 12 may be a digital endoscope with a chip-on-a-tip arrangement.
- the endoscope 12 may include one or more light-emitting diodes (LEDs) disposed at the distal tip 14 for illuminating the ROI 16 .
- LEDs light-emitting diodes
- the distal tip 14 may also include the digital imaging sensor for generating the imaging data of the ROI 16 .
- a communication pathway along the tube of the endoscope 12 may transmit and receive control signals for controlling the LEDs and the digital imaging sensor instead of the illumination and imaging pathways.
- the WIU 22 may receive the imaging data from the digital imaging sensor via the observation port 20 .
- the observation port 20 serves as an optical observation port, such as an eyepiece, while in another embodiment, the observation port 20 may take the form of a digital interface, such as a digital connector for conveying imaging data electronically.
- the observation port 20 interfaces with the WIU 22 via an imaging coupler 26 .
- the imaging coupler 26 optically couples the WIU 22 to the observation port 20 of the endoscope 12 .
- the imaging coupler 26 digitally couples the WIU 22 to a digital observation port 20 via an electrical connector with various data channels and/or electrical channels for controlling the LEDs and/or digital imaging sensor at the distal tip 14 .
- the WIU 22 includes a housing 28 which is configured to integrate the observation port 20, display 24, imaging coupler 26, and light source 18 into a single device, while protecting various internal components, such as, but not limited to, electronic circuit components, power source, thermal management, and the like.
- the portable system 10 includes a therapeutic device 30 configured to be disposed in vivo into the ROI 16 in tandem with the endoscope 12 to administer a therapy (or another action or technique) therein.
- this may include prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions via or while the endoscope 12 is disposed in vivo into the ROI 16 .
- the device 30 may be suitably labeled/configured (e.g., the diagnosis device 30 , the forecasting device 30 , the prevention device 30 , and so forth). In situations that are non-medical, the device 30 is suitably configured as well.
- the therapeutic device 30 includes an end effector 32 which delivers the therapy (or another action or technique) and includes an actuator 34 for initiating the delivery of the therapy (or another action or technique).
- the ROI 16 includes at least a prostatic urethra 40 , the prostate 42 , and the bladder 44 , although this is illustrative and other body parts, organs, or tissues may be used (or inanimate ROI 16 may be used for non-medical uses).
- the therapeutic device 30 is configured to administer therapies to treat medical conditions associated with prostatic pathologies, such as, but not limited to, benign prostatic hyperplasia (BPH) and the like, although non-prostatic pathologies may be used as well.
- BPH benign prostatic hyperplasia
- the therapeutic device 30 may be configured to administer one or more of the following therapeutic treatments: resection, incision, ablation, thermotherapy, enucleation, implantation, cryotherapy, vapor therapy, embolization, and the like. While in the illustrated embodiment the therapeutic device 30 is shown with a handle 36 and actuator 34, it should be appreciated that the therapeutic device 30 may embody various shapes, sizes, and designs specified by the delivered therapy. For example, although the therapeutic device 30 is embodied as pistol-shaped via the handle 36, this form factor is not required and other form factors may be used.
- the handle 36 may be omitted and the actuator 34 may be embodied differently than a lever pivoting toward or away from the handle 36 (e.g., a pressable/depressable button, a rotary knob, a rotating sleeve).
- the WIU 22 is capable of wireless (e.g., radio frequency, line of sight) communication 46 , such as high-speed bi-directional data communications directly (or indirectly) to one or more external devices 48 simultaneously or substantially simultaneously.
- the external devices 48 are capable of directly (or indirectly) receiving data, such as digital images, digital video, or other information pertaining to the therapeutic procedure.
- the external device 48 can also directly (or indirectly) transmit control data or signals to the WIU 22 to remotely control the WIU 22 .
- the external device 48 can also transmit therapeutic (or other action or procedure) information regarding the therapeutic (or other action or technique) procedure, such as patient data in the form of electronic medical records (EMR) or procedure data such as instructions for performing the procedure.
- EMR electronic medical records
- Examples of external devices 48 may include personal computing devices such as desktop computers; portable devices such as smart devices, smartphones, personal digital assistants, tablet computers, wrist-mounted displays, smartwatches, or the like; laptops or portable computers; head-mounted displays; or other computing devices not yet contemplated.
- the portable system 10 is configured for BPH therapy, although BPH or non-BPH prevention, forecasting, diagnosis, amelioration, monitoring, or treatment is possible.
- BPH therapy typically involves reducing the effect an enlarged prostate 42 has on the prostatic urethra 40.
- the therapeutic device 30 is configured to deploy prostatic urethral lift (PUL) implants at various treatment sites along the prostatic urethra 40 to lift and pull prostatic tissue away from a urethral channel 50 to improve flow from, for example, the bladder 44 .
- PUL prostatic urethral lift
- the locations of the treatment sites are typically chosen at the discretion of the practitioner performing the procedure based on subjective criteria, such as the degree of the achieved lifting visualized through the endoscope 12 .
- This subjectivity may result in non-optimal placement of the PUL implants, which can result in the costly deployment of excess implants; insufficient deployment of implants, such that the patient does not achieve the desired outcome; or improper placement damaging sensitive anatomy, such as the verumontanum, or piercing through the bladder neck, resulting in unintended consequences such as sexual and/or bladder dysfunction, infection, and the like.
- Due to the subjective nature of PUL implantation, practitioners have to undergo significant training and supervision to perform the procedure adequately. Regardless of the training a practitioner receives, the procedure still may not be performed optimally for long-lasting results.
- the practitioner introduces the distal tip 14 of the endoscope 12 to identify the ROI 16 .
- the practitioner may concurrently introduce the end effector 32 of the therapeutic device 30 while identifying the ROI 16 , or subsequently after the ROI 16 is identified.
- the practitioner observes real-time imaging information on the display 24 which is received by an imaging assembly 52 from the ROI 16 illuminated by incident light by a light emitter 54 .
- the imaging assembly 52 detects reflected light from the ROI 16 within a field of view (FoV) 56 that includes at least a portion of the ROI 16 and a portion of the end effector 32 of the therapeutic device 30.
- FoV field of view
- the practitioner identifies a treatment region 58 between the bladder neck 60 and the verumontanum 62 so as not to damage these delicate anatomical features.
- the bladder neck 60 is a group of muscles that connect the bladder to the urethra and is primarily tasked with holding urine in the bladder; if damaged, it can lead to incontinence and other issues.
- the verumontanum 62 is an elevation in the floor of the prostatic urethra that is an important landmark which helps identify the entrance of the ejaculatory ducts.
- the practitioner retraces their movements to approximately 2 cm distal to the bladder neck 60 to a proximal treatment site 64a, 64b to achieve an adequate proximal opening.
- this proximal treatment site 64a, 64b differs greatly among patients based primarily on their specific prostatic anatomies, such as shape, size, density, and the like. If the bladder neck 60 is damaged, then such damage can lead to incontinence, bladder leakage, and other issues. If a site is chosen too proximal to the bladder neck 60, then the practitioner may pierce the bladder neck 60 and cause such dysfunction.
- a practitioner may also identify medial treatment sites 68a, 68b that achieve a continuous channel through an anterior aspect therebetween. Creating a channel through the anterior aspect of the prostate is typically chosen because it is generally formed of fibromuscular tissue and is generally devoid of sensitive glandular tissue.
- additional areas of persistent obstruction are identified by the practitioner and additional implants are deployed at these medial treatment sites 68a, 68b.
- the entire extent of the prostate should be analyzed to identify the size and shape of the patient's prostate, e.g., tall, long, short, obstructive lobes, and the like.
- the practitioner can simulate the desired anterior channel.
- this iterative approach can result in implantation errors when the practitioner tries to revisit these identified optimal sites without a point of reference.
- the portable system 10 aims to minimize implantation errors by identifying and tracking various anatomical features 40, 42, 44, 60 in the ROI 16 and determining optimal treatment sites 64a, 64b, 66a, 66b, 68a, 68b based on an artificial intelligence model trained to identify anatomical features in the ROI 16 and a tracking system configured to track a motion vector of the WIU 22 and thus track the motion of the distal tip 14 of the endoscope 12 and/or the end effector 32 of the therapeutic device 30.
- By identifying an optimal treatment site based on a unique patient's anatomy, the therapeutic procedure can achieve enduring results while keeping costs low by increasing efficiency, reducing procedure time, and reducing non-optimal implantation errors.
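- One way to picture the tracking described above is as a running position estimate against which landmark and treatment-site coordinates are recorded, so a relative motion vector back to any recorded site is just a subtraction. The class below is a self-contained sketch under that assumption, not the patent's tracking system; the names and coordinate frame are hypothetical.

```python
import numpy as np

class SiteTracker:
    def __init__(self):
        self.position = np.zeros(3)              # current tip estimate
        self.sites: dict[str, np.ndarray] = {}   # recorded landmarks/sites

    def move(self, delta: np.ndarray):
        self.position = self.position + delta    # accumulate endoscope motion

    def mark(self, name: str):
        self.sites[name] = self.position.copy()  # record site in unit frame

    def vector_to(self, name: str) -> np.ndarray:
        return self.sites[name] - self.position  # relative motion vector

t = SiteTracker()
t.mark("bladder_neck")
t.move(np.array([0.0, 0.0, -0.02]))              # withdraw 2 cm
t.mark("proximal_site")
print(t.vector_to("bladder_neck"))               # guidance back to landmark
```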
- the previously mentioned electronic circuitry of the WIU 22 includes a system controller 70 which is configured to control and power the WIU 22 .
- the system controller 70 includes a plurality of circuit components that are responsible for controlling aspects of the WIU 22 .
- the system controller 70 includes a microprocessor 72 which interfaces with several electronic components to send and receive instructions to control various aspects of the WIU 22 functions.
- the system controller 70 includes a storage device 74 which is a memory device, such as a computer-readable medium (e.g., persistent memory, flash memory, embedded memory, ROM, RAM) for storing program instructions to be executed by the microprocessor 72 .
- the system controller 70 also includes an illumination controller 76 which receives instructions from the microprocessor 72 to adjust the intensity or brightness of the incident light from the light emitter 54 and/or one or more frequency components of the incident light produced therefrom.
- the light emitter 54 may include one or more LEDs for generating incident light in the ROI 16 , or it may be optically coupled to the light source 18 for transmitting incident light thereto.
- An imaging processing unit (IPU) 80 may include instructions or is configured to execute instructions stored on the storage device 74 to perform various imaging-related functions.
- the IPU 80 is configured to receive the imaging information from the imaging assembly 52 of the FoV 56 .
- the imaging assembly 52 can be an optical assembly that directs reflected light from the ROI 16 to the observation port 20 (e.g., eyepiece) of the endoscope 12.
- the WIU 22 includes an imaging sensor 90 integrated into the housing 28 and in direct communication with the IPU 80 .
- the imaging assembly 52 comprises the imaging sensor 90 and is disposed at the distal tip 14 of the endoscope 12. In this arrangement, the imaging sensor 90 transmits at least one of analog signals, digital signals, or a combination of analog and digital signals pertaining to the imaging information to the observation port 20, which can then be transmitted to the IPU 80.
- the imaging sensor 90 may include one of the following: complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), or other imaging sensor devices developed in the future not yet contemplated.
- CMOS complementary metal-oxide-semiconductor
- CCD charge-coupled device
- the imaging information can be one of a digital signal or an analog signal that is converted to a digital signal by an analog-to-digital converter (ADC) of the IPU 80 to form pixel values representing an image of a time series of the FoV 56 .
- ADC analog-to-digital converter
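- A toy illustration of that ADC step follows: quantizing a normalized analog signal into 8-bit pixel values forming one image of the time series. The bit depth and signal range are assumptions.

```python
import numpy as np

def digitize(analog: np.ndarray, bits: int = 8) -> np.ndarray:
    """Map analog samples in [0.0, 1.0] onto integer pixel values."""
    levels = 2 ** bits - 1
    return np.clip(np.round(analog * levels), 0, levels).astype(np.uint8)

analog_frame = np.random.rand(480, 640)   # simulated sensor readout
pixels = digitize(analog_frame)           # 8-bit grayscale image
print(pixels.dtype, pixels.min(), pixels.max())
```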
- the IPU 80 may also be configured to perform several image processing and post-processing functions in real-time or substantially real-time on the images of the time series. Examples of image processing techniques to enhance the images include edge detection, geometric transformations, perspective correction, color correction, color calibration, motion compensation, data compression, noise reduction, filtering, and the like.
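- As an illustration of a few of the enhancement steps just listed, the following OpenCV sketch chains noise reduction, contrast and color adjustment, and edge detection. The ordering and parameter values are assumptions, not those of the disclosure.

```python
import cv2
import numpy as np

def enhance(frame_bgr: np.ndarray) -> np.ndarray:
    denoised = cv2.GaussianBlur(frame_bgr, (5, 5), 0)       # noise reduction
    lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB)         # color space for correction
    l, a, b = cv2.split(lab)
    l = cv2.createCLAHE(clipLimit=2.0).apply(l)             # local contrast
    return cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)

def edges(frame_bgr: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Canny(gray, 50, 150)                         # edge detection

frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
print(enhance(frame).shape, edges(frame).shape)
```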
- the IPU 80 may also be configured to control the functionality of the imaging sensor 90 , such as adjusting a focal depth by controlling an integrated autofocus mechanism, pixel clock, sensitivity, offset, signal amplification, gain, gamma, and the like.
- the IPU 80 may also be configured to adjust the image size that is displayed on an external device 48 due to the difference in screen resolution and screen size between external devices 48 and the display 24 .
- the IPU 80 may also be configured to automatically align the images such that the images are centered in the display 24 independent of the size and/or resolution of the display 24 being used whether it is the display 24 or a display of an external device 48 .
- the IPU 80 receives the display information from the microprocessor 72 and formats the output image correspondingly. Post-processed images can then be stored on an image memory 82 for later retrieval to be viewed locally on the display 24 or on the external device 48.
- the WIU 22 can also be configured with a wireless transceiver 100 to communicate with the external device 48 directly via the wireless connection 46 or indirectly via the Internet 102 with remote devices 104 , an institutional server 106 , cloud storage system 108 , and the like.
- the transceiver 100 can be omitted and replaced by a separate receiver and transmitter, or by only a receiver or only a transmitter.
- the remote devices 104 may be configured for viewing endoscopic imagery, video, or examination data or for remotely receiving controls and/or EMR data.
- the EMR data is a collection of patient and population health information electronically stored in a digital format.
- the EMR data may include a range of patient information such as demographics, medical history, medication, allergies, immunization status, laboratory test results, radiology images, vital signs, personal statistics, billing information, and the like.
- the EMR data can be stored on the institutional server 106 , such as those located at a hospital, insurance company, government entity, or the like.
- the EMR data can also be stored on the cloud storage system 108.
- the cloud storage system 108 can be a data storage system that includes logical pools of physical storage mediums that span multiple servers and often multiple discrete locations in a distributed fashion to ensure redundancy, fault tolerance, and durability of the data.
- the institutional server 106 and cloud storage system 108 can include a picture archiving and communication system (PACS) which is capable of providing storage and access to medical images from multiple modalities using a universal image format such as Digital Imaging and Communications in Medicine (DICOM) format.
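- As a hedged illustration of such DICOM interoperability, the following sketch reads one stored image with the open-source pydicom package; the file path is hypothetical and the snippet is not part of this disclosure.

```python
# Illustrative sketch only: reading a DICOM object as a PACS client might.
import pydicom

ds = pydicom.dcmread("study/frame_0001.dcm")  # hypothetical path
print(ds.PatientID, ds.Modality)  # standard DICOM attributes
pixels = ds.pixel_array           # decoded pixel data as a NumPy array
print(pixels.shape)
```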
- the health institution server 106 and cloud storage system 108 are compliant with data protection and privacy regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States of America, General Data Protection Regulation (GDPR) in the European Union, Personal Information Protection and Electronic Documents Act (PIPEDA) in Canada, National Health Portal compliance set by the Insurance Regulatory and Development Authority of India (IRDAI), or other compliance regulations mandated globally.
- the WIU 22 is wirelessly coupled to a local network 110 via a wireless access point 112 using a suitable wireless transmission protocol, such as the IEEE 802.11 family of modulation techniques, IEEE 802.15.4a ultra-wideband (UWB), Bluetooth, and the like, although a suitable wired or waveguide connection/hardware is also possible.
- the local network 110 may include cables, switches, and routers that may utilize Ethernet standards for communication.
- At least one institutional server 106 may be in communication with the local network 110 .
- the institutional server 106 may store or have access to EMR data which may be accessed by the WIU 22 .
- the local network 110 may be attached to a picture archiving and communication system (PACS) 114 which may be in communication with the institutional server 106 and the WIU 22.
- At least one external device 48 is in communication with the local network 110 either directly via a physical connection or wirelessly via the wireless access point 112.
- a firewall 116 or other network security technology may be connected to the local network 110 to control access to the Internet 102 .
- a remote device 104 may be authorized to access the local network 110 via the Internet 102 utilizing a secure connection facilitated by the firewall 116 .
- the cloud storage system 108 may be configured to store or retrieve data and may be accessed via the Internet 102 which is facilitated by the firewall 116 .
- the WIU 22 includes a motion processing unit (MPU) 120 which may include instructions or may be configured to execute instructions stored on the storage device 74 to receive motion signals from a motion sensor 122.
- the motion sensor 122 includes at least one of a gyroscopic sensor configured to generate gyroscopic signals and an accelerometer configured to generate acceleration signals.
- the motion signals (e.g., the gyroscopic and acceleration signals) capture the motion of the housing 28 during the therapeutic (or another action or technique) procedure, which can be used to estimate the motion of the distal tip 14 and/or the end effector 32.
- the WIU 22 includes a detection processing unit (DPU) 130 which may include instructions or may be configured to execute instructions stored on the storage device 74 to perform various detection-related functions. For example, these instructions may enable the DPU 130 to apply an artificial intelligence classifier (AIC), based on an anatomical model of the ROI 16, to the images of the time series to classify at least one anatomical feature in each image of the series.
- the AIC may be based on an artificial neural network (ANN), which may include a convolutional neural network (CNN), a recurrent neural network (RNN), or other suitable ANNs.
- the storage device 74 may locally store the AIC, which may enable edge computing. This may be technologically advantageous in environments with poor or no network connectivity.
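- For readers unfamiliar with CNN-based classifiers, a minimal sketch of the kind of compact model that could run at the edge is given below, written with the PyTorch library; the architecture and the five-class output are assumptions for illustration and do not describe the disclosed AIC.

```python
# Illustrative sketch only: a small CNN suitable for on-device inference.
import torch
import torch.nn as nn

class AnatomyClassifier(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse to a 32-dim descriptor
        )
        self.head = nn.Linear(32, num_classes)  # one logit per anatomical class

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))
```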
- the ROI 16 includes the following anatomical features: the prostatic urethra 40, prostate 42, bladder 44, verumontanum 62, and the bladder neck 60.
- other urinary tract anatomical features are also contemplated, such as but not limited to the penile urethra, membranous urethra/external urinary sphincter, bulbous urethra, median lobe, lateral lobes, ureteral orifice, ureterovesical junction, ureter, ureteropelvic junction, renal pelvis, right/left ureteral orifice, infundibulum, calyx, and the like.
- the anatomical features may also include pathologies, such as, but not limited to, hypertrophy, trabeculations, tumors/lesions, calculus, diverticulum, and the like. Likewise, as disclosed herein, the classified features may or may not be anatomical, whether medical or non-medical.
- the DPU 130 receives each image of the time series from the IPU 80, compares each received image to the anatomical model, and determines a confidence metric based on the comparison, each of which may occur in real-time or substantially in real-time.
- the DPU 130 includes instructions for a neural network based AIC, such as a deep learning convolutional neural network (DLCNN); however, it should be appreciated that other classifiers are also contemplated, such as, but not limited to, a perceptron, a Naïve Bayes classifier, decision trees, logistic regression, K-Nearest Neighbors, a support vector machine, a CNN, an RNN, and the like.
- the AIC initially trains the anatomical model based on individual frames of previously captured time series from similar and/or adjacent ROIs. The training can be performed on the WIU 22 itself; however, the anatomical model can also be trained on an external device 48, a remote device 104, the institutional server 106, or the cloud storage system 108.
- the trained anatomical model can then be transferred to the working memory of the DPU 130 or the storage device 74 via the wireless transceiver 100 .
- This may enable edge computing.
- the DPU 130 compares each image of the time series to the trained anatomical model in real-time or substantially in real-time and determines in real-time or substantially in real-time a classification for each image and a confidence metric based on the comparison.
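- One common way to obtain both a classification and a confidence metric from such a model is to take the argmax and the maximum of a softmax over the class logits; the sketch below assumes the hypothetical AnatomyClassifier above and is illustrative only.

```python
# Illustrative sketch only: per-frame classification with a softmax
# probability used as the confidence metric.
import torch

@torch.no_grad()
def classify_frame(model, frame_tensor):
    """frame_tensor: (1, 3, H, W) float tensor for one time-series image."""
    probs = torch.softmax(model(frame_tensor), dim=1)
    confidence, class_idx = probs.max(dim=1)
    return class_idx.item(), confidence.item()
```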
- the DPU 130 is configured to instruct the IPU 80 to display, concurrently with the corresponding image, the classified anatomical features on the display 24 .
- the DPU 130 also receives the motion signals from the MPU 120 in real-time or substantially in real-time and determines a motion vector of the housing 28 in real-time or substantially in real-time which can then be used to estimate a motion vector of the distal tip 14 of the endoscope 12 and/or end effector 32 of the therapeutic device 30 in real-time or substantially in real-time.
- the motion vector can be a displacement vector, an acceleration vector, a velocity vector, a rotation vector, or the like.
- the DPU 130 can estimate in real-time or substantially in real-time a displacement and direction of portions of the portable system 10 disposed within the ROI 16 based on the detected motion of the housing 28 .
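- A naive version of such an estimate integrates the acceleration signal twice over time; the NumPy sketch below shows the idea, with the caveat that a practical MPU would also fuse the gyroscopic signal and correct for drift. Names and units are assumptions.

```python
# Illustrative sketch only: dead reckoning a net displacement vector
# from accelerometer samples at a fixed sample period.
import numpy as np

def displacement_from_accel(accel: np.ndarray, dt: float) -> np.ndarray:
    """accel: (N, 3) acceleration samples in m/s^2; dt: sample period in s."""
    velocity = np.cumsum(accel * dt, axis=0)         # first integration
    displacement = np.cumsum(velocity * dt, axis=0)  # second integration
    return displacement[-1]                          # net displacement vector
```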
- the DPU 130 is configured to instruct the IPU 80 to display, concurrently with the corresponding image, the determined motion vector and/or classified anatomical features on the display 24 in real-time.
- the DPU 130 can be configured to identify in real-time or substantially in real-time a treatment region 58 within the ROI 16 .
- the treatment region 58 may be a region proximal to the verumontanum 62 and distal to the bladder neck 60 , thus, avoiding those delicate anatomical features.
- the DPU 130 may also be configured to determine in real-time or substantially in real-time the one or more treatment sites 64 a , 64 b , 66 a , 66 b , 68 a , 68 b .
- the DPU 130 is configured to instruct the IPU 80 to display, concurrently with the corresponding image, the determined motion vector, the classified anatomical features 40 , 42 , 44 , 60 , 62 , and/or the determined treatment sites 64 a , 64 b , 66 a , 66 b , 68 a , 68 b on the display 24 in real-time or substantially in real-time.
- endoscopic views of the ROI 16, as displayed on the display 24, are depicted for the exemplary embodiment of a therapeutic procedure.
- the practitioner will view the entire ROI 16 to classify the anatomical features 40, 42, 44, 60, 62 for the specified therapeutic procedure.
- the practitioner may interact (e.g., by touch) with a user interface displayed on the display 24 via the touchscreen 78 to select the desired therapeutic procedure, although a default procedure may be preselected or no default procedure may be selected at all.
- the DPU 130 then identifies the desired anatomical model based on the user input and retrieves the anatomical model from any one of the storage device 74, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108.
- the practitioner initiates the procedure via a touch command through the user interface or by an external command from an assistant.
- the microprocessor 72 instructs the IPU 80 to begin collecting images of the time series and instructs the MPU 120 to begin collection of the motion signals of the housing 28.
- the microprocessor 72 may also retrieve procedure data and/or EMR data from one of the storage device 74, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 and display the procedure data on the display 24 for the practitioner to review before commencing the procedure.
- the practitioner commences the procedure by introducing the portable therapeutic system into the ROI 16 to image the entirety of the ROI 16 as prescribed by the procedure data.
- the practitioner can choose whether to perform the procedure manually based on the displayed classified anatomical features 140 , the displayed motion vector 142 , and the displayed confidence metric 144 ; or the practitioner may choose to perform the procedure in a semi-automated fashion based on the determined treatment region 58 and the determined treatment sites 64 a , 64 b , 66 a , 66 b , 68 a , 68 b which will be described in greater detail below.
- the practitioner may rely on a relative motion vector 146 such as, for example, from the bladder neck 60 to apply treatment to the proximal treatment sites 64 a, 64 b.
- the practitioner will introduce the distal tip 14 and end effector 32 into the ROI 16 until the bladder 44 and bladder neck 60 are displayed as the classified anatomical feature 140 in real-time or substantially in real-time (for example, as a textual indicator indicating the corresponding anatomical feature) and the displayed confidence metric 144 meets the practitioner's expectations, as illustrated in FIG. 5 A.
- the practitioner then may interact with the touchscreen 78 of the display 24 to initiate a relative motion vector 146 therefrom and retract the distal tip 14 and/or end effector 32 until the relative motion vector 146 displays an adequate displacement and/or rotation to locate an optimal location for the proximal treatment sites 64 a , 64 b as illustrated in FIG. 5 B .
- the practitioner will engage the actuator 34 to deploy the treatment thereto.
- the practitioner may repeat this process with the verumontanum 62 and the distal treatment sites 66 a , 66 b to deploy the treatment.
- the practitioner retracts the distal tip 14 and/or end effector 32 until the verumontanum 62 is displayed as a classified anatomical feature 140 in real-time or substantially in real-time. From there, the practitioner will protract the distal tip 14 and/or end effector 32 until the relative motion vector 146 displays in real-time or substantially in real-time an adequate displacement and/or rotation to locate an optimal location for the distal treatment sites 66 a , 66 b as illustrated in FIG. 5 C .
- the practitioner will repeat the process as necessary to apply therapy to the medial treatment sites 68 a, 68 b.
- the practitioner protracts the distal tip 14 and/or end effector 32 along the treatment region 58 to locate regions of excess occlusion and thereby identify one or more medial treatment sites 68 a, 68 b.
- the practitioner can rely on the relative motion vector 146 to ensure that subsequent medial treatment sites 68 a, 68 b are adequately spaced to achieve an optimal and continuous channel through the anterior aspect of the prostatic urethra 40 without creating unnecessary bulging adjacent to previously treated treatment sites 64 a, 64 b, 66 a, 66 b, 68 a, 68 b.
- the practitioner may rely on the automatically classified treatment region 58 and treatment sites 64 a , 64 b , 66 a , 66 b , 68 a , 68 b determined by the DPU 130 in real-time or substantially in real-time.
- the DPU 130 determines in real-time or substantially in real-time an optimal location for the treatment sites 64 a , 64 b , 66 a , 66 b , 68 a , 68 b .
- the practitioner reintroduces the distal tip 14 and end effector 32 into the ROI 16 until the displayed treatment site 146 is achieved in real-time or substantially in real-time with a confidence metric 144 deemed sufficient by the practitioner.
- the practitioner engages the actuator 34 to deploy the treatment thereto. Similar to the manual procedure, the practitioner first deploys treatment at the proximal treatment sites 64 a , 64 b , then the distal treatment sites 66 a , 66 b , and then to one or more medial treatment sites 68 a , 68 b until the continuous channel through the anterior aspect of the prostatic urethra 40 is achieved.
- in FIG. 6 A, a flowchart of an embodiment of a method 200 for training the anatomical model by the DPU 130 is depicted.
- the images of the time series captured in real-time or substantially in real-time and classified in real-time or substantially in real-time during the therapeutic (or another action or technique) procedure are stored in the image memory 82 with corresponding classification and confidence metric metadata (or another form of data organization) as training data, S 10 .
- the training data can be used to retrain (or reinforce or update) the corresponding anatomical model.
- the training data is binned by the DPU 130 based on a predetermined confidence metric threshold, S 12 .
- Classified images with a confidence metric that exceeds (or satisfies) the predetermined confidence metric threshold are deemed as high confidence images and can be used to retrain (or reinforce or update) the corresponding anatomical model without further intervention.
- classified images that do not meet (or satisfy) the confidence metric threshold are deemed low confidence images and are binned for further manual verification (e.g., via a physical or virtual keyboard) by a user via the display 24, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108.
- these devices can retrieve the low confidence images binned for manual verification from the image memory 82 via the local network 110 or the Internet 102.
- the DPU 130 is configured to retrieve the classified images binned as high confidence images, S 14 , and to retrain (or reinforce or update) the anatomical models, S 16 .
- the retrained (or reinforced or updated) anatomical model is then stored on the storage device 74 for future therapeutic (or other actions or techniques) procedures, S 18 .
- the retrained (or reinforced or updated) anatomical model may also be stored on one or more of the external device 102 , the remote device 104 , the institutional server 106 , or the cloud storage system 108 for future therapeutic (or other actions or techniques) procedures.
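- The confidence-based binning of steps S 12 through S 16 can be pictured with the short sketch below; the threshold value and the record format are assumptions for illustration only.

```python
# Illustrative sketch only: splitting classified frames into high- and
# low-confidence bins before retraining, as in steps S12-S16.
CONFIDENCE_THRESHOLD = 0.9  # assumed value; the disclosure leaves it predetermined

def bin_training_data(records):
    """records: iterable of (image, label, confidence) tuples."""
    high, low = [], []
    for image, label, confidence in records:
        (high if confidence >= CONFIDENCE_THRESHOLD else low).append((image, label))
    return high, low  # high -> automatic retraining; low -> manual verification
```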
- the portable system 10 includes a training processing unit (TPU) 150 disposed within at least one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 that performs the retraining (or reinforcement or updating) of the anatomical model.
- in FIG. 6 B, another embodiment of a method 202 for training the anatomical model by the TPU 150 is depicted.
- the training data is retrieved by the TPU 150 of any one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108, S 20.
- the TPU 150 bins the images of the training data into high confidence bins and low confidence bins based on the predetermined confidence metric threshold, S 22 .
- the TPU 150 is configured to retrieve the classified images binned as high confidence images, S 24 , and to retrain (or reinforce or update) the anatomical models, S 26 .
- the retrained (or reinforced or updated) anatomical model is then stored on a storage device of any one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 for future therapeutic (or other action or technique) procedures, S 28.
- the WIU 22 may retrieve the retrained (or reinforced or updated) anatomical model stored on any one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 and store the retrained (or reinforced or updated) anatomical model on the storage device 74 for future therapeutic (or other action or technique) procedures.
- a practitioner (e.g., a user, a physician, a technician) operatively couples the WIU 22 to the observation port 20 of the endoscope 12, S 30.
- the practitioner selects a desired therapeutic (or other action or technique) procedure, S 32 .
- the DPU 130 of the WIU 22 determines and retrieves the corresponding anatomical model of the desired therapeutic (or other action or technique) procedure from any one of the storage device 74, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108, S 34.
- the practitioner initiates the procedure via a touch command through the user interface and, in response, the microprocessor 72 instructs the IPU 80 to begin collecting imaging information and instructs the MPU 120 to begin collection of the motion signals of the housing 28, S 36.
- the WIU 22 receives imaging information of the FoV 56, which comprises at least a portion of the end effector 32 of the therapeutic (or other action or technique) device 30 and a portion of the ROI 16, S 38, and which is converted in real-time or substantially in real-time into images of a time series and displayed in real-time on the display 24.
- the DPU 130 detects in real-time or substantially in real-time the motion of the housing 28 and estimates in real-time or substantially in real-time a motion vector of the endoscope 12 and/or end effector 32 based on the detected motion signals, S 40 .
- the artificial intelligence classifier of the DPU 130 classifies in real-time or substantially in real-time at least one anatomical feature in each image of the time series, S 42.
- the DPU 130 instructs the microprocessor 72 to display, concurrently with the corresponding image of the time series, the classification of the at least one anatomical feature 140 , the determined confidence metric 144 , and the determined motion vector 142 , S 44 .
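- A concurrent display of this kind can be produced by drawing the classification, confidence metric, and motion vector onto each outgoing frame; the OpenCV-based sketch below is illustrative only and its names are assumptions.

```python
# Illustrative sketch only: overlaying classification, confidence, and
# motion-vector text on a frame before it is shown on the display.
import cv2

def annotate_frame(frame, label: str, confidence: float, motion_vec):
    text = f"{label}  conf={confidence:.2f}  motion={motion_vec}"
    cv2.putText(frame, text, org=(10, 30),
                fontFace=cv2.FONT_HERSHEY_SIMPLEX,
                fontScale=0.7, color=(0, 255, 0), thickness=2)
    return frame
```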
- the kit 400 includes at least the WIU 22 according to any one of the embodiments described above and instructions 402 for performing at least one of the method 200 for training the anatomical model by the DPU 130 ; the method 202 for training the anatomical model by the TPU 150 ; or the method 300 for performing an endoscopic procedure, which, in some situations, may be an endoscopic therapeutic procedure according to or adapted to any one of the embodiments described above.
- the kit 400 can include a container (e.g., a box, a plastic bag, a package, a case) containing the WIU 22 according to any one of the embodiments described above and instructions to perform an endoscopic procedure according to any one of the embodiments described above, whether the endoscopic procedure is medical (e.g., for prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions in various mammalian pathology) or not medical (e.g., to assist visual inspection of narrow, difficult-to-reach cavities).
- the container may also include at least one of the system controller 70 , the light source 18 , the observation port 20 , the imaging sensor 90 , the motion sensor 122 , the touchscreen 78 , the I/O port 156 , the display 24 , the external device 48 , or others, as disclosed or not disclosed herein.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a neurological condition, such as epilepsy, headache/migraine, whether primary or secondary, whether cluster or tension, neuralgia, seizures, vertigo, dizziness, concussion, aneurysm, palsy, Parkinson's disease, Alzheimer's disease, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a neurodegenerative disease, such as Alzheimer's disease, Parkinson's disease, multiple sclerosis, postoperative cognitive dysfunction, and postoperative delirium, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat an inflammatory disease or disorder, such as Alzheimer's disease, ankylosing spondylitis, arthritis (osteoarthritis, rheumatoid arthritis (RA), psoriatic arthritis), Sjogren's syndrome, temporal arteritis, Type 2 diabetes, asthma, atherosclerosis, Crohn's disease, colitis, dermatitis, diverticulitis, fibromyalgia, hepatitis, irritable bowel syndrome (IBS), systemic lupus erythematosus (SLE), nephritis, Celiac disease, Parkinson's disease, ulcerative colitis, chronic peptic ulcer, tuberculosis, periodontitis, sinusitis, Graves disease, psoriasis, pernicious anemia (PA), peripheral neuropathy, lupus, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a gastrointestinal condition, such as ileus, irritable bowel syndrome, Crohn's disease, ulcerative colitis, diverticulitis, gastroesophageal reflux disease, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a bronchial disorder, such as asthma, bronchitis, pneumonia, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a cardiovascular condition, such as coronary artery disease, heart attack, arrhythmia, cardiomyopathy, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure, as disclosed herein may be used to prevent, diagnose, monitor, ameliorate, or treat a urinary disorder, such as urinary incontinence, urinalysis, overactive bladder, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a cancer, such as bladder cancer, breast cancer, prostate cancer, lung cancer, colon or rectal cancer, skin cancer, thyroid cancer, brain cancer, leukemia, liver cancer, lymphoma, pancreatic cancer, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the endoscopic procedure may be used to prevent, diagnose, monitor, ameliorate, or treat a metabolic disorder, such as diabetes (type 1, type 2, or gestational), Gaucher's disease, sickle cell anemia, cystic fibrosis, hemochromatosis, or others, as understood to skilled artisans and which are only omitted here for brevity.
- the non-medical endoscopic procedure may be used for visual inspection work where the target area is inaccessible by other means, or where access would require destructive, time-consuming, and/or expensive dismounting activities.
- the non-medical endoscopic procedure may be used in nondestructive testing techniques for recognizing defects or imperfections (e.g., the visual inspection of aircraft engines, gas turbines, steam turbines, diesel engines, automotive engines, truck engines, machined or cast parts, surface finishes, complete through-holes, forensic applications in law enforcement, building inspection, or in gunsmithing for inspecting the interior bore of a firearm).
- the ROI 16 , the model, the device 30 , the implants, and relevant hardware/software and techniques of manufacture and use are adapted accordingly.
- Some embodiments may include a method comprising: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- the display may simultaneously present at least three of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- the display may simultaneously present the imagery, the classification, the confidence metric, and the motion vector while the endoscope images the cavity.
- the display may simultaneously present the imagery and at least two of the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- the display may simultaneously present the imagery and the classification and at least one of the confidence metric or the motion vector while the endoscope images the cavity.
- Some embodiments may include a method comprising: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and taking, by the processor, an action based on the classification, the confidence metric, and the motion vector.
- the action may include deploying (e.g., moving, extending, adjusting, powering, grasping, cutting) an end effector within the cavity being imaged by the endoscope.
- the end effector may be a component of a robotic arm (e.g., during a surgical procedure, an investigation of a cavity).
- the action may be with respect to the anatomical feature within the cavity (e.g., contacting the anatomical feature by the end effector).
- the action may not be with respect to the anatomical feature within the cavity (e.g., it may be with respect to another anatomical feature, an object internal to the cavity, or an object external to the cavity).
- the cavity may be a mammalian cavity or an inanimate cavity.
- the action may include requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- the display may simultaneously present at least three of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- the display may simultaneously present the imagery, the classification, the confidence metric, and the motion vector while the endoscope images the cavity.
- the display may simultaneously present the imagery and at least two of the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- the display may simultaneously present the imagery and the classification and at least one of the confidence metric or the motion vector while the endoscope images the cavity.
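- One reading of such an action-taking method is a simple gate that permits an automated step only when the classification, the confidence metric, and the motion vector all satisfy preset conditions; the sketch below is a hypothetical illustration with assumed names, units, and thresholds.

```python
# Illustrative sketch only: gating an automated end-effector action on the
# classification, the confidence metric, and the motion vector.
def maybe_deploy(classification: str, confidence: float, motion_vector,
                 target: str, min_confidence: float = 0.95) -> bool:
    """Allow deployment only when the target feature is confidently in view
    and the scope is effectively stationary (assumed units: mm per frame)."""
    stationary = all(abs(component) < 0.5 for component in motion_vector)
    return classification == target and confidence >= min_confidence and stationary
```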
- Various embodiments of the present disclosure may be implemented in a data processing system suitable for storing and/or executing program code that includes at least one processor coupled directly or indirectly to memory elements through a system bus.
- the memory elements include, for instance, local memory employed during actual execution of the program code, bulk storage, and cache memory which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
- I/O devices can be coupled to the system either directly or through intervening I/O controllers.
- Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the available types of network adapters.
- the present disclosure may be embodied in a system, a method, and/or a computer program product.
- the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
- the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
- the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
- the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
- a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- a code segment or machine-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements.
- a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents.
- Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, among others.
- the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
- Words such as “then,” “next,” etc. are not intended to limit the order of the steps; these words are simply used to guide the reader through the description of the methods.
- process flow diagrams may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently.
- the order of the operations may be re-arranged.
- a process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
- its termination may correspond to a return of the function to the calling function or the main function.
Abstract
A portable endoscopic system comprising an imaging unit for an endoscopic procedure. The imaging unit has an imaging coupler for receiving imaging information from an imaging assembly of an endoscope; a display integrated into a housing of the imaging unit; an image processing unit for processing the received imaging information into images of a time series and for displaying the images in real-time; a motion sensor configured to detect a motion of the housing; and a detection processing unit. The detection processing unit is configured to classify at least one anatomical feature in each image of the time series based on an artificial intelligence classifier; determine a confidence metric of the classification; determine a motion vector based on the detected motion; and display, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
Description
- The present disclosure relates to systems, apparatuses, and methods utilizing an endoscopic imaging system. More specifically, the present disclosure relates to systems, apparatuses, and methods utilizing an endoscopic system for enabling various endoscopic procedures (medical and non-medical). For example, various technologies disclosed herein enable prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions in various mammalian pathology, such as human prostate pathology including but not limited to benign prostatic hyperplasia (BPH) or other human or non-human medical conditions, or non-medical procedures.
- There are various prostate diseases. One of those is benign prostatic hyperplasia (BPH), which may be found in nearly every aging human male. BPH is often the primary cause for lower urinary tract symptoms (LUTS), such as nocturia, frequency, urgency, hesitancy, incomplete emptying, leakage, and dribbling. It is generally estimated that 90% of men between the ages of 45 and 80 years have some form of LUTS or BPH with prevalence increasing nearly linearly with age. While BPH is rarely life-threatening, BPH can lead to numerous clinical conditions including urinary retention, renal insufficiency, recurrent urinary tract infections, incontinence, hematuria, and bladder stones. Thus, early intervention may sometimes be recommended to improve patient outcomes and quality of life.
- Although several drug therapies are available and effective to treat BPH, their effectiveness is typically short-lived. Surgical treatments for BPH range from minimally invasive techniques, such as prostatic urethral lift devices and various ablation methods, to more invasive resection surgeries to fully invasive prostatectomy surgeries. The surgeries involve cutting or ablating tissue near delicate structures, such as the bladder and the verumontanum, which is critical for male sexual function. Therefore, these surgeries require extensive practice with cystoscopic methods to identify the delicate structures and to estimate the treatment areas proximate to those delicate structures.
- Minimally invasive procedures offer the advantage of less pain, faster recovery, lower costs, and the use of local anesthesia and mild sedation. Transurethral prostatic procedures involve tissue examination of the bladder and urethral mucosa with a specialty endoscope called a cystoscope. During an examination, a physician will expand the urethra and bladder with a clear fluid to visualize the mucosal surface of the bladder and urethra. For prostatic procedures, typically the region distal to the bladder neck and proximal to the verumontanum are identified as the treatment area. Once the treatment area is identified, the physician can apply treatment to one or more of the lateral, medial, and anterior prostatic lobes.
- Ablative or resective prostatic surgeries require highly-specialized equipment, such as microwave, ultrasound, laser, vapor, or cryotherapy sources that provide ablative energy to a purpose-built therapeutic device. Furthermore, transurethral ablative procedures tend to be expensive and/or complicative due to their specialized equipment and need for technical expertise and thus not practical in ambulatory/office settings or in areas of the world where such equipment are cost-prohibitive. Recently, transurethral water vapor therapies (TWVT) have gained momentum as a treatment modality with good efficacy for large prostate volumes. Prostatic urethral lift (PUL) procedures have also become a mainstay of BPH treatment in the past decade preserving ejaculatory function, while requiring minimal anesthesia. Patients that received a PUL procedure reported generally better sexual function, improved recovery time, and less interference in daily activities over other treatment modalities. PUL devices are permanent, implantable fixation devices similar to tacks or anchors that aim to create channels in one or more of the prostatic lobes between the bladder neck and the verumontanum to reduce obstruction and improve flow.
- Regardless of the treatment modality, each requires accurate localization to achieve optimal and enduring results. For example, in a PUL procedure typically four to five implants are required for an average-sized prostate to achieve an ideal opening, but upwards of ten implants may be necessary for large and/or abnormally shaped prostates. The first implant is placed approximately 2 cm distally of the bladder neck, the second implant placed just anterior to the verumontanum, with additional implants placed in between to form a continuous channel typically through the anterior and lateral aspects of the prostate. Each implant is housed in a disposable cartridge that must be replaced after the implant is deployed. Thus, the effector handle, cartridge, and cystoscope are removed to replace the cartridge and introduce a new implant. While the sheath remains disposed in the prostatic urethra, the physician must constantly iterate this process, which can introduce errors in the optimal placement of implants. Furthermore, many of the PUL devices require the physician to actuate one or more controls multiple times to fully deploy the implant, thereby further complicating achievement of optimal implantation. Finally, identifying the optimal location of the implants based on each patient's unique anatomy requires a learning curve that can be highly subjective.
- These compromises and technological problems are believed to be present in virtually all currently known treatment modalities for typical prostate pathologies and not just PUL-type treatments. Accordingly, there exists a technological need for a lightweight, portable imaging platform for endoscopic therapies to identify and track anatomical landmarks and therapeutic sites in vivo.
- This disclosure addresses these compromises and solves the technological problems noted above by enabling various systems, apparatuses, and methods for endoscopy, whether for medical (e.g., prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions in various mammalian pathology) or non-medical purposes (e.g., to assist visual inspection of narrow, difficult-to-reach cavities). These and other features, aspects, and advantages of the present embodiments will become better understood upon consideration of the following detailed description, drawings, and appended claims.
- In one example of the present disclosure, an imaging unit for an endoscopic procedure is presented. The imaging unit comprises a housing and a display integrated into the housing. An imaging coupler is configured for receiving imaging information from an imaging assembly of an endoscope having a field of view (FoV) comprising at least a portion of an end effector and a portion of a region of interest (ROI). An imaging processor is configured with instructions to process the received imaging information into pixel values representing an image of a time series and to display the image in real-time on the display, while a motion sensor is configured to detect a motion of the housing during the time series. The imaging unit comprises a detection processing unit (DPU) configured with instructions to: classify at least one anatomical feature in each image of the time series based on an artificial intelligence classifier; determine a confidence metric of the classification; determine a motion vector based on the detected motion; and display, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
- In another example of the present disclosure, the motion sensor includes at least a gyroscope configured to generate a gyroscopic signal and an accelerometer configured to generate an acceleration signal, the detection processing unit further configured to determine a displacement vector based on at least the gyroscopic signal and the acceleration signal.
- In another example of the present disclosure, the DPU is configured to display the displacement vector concurrently with the corresponding classification.
- In another example of the present disclosure, the detection processing unit is configured to display the displacement vector relative to one or more classified anatomical features.
- In another example of the present disclosure, the detection processing unit is configured to display a plurality of displacement vectors each one relative to a unique classified anatomical feature.
- In another example of the present disclosure, the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
- In another example of the present disclosure, the detection processing unit determines the confidence metric based on the comparison.
- In another example of the present disclosure, the detection processing unit is configured to identify at least one treatment site based on the at least one classified anatomical feature, and display, concurrently with the corresponding image, the at least one identified treatment site and a relative motion vector between the classified anatomical feature and the identified treatment site.
- In another example of the present disclosure, the region of interest includes at least a prostatic urethra and the administered therapy includes a prostatic treatment; the detection processing unit is further configured to classify a prostatic pathology.
- In another example of the present disclosure, a method for endoscopic imaging is presented. The method includes operatively coupling an imaging coupler of an imaging unit to an observation port of an endoscope. Imaging information is received from an imaging assembly of the endoscope. The imaging assembly has an FoV comprising at least a portion of an end effector and a portion of a ROI. The received imaging information is processed into pixel values representing an image of a time series. The images are displayed in real-time on a display integrated into the housing of the imaging unit, and motion of the housing is detected during the capture of the time series. At least one anatomical feature is classified in each image of the time series based on an artificial intelligence classifier. A confidence metric of the classification is determined; a motion vector based on the detected motion is determined; and, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector are displayed on the display in real-time.
- In another example of the present disclosure, the step of detecting motion further includes generating a gyroscopic signal and an acceleration signal associated with the motion of the housing. A displacement vector is determined based on at least the gyroscopic signal and the acceleration signal.
- In another example of the present disclosure, the method further includes displaying the displacement vector concurrently with the corresponding classification.
- In another example of the present disclosure, the method further includes displaying the displacement vector relative to one or more classified anatomical features.
- In another example of the present disclosure, the method further includes displaying a plurality of displacement vectors, each relative to a unique classified anatomical feature.
- In another example of the present disclosure, the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
- In another example of the present disclosure, the confidence metric is based on the comparison.
- In another example of the present disclosure, the method further includes identifying at least one treatment site based on the at least one classified anatomical feature, and displaying, concurrently with the corresponding image, the at least one identified treatment site and the determined motion vector.
- In another example of the present disclosure, the region of interest includes at least a prostatic urethra and the administered therapy includes a prostatic treatment; the method further includes classifying a prostatic pathology.
- In another example of the present disclosure, a kit for an endoscopic therapeutic procedure is presented. The kit includes an endoscopic imaging unit which comprises a housing and a display integrated into the housing. The endoscopic imaging unit includes an imaging coupler configured for receiving imaging information from an imaging assembly of an endoscope having a field of view (FoV) comprising at least a portion of an end effector and a portion of a region of interest (ROI). In addition, the endoscopic imaging unit includes an imaging processor; a motion sensor configured to detect a motion of the housing during the time series; and a detection processing unit (DPU). Furthermore, the kit includes instructions to perform a method for endoscopic imaging. The method includes the steps of: operatively coupling the imaging coupler of the imaging unit to an observation port of an endoscope; receiving the imaging information from the imaging assembly; processing the received imaging information into pixel values representing an image of a time series; displaying the image in real-time on the display; detecting motion of the housing during the time series; classifying at least one anatomical feature in each image of the time series based on an artificial intelligence classifier; determining a confidence metric of the classification; determining a motion vector based on the detected motion; and displaying, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
- In another example of the present disclosure, the step of detecting motion further includes generating gyroscopic and acceleration signals associated with the motion of the housing, and determining a displacement vector based on at least the gyroscopic signal and the acceleration signal, wherein the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
- In an embodiment, a method may comprise: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
- In an embodiment, a method may comprise: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and taking, by the processor, an action based on the classification, the confidence metric, and the motion vector.
- In order that the manner in which the above-recited and other advantages and objects of the disclosure are obtained, a more particular description of the disclosure briefly described above will be rendered by reference to a specific embodiment thereof which is illustrated in the appended drawings. Understanding that these drawings depict only a typical embodiment of the disclosure and are not, therefore, to be considered to be limiting of its scope, the disclosure will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
- FIG. 1 is a structural diagram of an embodiment of a portable system;
- FIG. 2 is a structural diagram of an embodiment of the portable system disposed in a region of interest;
- FIG. 3 is a block diagram of an embodiment of a wireless imaging unit of the portable system;
- FIG. 4 is a network diagram of an embodiment of the portable system;
- FIGS. 5A-5D are endoscopic views of a region of interest as displayed on the portable system;
- FIGS. 6A-6B are flowcharts of embodiments for training an anatomical model;
- FIG. 7 is a flowchart of a method for performing an endoscopic procedure; and
- FIG. 8 is a structural diagram of a kit for performing an endoscopic procedure.
- Hereinafter, various embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that the present disclosure may be readily implemented by those skilled in the art. However, it is to be noted that the present disclosure is not limited to the embodiments but is capable of being embodied or carried out in various other ways. In drawings, parts irrelevant to the description are omitted for the simplicity of explanation, and like reference numerals denote like parts through the whole document.
- Note that various terminology used herein can imply direct or indirect, full or partial, temporary or permanent, action or inaction. For example, when an element is referred to as being "on," "connected," or "coupled" to another element, then the element can be directly on, connected, or coupled to the other element or intervening elements can be present, including indirect or direct variants. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present.
- Likewise, as used herein, a term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
- Similarly, as used herein, various singular forms "a," "an," and "the" are intended to include various plural forms as well, unless context clearly indicates otherwise. For example, a term "a" or "an" shall mean "one or more," even though a phrase "one or more" is also used herein. For example, "one or more" includes one, two, three, four, five, six, seven, eight, nine, ten, tens, hundreds, thousands, or more, including all intermediary whole or decimal values therebetween.
- Moreover, terms “comprises,” “includes” or “comprising,” “including” when used in this specification, specify a presence of stated features, integers, steps, operations, elements, or components, but do not preclude a presence and/or addition of one or more other features, integers, steps, operations, elements, components, or groups thereof. Furthermore, when this disclosure states that something is “based on” something else, then such statement refers to a basis which may be based on one or more other things as well. In other words, unless expressly indicated otherwise, as used herein “based on” inclusively means “based at least in part on” or “based at least partially on.”
- Additionally, although terms first, second, and others can be used herein to describe various elements, components, regions, layers, or sections, these elements, components, regions, layers, or sections should not necessarily be limited by such terms. Rather, these terms are used to distinguish one element, component, region, layer, or section from another element, component, region, layer, or section. As such, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from this disclosure.
- Also, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in an art to which this disclosure belongs. As such, terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in a context of a relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
- Various features and aspects of the present disclosure are best understood by reference to the accompanying drawings, when considered during the course of the following discussion.
- With reference to the drawings,
FIG. 1 shows the main components of a portable system 10 used during an endoscopic procedure, which may be a diagnostic or therapeutic procedure (or another type of procedure, whether medical or non-medical). An endoscope 12 is inserted into a patient 14 (e.g., a mammal, a human, an animal, a pet, a bird, a fish, a male, a female) to a region of interest (ROI) 16, such as a tissue, an organ, a body part, or any other in vivo feature, although non-medical uses may employ non-patients or inanimate objects, such as tubes, cavities, tunnels, crevices, bores, channels, or other relevant non-patient or inanimate ROIs. The region of interest 16 is illuminated by an external light source 18 which directs incident light along an illumination pathway, such as an optical fiber that extends along a tube of the endoscope 12 to an illumination lens at a distal tip 14. The illuminated region of interest 16 reflects the incident light back to an imaging lens at the distal tip 14 to convey the reflected light along an imaging pathway, such as an optical fiber, to an observation port 20, such as an eyepiece. The reflected light is received by a wireless imaging unit (WIU) 22 via the observation port 20. The WIU 22 may include a digital imaging sensor that converts the reflected light into imaging data which can then be processed and displayed on a display 24.
- In other embodiments, the endoscope 12 may be a digital endoscope with a chip-on-a-tip arrangement. For example, the endoscope 12 may include one or more light-emitting diodes (LEDs) disposed at the distal tip 14 for illuminating the ROI 16. In this arrangement, there is no external light source 18. The distal tip 14 may also include the digital imaging sensor for generating the imaging data of the ROI 16. A communication pathway along the tube of the endoscope 12 may transmit and receive control signals for controlling the LEDs and the digital imaging sensor instead of the illumination and imaging pathways. The WIU 22 may receive the imaging data from the digital imaging sensor via the observation port 20. In one embodiment, the observation port 20 serves as an optical observation port, such as an eyepiece, while in another embodiment, the observation port 20 may take the form of a digital interface, such as a digital connector for conveying imaging data electronically. The observation port 20 interfaces with the WIU 22 via an imaging coupler 26. In the illustrated embodiment, the imaging coupler 26 optically couples the WIU 22 to the observation port 20 of the endoscope 12. In the previously mentioned chip-on-a-tip embodiment, the imaging coupler 26 digitally couples the WIU 22 to a digital observation port 20 via an electrical connector with various data channels and/or electrical channels for controlling the LEDs and/or digital imaging sensor at the distal tip 14. The WIU 22 includes a housing 28 which is configured to integrate the observation port 20, display 24, imaging coupler 26, and light source 18 into a single device, while protecting various internal components, such as, but not limited to, electronic circuit components, a power source, thermal management, and the like.
- The portable system 10 includes a therapeutic device 30 configured to be disposed in vivo into the ROI 16 in tandem with the endoscope 12 to administer a therapy (or another action or technique) therein. For example, this may include prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions via or while the endoscope 12 is disposed in vivo into the ROI 16. As such, in those situations, the device 30 may be suitably labeled/configured (e.g., the diagnosis device 30, the forecasting device 30, the prevention device 30, and so forth). In situations that are non-medical, the device 30 is suitably configured as well. The therapeutic device 30 includes an end effector 32 which delivers the therapy (or another action or technique) and includes an actuator 34 for initiating the delivery of the therapy (or another action or technique). In the illustrated embodiment, the ROI 16 includes at least a prostatic urethra 40, the prostate 42, and the bladder 44, although this is illustrative and other body parts, organs, or tissues may be used (or an inanimate ROI 16 may be used for non-medical uses). In this embodiment, the therapeutic device 30 is configured to administer therapies to treat medical conditions associated with prostatic pathologies, such as, but not limited to, benign prostatic hyperplasia (BPH) and the like, although non-prostatic pathologies may be addressed as well. The therapeutic device 30 may be configured to administer one or more of the following therapeutic treatments: resection, incision, ablation, thermotherapy, enucleation, implantation, cryotherapy, vapor therapy, embolization, and the like. While in the illustrated embodiment the therapeutic device 30 is shown with a handle 36 and actuator 34, it should be appreciated that the therapeutic device 30 may embody various shapes, sizes, and designs specified by the delivered therapy. For example, although the therapeutic device 30 is embodied as pistol-shaped via the handle 36, this form factor is not required and other form factors may be used. For example, the handle 36 may be omitted and the actuator 34 may be embodied differently than a lever pivoting toward or away from the handle 36 (e.g., a pressable/depressable button, a rotary knob, a rotating sleeve).
- The WIU 22 is capable of wireless (e.g., radio frequency, line of sight) communication 46, such as high-speed bi-directional data communications directly (or indirectly) to one or more external devices 48 simultaneously or substantially simultaneously. The external devices 48 are capable of directly (or indirectly) receiving data, such as digital images, digital video, or other information pertaining to the therapeutic procedure. The external device 48 can also directly (or indirectly) transmit control data or signals to the WIU 22 to remotely control the WIU 22. The external device 48 can also transmit therapeutic (or other action or procedure) information regarding the therapeutic (or other action or technique) procedure, such as patient data in the form of electronic medical records (EMR) or procedure data such as instructions for performing the procedure. Examples of external devices 48 may include personal computing devices such as desktop computers; portable devices such as smart devices, smartphones, personal digital assistants, tablet computers, wrist-mounted displays, smartwatches, or the like; laptops or portable computers; head-mounted displays; or other computing devices not yet contemplated.
- With reference to FIG. 2, an expanded view of the ROI 16 is illustrated. In an exemplary embodiment, the portable system 10 is configured for BPH therapy, although BPH or non-BPH prevention, forecasting, diagnosis, amelioration, monitoring, or treatment is possible. BPH therapy typically involves reducing the effect an enlarged prostate 42 has on the prostatic urethra 40. In the exemplary embodiment, the therapeutic device 30 is configured to deploy prostatic urethral lift (PUL) implants at various treatment sites along the prostatic urethra 40 to lift and pull prostatic tissue away from a urethral channel 50 to improve flow from, for example, the bladder 44. The locations of the treatment sites are typically chosen at the discretion of the practitioner performing the procedure based on subjective criteria, such as the degree of the achieved lifting visualized through the endoscope 12. This subjectivity may result in a non-optimal placement of the PUL implants, which can result in the costly deployment of excess implants; insufficient deployment of implants such that the patient does not achieve the desired outcome; or improper placement damaging sensitive anatomy, such as the verumontanum, or piercing through the bladder neck, resulting in unintended consequences such as sexual and/or bladder dysfunction, infection, and the like. Due to the subjective nature of PUL implantation, practitioners have to undergo significant training and supervision to become familiar with the procedure to perform the procedure adequately. Regardless of the training a practitioner receives, the procedure still may not be performed optimally for long-lasting results.
- During a procedure, the practitioner introduces the distal tip 14 of the endoscope 12 to identify the ROI 16. The practitioner may concurrently introduce the end effector 32 of the therapeutic device 30 while identifying the ROI 16, or subsequently after the ROI 16 is identified. To identify the ROI 16, the practitioner observes real-time imaging information on the display 24 which is received by an imaging assembly 52 from the ROI 16 illuminated by incident light from a light emitter 54. The imaging assembly 52 detects reflected light from the ROI 16 within a field of view (FoV) 56 that includes at least a portion of the ROI 16 and a portion of the end effector 32 of the therapeutic device 30.
- Once the distal tip 14 is situated within the identified ROI 16, the practitioner identifies a treatment region 58 between the bladder neck 60 and the verumontanum 62 so as not to damage these delicate anatomical features. The bladder neck 60 is a group of muscles that connect the bladder to the urethra and is primarily tasked with holding urine in the bladder; if damaged, it can lead to incontinence and other issues. The verumontanum 62 is an elevation in the floor of the prostatic urethra that is an important landmark which helps identify the entrance of the ejaculatory ducts.
- Once identified, the practitioner retraces their movements to approximately 2 cm distal of the bladder neck 60 to a proximal treatment site. If the bladder neck 60 is damaged, then such damage can lead to incontinence, bladder leakage, and other issues. If a site is chosen too proximal to the bladder neck 60, then the practitioner may pierce the bladder neck 60 and cause such dysfunction. If the site is chosen too distal to the bladder neck, then an adequate proximal opening is not achieved and symptoms of BPH, such as urinary retention and/or incomplete voiding, may not be mitigated. However, even if an optimal placement is achieved at the proximal treatment sites, similar considerations apply to the distal treatment sites and the medial treatment sites. For example, the practitioner may initially locate the proximal treatment sites relative to the bladder neck 60; however, after revisiting these sites the practitioner may inadvertently deploy the implants 2.5 cm too distally from the bladder neck 60 such that the optimal proximal opening at the bladder is not achieved. Thus, extraneous implants may have to be deployed to correct the non-optimally deployed implants.
- The portable system 10 aims to minimize implantation errors by identifying and tracking various anatomical features within the ROI 16 and determining optimal treatment sites therein. To do so, the portable system 10 employs an artificial intelligence classifier based on an anatomical model of the ROI 16 and a tracking system configured to track a motion vector of the WIU 22 and thus track the motion of the distal tip 14 of the endoscope 12 and/or the end effector 32 of the therapeutic device 30. By identifying an optimal treatment site based on a unique patient's anatomy, the therapeutic procedure can achieve enduring results while keeping costs low by increasing efficiency, reducing procedure time, and reducing non-optimal implantation errors.
- With reference to FIG. 3, a block diagram of the WIU 22 is illustrated. The previously mentioned electronic circuitry of the WIU 22 includes a system controller 70 which is configured to control and power the WIU 22. The system controller 70 includes a plurality of circuit components that are responsible for controlling aspects of the WIU 22. The system controller 70 includes a microprocessor 72 which interfaces with several electronic components to send and receive instructions to control various aspects of the WIU 22 functions. The system controller 70 includes a storage device 74 which is a memory device, such as a computer-readable medium (e.g., persistent memory, flash memory, embedded memory, ROM, RAM), for storing program instructions to be executed by the microprocessor 72. In addition to program instructions, the storage device 74 can store anatomical models, procedure data relevant to performing the therapeutic procedure, electronic medical record (EMR) related data, and the like. The system controller 70 includes a touchscreen input controller 76 configured to receive user inputs from a touchscreen 78 which is overlayed over or included within the display 24. The practitioner can interact with the touchscreen 78 to control various aspects of the WIU 22 via a user interface displayed on the display 24.
- The system controller 70 also includes an illumination controller 76 which receives instructions from the microprocessor 72 to adjust the intensity or brightness of the incident light from the light emitter 54 and/or one or more frequency components of the incident light produced therefrom. It should be appreciated that the light emitter 54 may include one or more LEDs for generating incident light in the ROI 16, or it may be optically coupled to the light source 18 for transmitting incident light thereto.
- An imaging processing unit (IPU) 80 may include instructions or is configured to execute instructions stored on the storage device 74 to perform various imaging-related functions. For example, the IPU 80 is configured to receive the imaging information from the imaging assembly 52 of the FoV 56. The imaging assembly 52 can be an optical assembly that directs reflected light from the ROI 16 to the observation port 20 (e.g., eyepiece) of the endoscope 12. In the exemplary embodiment, the WIU 22 includes an imaging sensor 90 integrated into the housing 28 and in direct communication with the IPU 80. In another embodiment (e.g., a chip-on-a-tip arrangement), the imaging assembly 52 comprises the imaging sensor 90 and is disposed at the distal tip 14 of the endoscope 12. In this arrangement, the imaging sensor 90 transmits at least one of analog signals, digital signals, or a combination of analog and digital signals pertaining to the imaging information to the observation port 20, which can then be transmitted to the IPU 80.
- The imaging sensor 90 may include one of the following: a complementary metal-oxide-semiconductor (CMOS) device, a charge-coupled device (CCD), or other imaging sensor devices, including devices not yet contemplated. The imaging information can be one of a digital signal or an analog signal that is converted to a digital signal by an analog-to-digital converter (ADC) of the IPU 80 to form pixel values representing an image of a time series of the FoV 56.
- The IPU 80 may also be configured to perform several image processing and post-processing functions in real-time or substantially real-time on the images of the time series. Examples of image processing techniques to enhance the images include edge detection, geometric transformations, perspective correction, color correction, color calibration, motion compensation, data compression, noise reduction, filtering, and the like. The IPU 80 may also be configured to control the functionality of the imaging sensor 90, such as adjusting a focal depth by controlling an integrated autofocus mechanism, pixel clock, sensitivity, offset, signal amplification, gain, gamma, and the like. The IPU 80 may also be configured to adjust the image size that is displayed on an external device 48 due to differences in screen resolution and screen size between the external devices 48 and the display 24. The IPU 80 may also be configured to automatically align the images such that the images are centered in the display independent of the size and/or resolution of the display being used, whether it is the display 24 or a display of an external device 48. The IPU 80 receives the display information from the microprocessor 72 and formats the output image correspondingly. Post-processed images can then be stored in an image memory 82 for later retrieval to be viewed locally on the display 24 or an external device 48.
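As one hedged illustration of the alignment function described above, the following sketch letterboxes a frame so it is centered on a display of arbitrary resolution. The nearest-neighbor resize and the three-channel image assumption are simplifications for illustration, not details from the disclosure.

```python
import numpy as np

def center_on_display(image, display_h, display_w):
    """Letterbox an image so it is centered on a display of any resolution.

    Sketch only; assumes a color image of shape (H, W, C) and uses a
    nearest-neighbor resize to avoid external dependencies.
    """
    h, w = image.shape[:2]
    scale = min(display_h / h, display_w / w)          # fit without cropping
    new_h, new_w = int(h * scale), int(w * scale)
    rows = (np.arange(new_h) / scale).astype(int)      # source row per output row
    cols = (np.arange(new_w) / scale).astype(int)      # source col per output col
    resized = image[rows][:, cols]
    canvas = np.zeros((display_h, display_w, image.shape[2]), dtype=image.dtype)
    top = (display_h - new_h) // 2                     # center vertically
    left = (display_w - new_w) // 2                    # center horizontally
    canvas[top:top + new_h, left:left + new_w] = resized
    return canvas
```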
- The WIU 22 can also be configured with a wireless transceiver 100 to communicate with the external device 48 directly via the wireless connection 46 or indirectly via the Internet 102 with remote devices 104, an institutional server 106, a cloud storage system 108, and the like. However, note that the transceiver 100 can be omitted and there may be a receiver and a transmitter, or there may be a receiver or a transmitter. The remote devices 104 may be configured for viewing endoscopic imagery, video, or examination data or for remotely receiving controls and/or EMR data. The EMR data is a collection of patient and population health information electronically stored in a digital format. The EMR data may include a range of patient information such as demographics, medical history, medication, allergies, immunization status, laboratory test results, radiology images, vital signs, personal statistics, billing information, and the like. The EMR data can be stored on the institutional server 106, such as one located at a hospital, insurance company, government entity, or the like. The EMR data can also be stored on the cloud storage system 108. The cloud storage system 108 can be a data storage system that includes logical pools of physical storage mediums that span multiple servers and often multiple discrete locations in a distributed fashion to ensure redundancy, fault tolerance, and durability of the data. The institutional server 106 and cloud storage system 108 can include a picture archiving and communication system (PACS) which is capable of providing storage and access to medical images from multiple modalities using a universal image format such as the Digital Imaging and Communications in Medicine (DICOM) format.
- It should be noted that, in some situations, the health institution server 106 and the cloud storage system 108 are compliant with data protection and privacy regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States of America, the General Data Protection Regulation (GDPR) in the European Union, the Personal Information Protection and Electronic Documents Act (PIPEDA) in Canada, National Health Portal compliance set by the Insurance Regulatory and Development Authority of India (IRDAI), or other compliance regulations mandated globally.
- With reference to FIG. 4, a network diagram of an embodiment of the portable system 10 is depicted. The WIU 22 is wirelessly coupled to a local network 110 via a wireless access point 112 using a suitable wireless transmission protocol, such as the 802.11 family of modulation techniques, IEEE 802.15.4a ultra-wideband (UWB), Bluetooth, and the like, although a suitable wired or waveguide connection/hardware is possible. The local network 110 may include cables, switches, and routers that may utilize Ethernet standards for communication. At least one institutional server 106 may be in communication with the local network 110. For example, the institutional server 106 may store or have access to EMR data which may be accessed by the WIU 22. Additionally, the local network 110 may be attached to a picture archiving and communication system (PACS) 114 which may be in communication with the institutional server 106 and the WIU 22. At least one external device 48 is in communication with the local network either directly via a physical connection or wirelessly via the wireless access point 112. In addition, a firewall 116 or other network security technology may be connected to the local network 110 to control access to the Internet 102. For example, a remote device 104 may be authorized to access the local network 110 via the Internet 102 utilizing a secure connection facilitated by the firewall 116. In addition, the cloud storage system 108 may be configured to store or retrieve data and may be accessed via the Internet 102, which is facilitated by the firewall 116.
- With returning reference to FIG. 3, the WIU 22 includes a motion processing unit (MPU) 120 which may include instructions or is configured to execute instructions stored on the storage device 74 to receive motion signals from a motion sensor 122. The motion sensor 122 includes at least one of a gyroscopic sensor configured to generate gyroscopic signals and an accelerometer configured to generate acceleration signals. The motion signals (e.g., the gyroscopic and acceleration signals) indicate the motion of the housing 28 during the therapeutic (or another action or technique) procedure, which can be used to estimate the motion of the distal tip 14 and/or the end effector 32.
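A deliberately simplified sketch of deriving a displacement vector from the acceleration signals follows. A practical MPU 120 would also fuse the gyroscopic signals to rotate each sample into a common reference frame and to suppress integration drift; the fixed sampling interval is an assumption for illustration.

```python
import numpy as np

def displacement_vector(accel_samples, dt):
    """Estimate a net displacement by double-integrating accelerometer
    samples (array of shape N x 3, in m/s^2) over a fixed interval dt.

    Dead-reckoning sketch only; gyroscope fusion and drift correction,
    which a real motion processing unit would require, are omitted.
    """
    velocity = np.cumsum(accel_samples * dt, axis=0)    # first integral
    displacement = np.cumsum(velocity * dt, axis=0)     # second integral
    return displacement[-1]                             # net displacement vector
```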
- The WIU 22 includes a detection processing unit (DPU) 130 which may include instructions or is configured to execute instructions stored on the storage device 74 to perform various detection-related functions. For example, these instructions may enable the DPU 130 to compare images of the time series to an artificial intelligence classifier (AIC) based on an anatomical model of the ROI 16 to classify at least one anatomical feature in each image of the series. For example, the AIC may be based on an artificial neural network (ANN), which may include a convolutional neural network (CNN), a recurrent neural network (RNN), or other suitable ANNs. For example, the storage device 74 may locally store the AIC, which may enable edge computing. This may be technologically advantageous in various environments, such as those with poor or no network connection. In the exemplary embodiment, the ROI 16 includes the following anatomical features: the prostatic urethra 40, the prostate 42, the bladder 44, the verumontanum 62, and the bladder neck 60. However, other urinary tract anatomical features are also contemplated, such as, but not limited to, the penile urethra, membranous urethra/external urinary sphincter, bulbous urethra, median lobe, lateral lobes, ureteral orifice, ureterovesical junction, ureter, ureteropelvic junction, renal pelvis, right/left ureteral orifice, infundibulum, calyx, and the like. The anatomical features may also include pathologies, such as, but not limited to, hypertrophy, trabeculations, tumors/lesions, calculus, diverticulum, and the like. Likewise, as disclosed herein, the anatomic features may or may not be anatomical, whether medical or non-medical.
- The DPU 130 receives each image of the time series from the IPU 80, which may be in real-time or substantially in real-time, compares each received image to the anatomical model, which may be in real-time or substantially in real-time, and then determines a confidence metric based on the comparison, which may be in real-time or substantially in real-time. In the exemplary embodiment, the DPU 130 includes instructions for a neural-network-based AIC, such as a deep learning convolutional neural network (DLCNN); however, it should be appreciated that other classifiers are also contemplated, such as, but not limited to, a perceptron, a Naïve Bayes classifier, decision trees, logistic regression, K-Nearest Neighbors, a support vector machine, a CNN, an RNN, and the like. The AIC initially trains the anatomical model based on individual frames of previously captured time series from similar and/or adjacent ROIs. The training can be performed on the WIU 22 itself; however, the anatomical model can also be trained on an external device 48, a remote device 104, the institutional server 106, or the cloud storage system 108. The trained anatomical model can then be transferred to the working memory of the DPU 130 or the storage device 74 via the wireless transceiver 100. This may enable edge computing. The DPU 130 compares each image of the time series to the trained anatomical model in real-time or substantially in real-time and determines in real-time or substantially in real-time a classification for each image and a confidence metric based on the comparison. The DPU 130 is configured to instruct the IPU 80 to display, concurrently with the corresponding image, the classified anatomical features on the display 24.
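For illustration, one common way to obtain a classification and a confidence metric from a neural-network classifier is to take the top softmax probability, as sketched below. The label set and the `model` callable are assumptions, and the disclosure does not mandate softmax as the confidence metric.

```python
import numpy as np

# Illustrative label set drawn from the anatomical features named above.
ANATOMY_LABELS = ["prostatic urethra", "prostate", "bladder",
                  "verumontanum", "bladder neck"]

def softmax(logits):
    """Numerically stable softmax over a 1-D logit vector."""
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def classify_with_confidence(model, image):
    """Run an assumed trained classifier on one frame and report the top
    class with its softmax probability as one plausible confidence metric."""
    probs = softmax(model(image))     # model is assumed to return one logit per label
    idx = int(np.argmax(probs))
    return ANATOMY_LABELS[idx], float(probs[idx])
```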
- The DPU 130 also receives the motion signals from the MPU 120 in real-time or substantially in real-time and determines a motion vector of the housing 28 in real-time or substantially in real-time, which can then be used to estimate a motion vector of the distal tip 14 of the endoscope 12 and/or the end effector 32 of the therapeutic device 30 in real-time or substantially in real-time. The motion vector can be a displacement vector, an acceleration vector, a velocity vector, a rotation vector, or the like. For example, the DPU 130 can estimate in real-time or substantially in real-time a displacement and direction of portions of the portable system 10 disposed within the ROI 16 based on the detected motion of the housing 28. The DPU 130 is configured to instruct the IPU 80 to display, concurrently with the corresponding image, the determined motion vector and/or classified anatomical features on the display 24 in real-time.
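A hedged sketch of presenting the classification, confidence metric, and motion vector concurrently with the corresponding image follows, using OpenCV text overlays. The layout, units, and formatting are illustrative choices rather than requirements of the disclosure.

```python
import cv2  # OpenCV, used here only to draw text overlays on a frame

def annotate_frame(frame, label, confidence, motion):
    """Draw the classification, confidence metric, and a 2-D motion vector
    onto the frame so all three appear concurrently with the image.

    Assumes motion is an (dx, dy) pair in millimeters; illustrative only.
    """
    lines = [f"feature: {label}",
             f"confidence: {confidence:.2f}",
             f"motion: dx={motion[0]:+.1f} mm dy={motion[1]:+.1f} mm"]
    for i, text in enumerate(lines):
        cv2.putText(frame, text, (10, 30 + 30 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
    return frame
```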
FIGS. 2 & 3 , theDPU 130 can be configured to identify in real-time or substantially in real-time atreatment region 58 within theROI 16. In the exemplary embodiment, thetreatment region 58 may be a region proximal to theverumontanum 62 and distal to thebladder neck 60, thus, avoiding those delicate anatomical features. Based on the identifiedtreatment region 58 and the classifiedanatomical features ROI 16, theDPU 130 may also be configured to determine in real-time or substantially in real-time the one ormore treatment sites DPU 130 is configured to instruct theIPU 80 to display, concurrently with the corresponding image, the determined motion vector, the classifiedanatomical features determined treatment sites display 24 in real-time or substantially in real-time. - With reference to
- With reference to FIGS. 5A-5D, endoscopic views of the ROI 16 displayed on the display 24 during the exemplary embodiment of a therapeutic procedure are depicted. The practitioner will view the entire ROI 16 to classify the anatomical features therein. The practitioner interacts with the display 24 via the touchscreen 78 to select the desired therapeutic procedure, although a default procedure may be selected or no default procedure is selected. The DPU 130 then identifies the desired anatomical model based on the user input and retrieves the anatomical model from any one of the storage device 74, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108. The practitioner initiates the procedure via a touch command through the user interface or by an external command by an assistant. In response, the microprocessor 72 instructs the IPU 80 to begin collecting images of the time series and instructs the MPU 120 to begin collecting the motion signals of the housing 28. The microprocessor 72 may also retrieve procedure data and/or EMR data from one of the storage device 74, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 and display the procedure data on the display 24 for the practitioner to review before commencing the procedure. The practitioner commences the procedure by introducing the portable therapeutic system into the ROI 16 to image the entirety of the ROI 16 as prescribed by the procedure data. After the anatomical features are classified, the practitioner may choose to perform the procedure manually based on the displayed classified anatomical features 140, the displayed motion vector 142, and the displayed confidence metric 144; or the practitioner may choose to perform the procedure in a semi-automated fashion based on the determined treatment region 58 and the determined treatment sites.
- During a manual procedure, the practitioner may rely on a relative motion vector 146 from, for example, the bladder neck 60 to apply treatment to the proximal treatment sites 56a, 56b. The practitioner will introduce the distal tip 14 and end effector 32 into the ROI 16 until the bladder 44 and bladder neck 60 are displayed as the classified anatomical feature 140 in real-time or substantially in real-time as, for example, a textual indicator indicating the corresponding anatomical feature, and the displayed confidence metric 144 meets the practitioner's expectations, as illustrated in FIG. 5A. The practitioner then may interact with the touchscreen 78 of the display 24 to initiate a relative motion vector 146 therefrom and retract the distal tip 14 and/or end effector 32 until the relative motion vector 146 displays an adequate displacement and/or rotation to locate an optimal location for the proximal treatment sites, as illustrated in FIG. 5B. The practitioner will engage the actuator 34 to deploy the treatment thereto.
- Similarly, the practitioner may repeat this process with the verumontanum 62 and the distal treatment sites. In contrast to the proximal treatment sites, the practitioner will introduce the distal tip 14 and/or end effector 32 until the verumontanum 62 is displayed as a classified anatomical feature 140 in real-time or substantially in real-time. From there, the practitioner will protract the distal tip 14 and/or end effector 32 until the relative motion vector 146 displays in real-time or substantially in real-time an adequate displacement and/or rotation to locate an optimal location for the distal treatment sites, as illustrated in FIG. 5C.
- Finally, the practitioner will repeat the process as necessary to apply therapy to the medial treatment sites between the proximal treatment sites and the distal treatment sites. For example, the practitioner may traverse the distal tip 14 and/or end effector 32 along the treatment region 58 to identify regions of excess occlusion and thereby identify one or more medial treatment sites. The practitioner may rely on the relative motion vector 146 to ensure that subsequent medial treatment sites create a sufficient opening through the prostatic urethra 40 without creating unnecessary bulging adjacent to previously treated treatment sites.
- During a semi-automated procedure, the practitioner may rely on the automatically classified treatment region 58 and treatment sites determined by the DPU 130 in real-time or substantially in real-time. After the entire ROI 16 is imaged and the anatomical features are classified by the DPU 130 in real-time or substantially in real-time, the DPU 130 then determines in real-time or substantially in real-time an optimal location for the treatment sites. The practitioner introduces the distal tip 14 and end effector 32 into the ROI 16 until the displayed treatment site 146 is achieved in real-time or substantially in real-time with a confidence metric 144 deemed sufficient by the practitioner. Once the optimal location is achieved, the practitioner engages the actuator 34 to deploy the treatment thereto. Similar to the manual procedure, the practitioner first deploys treatment at the proximal treatment sites, then the distal treatment sites, and then any medial treatment sites until sufficient opening of the prostatic urethra 40 is achieved.
- With reference to FIG. 6A, a flowchart of an embodiment of a method 200 for training the anatomical model by the DPU 130 is depicted. The images of the time series captured in real-time or substantially in real-time and classified in real-time or substantially in real-time during the therapeutic (or another action or technique) procedure are stored in the image memory 82 with corresponding classification and confidence metric metadata (or another form of data organization) as training data, S10. The training data can be used to retrain (or reinforce or update) the corresponding anatomical model. The training data is binned by the DPU 130 based on a predetermined confidence metric threshold, S12. Classified images with a confidence metric that exceeds (or satisfies) the predetermined confidence metric threshold are deemed high confidence images and can be used to retrain (or reinforce or update) the corresponding anatomical model without further intervention. However, classified images that do not meet (or satisfy) the confidence metric threshold are deemed low confidence images and are binned for further manual verification (e.g., via a physical or virtual keyboard) by a user via the display 24 or the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108. The said devices can retrieve the low confidence images binned for manual verification from the image memory 82 via the local network 110 or the Internet 102. The DPU 130 is configured to retrieve the classified images binned as high confidence images, S14, and to retrain (or reinforce or update) the anatomical models, S16. The retrained (or reinforced or updated) anatomical model is then stored on the storage device 74 for future therapeutic (or other actions or techniques) procedures, S18. It should be appreciated that the retrained (or reinforced or updated) anatomical model may also be stored on one or more of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 for future therapeutic (or other actions or techniques) procedures.
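Steps S10-S12 might be sketched as the following confidence-threshold binning. The threshold value is an illustrative assumption, since the disclosure only requires a predetermined confidence metric threshold.

```python
CONFIDENCE_THRESHOLD = 0.9  # illustrative value; the disclosure requires only
                            # "a predetermined confidence metric threshold"

def bin_training_data(records, threshold=CONFIDENCE_THRESHOLD):
    """Split classified frames into high- and low-confidence bins.

    High-confidence frames can feed retraining directly; low-confidence
    frames are queued for manual verification, mirroring steps S10-S12.
    Each record is assumed to be an (image, label, confidence) triple.
    """
    high, low = [], []
    for image, label, confidence in records:
        (high if confidence >= threshold else low).append((image, label))
    return high, low
```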
- With reference to FIG. 3, in another embodiment, the portable system 10 includes a training processing unit (TPU) 150 disposed within at least one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 that performs the retraining (or reinforcement or updating) of the anatomical model. With reference to FIG. 6B, another embodiment of a method 202 for training the anatomical model by the TPU 150 is depicted. The training data is retrieved by the TPU 150 of any one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108, S20. Similarly, the TPU 150 bins the images of the training data into high confidence bins and low confidence bins based on the predetermined confidence metric threshold, S22. The TPU 150 is configured to retrieve the classified images binned as high confidence images, S24, and to retrain (or reinforce or update) the anatomical models, S26. The retrained (or reinforced or updated) anatomical model is then stored on a storage device of any one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 for future therapeutic (or other action or technique) procedures, S28. It should be appreciated that the WIU 22 may retrieve the retrained (or reinforced or updated) anatomical model stored on any one of the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108 and store the retrained (or reinforced or updated) anatomical model on the storage device 74 for future therapeutic (or other action or technique) procedures.
- With reference to FIG. 7, a flowchart of a method 300 for performing an endoscopic therapeutic (or other action or technique) procedure is depicted. A practitioner (e.g., a user, a physician, a technician) operatively couples the imaging coupler 26 of the WIU 22 to the observation port 20 of the endoscope 12, S30. Via the user interface of the display 24, the practitioner selects a desired therapeutic (or other action or technique) procedure, S32. Based on the practitioner's input, the DPU 130 of the WIU 22 determines and retrieves the corresponding anatomical model of the desired therapeutic (or other action or technique) procedure from any one of the storage device 74, the external device 48, the remote device 104, the institutional server 106, or the cloud storage system 108, S34. The practitioner initiates the procedure via a touch command through the user interface and, in response, the microprocessor 72 instructs the IPU 80 to begin collecting imaging information and instructs the MPU 120 to begin collecting the motion signals of the housing 28, S36. The WIU 22 receives imaging information of the FoV 56, which comprises at least a portion of the end effector 32 of the therapeutic (or other action or technique) device 30 and a portion of the ROI 16, S38, and which is converted in real-time or substantially in real-time into images of a time series and displayed in real-time on the display 24. Concurrently, the DPU 130 detects in real-time or substantially in real-time the motion of the housing 28 and estimates in real-time or substantially in real-time a motion vector of the endoscope 12 and/or end effector 32 based on the detected motion signals, S40. Based on the received imaging information and a comparison in real-time or substantially in real-time with the retrieved anatomical model, the artificial intelligence classifier of the DPU 130 classifies in real-time or substantially in real-time at least one anatomical feature in each image of the time series, S42. The DPU 130 instructs the microprocessor 72 to display, concurrently with the corresponding image of the time series, the classification of the at least one anatomical feature 140, the determined confidence metric 144, and the determined motion vector 142, S44.
- With reference to FIG. 8, a kit for performing an endoscopic procedure is illustrated. The kit 400 includes at least the WIU 22 according to any one of the embodiments described above and instructions 402 for performing at least one of the method 200 for training the anatomical model by the DPU 130; the method 202 for training the anatomical model by the TPU 150; or the method 300 for performing an endoscopic procedure, which, in some situations, may be an endoscopic therapeutic procedure according to or adapted to any one of the embodiments described above. For example, the kit 400 can include a container (e.g., a box, a plastic bag, a package, a case) containing the WIU 22 according to any one of the embodiments described above and instructions to perform an endoscopic procedure according to any one of the embodiments described above, whether the endoscopic procedure is medical (e.g., for prevention, forecasting, diagnosis, amelioration, monitoring, or treatment of medical conditions in various mammalian pathology) or not medical (e.g., to assist visual inspection of narrow, difficult-to-reach cavities). The container may also include at least one of the system controller 70, the light source 18, the observation port 20, the imaging sensor 90, the motion sensor 122, the touchscreen 78, the I/O port 156, the display 24, the external device 48, or others, as disclosed or not disclosed herein.
- For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a neurological condition, such as epilepsy, headache/migraine, whether primary or secondary, whether cluster or tension, neuralgia, seizures, vertigo, dizziness, concussion, aneurysm, palsy, Parkinson's disease, Alzheimer's disease, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a neurodegenerative disease, such as Alzheimer's disease, Parkinson's disease, multiple sclerosis, postoperative cognitive dysfunction, postoperative delirium, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat an inflammatory disease or disorder, such as Alzheimer's disease, ankylosing spondylitis, arthritis (osteoarthritis, rheumatoid arthritis (RA), psoriatic arthritis), Sjogren's syndrome, temporal arteritis, Type 2 diabetes, asthma, atherosclerosis, Crohn's disease, colitis, dermatitis, diverticulitis, fibromyalgia, hepatitis, irritable bowel syndrome (IBS), systemic lupus erythematosus (SLE), nephritis, Celiac disease, Parkinson's disease, ulcerative colitis, chronic peptic ulcer, tuberculosis, periodontitis, sinusitis, Graves disease, psoriasis, pernicious anemia (PA), peripheral neuropathy, lupus, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a gastrointestinal condition, such as ileus, irritable bowel syndrome, Crohn's disease, ulcerative colitis, diverticulitis, gastroesophageal reflux disease, or others, as understood by skilled artisans and which are only omitted here for brevity.
For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a bronchial disorder, such as asthma, bronchitis, pneumonia, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a coronary artery disease, heart attack, arrhythmia, cardiomyopathy, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a urinary disorder, such as urinary incontinence, overactive bladder, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a cancer, such as bladder cancer, breast cancer, prostate cancer, lung cancer, colon or rectal cancer, skin cancer, thyroid cancer, brain cancer, leukemia, liver cancer, lymphoma, pancreatic cancer, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the endoscopic procedure, as disclosed herein, may be used to prevent, diagnose, monitor, ameliorate, or treat a metabolic disorder, such as diabetes
(type 1, type 2, or gestational), Gaucher's disease, sickle cell anemia, cystic fibrosis, hemochromatosis, or others, as understood by skilled artisans and which are only omitted here for brevity. For example, the non-medical endoscopic procedure may be used for visual inspection work where the target area is inaccessible by other means, or where accessibility may require destructive, time-consuming, and/or expensive dismounting activities. For example, the non-medical endoscopic procedure may be used in nondestructive testing techniques for recognizing defects or imperfections (e.g., the visual inspection of aircraft engines, gas turbines, steam turbines, diesel engines, automotive engines, truck engines, machined or cast parts, surface finishes, complete through-holes, forensic applications in law enforcement, building inspection, or in gunsmithing for inspecting the interior bore of a firearm). In these situations, whether medical or non-medical, the ROI 16, the model, the device 30, the implants, and relevant hardware/software and techniques of manufacture and use are adapted accordingly. Some embodiments may include a method comprising: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity. The display may simultaneously present at least three of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity. The display may simultaneously present the imagery, the classification, the confidence metric, and the motion vector while the endoscope images the cavity. The display may simultaneously present the imagery and at least two of the classification, the confidence metric, or the motion vector while the endoscope images the cavity. The display may simultaneously present the imagery and the classification and at least one of the confidence metric or the motion vector while the endoscope images the cavity. - Some embodiments may include a method comprising: receiving, by a processor, an imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity; performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity; determining, by the processor, a confidence metric for the classification while the endoscope images the cavity; determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and taking, by the processor, an action based on the classification, the confidence metric, and the motion vector. The action may include deploying (e.g., moving, extending, adjusting, powering, grasping, cutting) an end effector within the cavity being imaged by the endoscope. For example, the end effector may be a component of a robotic arm (e.g., during a surgical procedure, an investigation of a cavity).
The action may be with respect to the anatomical feature within the cavity (e.g., contacting the anatomical feature by the end effector). The action may not be with respect to the anatomical feature within the cavity (e.g., it may involve another anatomical feature, an object internal to the cavity, or an object external to the cavity). The cavity may be a mammalian cavity or an inanimate cavity. The action may include requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity. The display may simultaneously present at least three of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity. The display may simultaneously present the imagery, the classification, the confidence metric, and the motion vector while the endoscope images the cavity. The display may simultaneously present the imagery and at least two of the classification, the confidence metric, or the motion vector while the endoscope images the cavity. The display may simultaneously present the imagery and the classification and at least one of the confidence metric or the motion vector while the endoscope images the cavity.
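As a final hedged sketch, a gate over the classification, the confidence metric, and the motion vector, as in the action-taking method above, might look as follows. The target label, confidence floor, and drift tolerance are illustrative assumptions, not values from the disclosure.

```python
def may_deploy(label, confidence, displacement_mm,
               target="bladder neck", min_confidence=0.9, max_drift_mm=2.0):
    """Return True only when all three gating conditions hold: the frame is
    classified as the target feature, the confidence metric is high enough,
    and the motion vector shows the tip has not drifted out of position."""
    return (label == target
            and confidence >= min_confidence
            and abs(displacement_mm) <= max_drift_mm)
```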
- Various embodiments of the present disclosure may be implemented in a data processing system suitable for storing and/or executing program code that includes at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements include, for instance, local memory employed during actual execution of the program code, bulk storage, and cache memory which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
- I/O devices (including, but not limited to, keyboards, displays, pointing devices, DASD, tape, CDs, DVDs, thumb drives and other memory media, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the available types of network adapters.
- The present disclosure may be embodied in a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure. The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. A code segment or machine-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, among others. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
- Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions. The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
- The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
- Words such as “then,” “next,” etc. are not intended to limit the order of the steps; these words are simply used to guide the reader through the description of the methods. Although process flow diagrams may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination may correspond to a return of the function to the calling function or the main function.
- Features or functionality described with respect to certain example embodiments may be combined and sub-combined with various other example embodiments. Also, different aspects and/or elements of example embodiments, as disclosed herein, may be combined and sub-combined in a similar manner as well. Further, some example embodiments, whether individually and/or collectively, may be components of a larger system, wherein other procedures may take precedence over and/or otherwise modify their application. Additionally, a number of steps may be required before, after, and/or concurrently with example embodiments, as disclosed herein. Note that any and/or all methods and/or processes, at least as disclosed herein, can be at least partially performed via at least one entity or actor in any manner.
- Although the foregoing has been described in some detail for purposes of clarity, it will be apparent that certain changes and modifications may be made without departing from the principles thereof. It should be noted that there are many alternative ways of implementing both the processes and apparatuses described herein. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the body of work described herein is not to be limited to the details given herein, which may be modified within the scope and equivalents of the appended claims.
Claims (31)
1. An imaging unit for an endoscopic procedure comprising:
a housing;
an imaging coupler for receiving imaging information from an imaging assembly of an endoscope, the imaging assembly having a field of view (FoV) comprising at least a portion of an end effector and a portion of a region of interest (ROI);
a display integrated into the housing;
an imaging processor configured with instructions to process the received imaging information into pixel values representing an image of a time series and to display the image in real-time on the display;
a motion sensor configured to detect a motion of the housing during the time series; and
a detection processing unit configured with instructions to:
classify at least one anatomical feature in each image of the time series based on an artificial intelligence classifier,
determine a confidence metric of the classification,
determine a motion vector based on the detected motion, and
display, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
2. The imaging unit according to claim 1 , wherein the motion sensor includes at least a gyroscope configured to generate a gyroscopic signal and an accelerometer configured to generate an acceleration signal, the detection processing unit further configured to determine a displacement vector based on at least the gyroscopic signal and the acceleration signal.
3. The imaging unit according to claim 2 , wherein the detection processing unit is configured to display the displacement vector concurrently with the corresponding classification.
4. The imaging unit according to claim 2 , wherein the detection processing unit is configured to display the displacement vector relative to one or more classified anatomical features.
5. The imaging unit according to claim 2 , wherein the detection processing unit is configured to display a plurality of displacement vectors each one relative to a unique classified anatomical feature.
6. The imaging unit according to claim 1 , wherein the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
7. The imaging unit according to claim 6 , wherein the detection processing unit determines the confidence metric based on the comparison.
8. The imaging unit according to claim 1 , wherein the detection processing unit is further configured to:
identify at least one treatment site based on the at least one classified anatomical feature; and
display, concurrently with the corresponding image, the at least one identified treatment site and a relative motion vector between the classified anatomical feature and the identified treatment site.
9. The imaging unit according to claim 1 , wherein the region of interest includes at least a prostatic urethra, an administered therapy includes a prostatic treatment, and the detection processing unit is further configured to classify a prostatic pathology.
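For orientation, the following is a minimal, non-authoritative Python sketch of the per-frame loop that a detection processing unit per claims 1-9 might run. Every identifier here (process_frame, FrameResult, the toy planar motion estimate) is an illustrative assumption rather than language from the specification, and a real unit would substitute the claimed convolutional neural network for the generic classifier callable.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class FrameResult:
    label: str          # classification of the anatomical feature
    confidence: float   # confidence metric of the classification
    motion: np.ndarray  # motion vector to draw over the image

def softmax(logits: np.ndarray) -> np.ndarray:
    z = np.exp(logits - logits.max())
    return z / z.sum()

def process_frame(pixels, gyro, accel, classifier, labels, dt):
    """Classify one image of the time series and derive a motion vector.

    classifier: any callable mapping a pixel array to per-class logits,
    standing in for the artificial intelligence classifier of claim 1.
    """
    probs = softmax(classifier(pixels))
    idx = int(probs.argmax())
    # Toy planar motion estimate from the housing's motion sensor:
    # angular rate scaled by dt plus twice-integrated acceleration.
    motion = gyro[:2] * dt + 0.5 * accel[:2] * dt * dt
    return FrameResult(labels[idx], float(probs[idx]), motion)
```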
10. A method for endoscopic imaging including:
operatively coupling an imaging coupler of an imaging unit to an observation port of an endoscope;
receiving imaging information from an imaging assembly of the endoscope, the imaging assembly having an FoV comprising at least a portion of an end effector and a portion of a ROI;
processing the received imaging information into pixel values representing an image of a time series;
displaying the image in real-time on a display integrated into a housing of the imaging unit;
detecting motion of the housing during the time series;
classifying at least one anatomical feature in each image of the time series based on an artificial intelligence classifier;
determining a confidence metric of the classification;
determining a motion vector based on the detected motion; and
displaying, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
11. The method according to claim 10 , wherein the step of detecting motion further includes generating a gyroscopic signal and an acceleration signal associated with the motion of the housing, the method further including:
determining a displacement vector based on at least the gyroscopic signal and the acceleration signal.
12. The method according to claim 11 , further including displaying the displacement vector concurrently with the corresponding classification.
13. The method according to claim 12 , further including displaying the displacement vector relative to one or more classified anatomical features.
14. The method according to claim 11 , further including displaying a plurality of displacement vectors each one relative to a unique classified anatomical feature.
15. The method according to claim 10 , wherein the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
16. The method according to claim 15 , wherein the confidence metric is based on the comparison.
17. The method according to claim 10 , further including:
identifying at least one treatment site based on the at least one classified anatomical feature; and
displaying, concurrently with the corresponding image, the at least one identified treatment site and the determined motion vector.
18. The method according to claim 17 , wherein the region of interest includes at least a prostatic urethra and an administered therapy includes a prostatic treatment, the method further including classifying a prostatic pathology.
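Claims 2-5 and 11-14 derive a displacement vector from the gyroscopic and acceleration signals. One plausible, deliberately simplified reading is dead reckoning: integrate angular rate into orientation, rotate body-frame acceleration into the world frame, then integrate twice. The Python sketch below assumes a single-axis orientation model and invented names; a real device would add drift correction and proper sensor fusion.

```python
import numpy as np

class DisplacementTracker:
    """Accumulates IMU samples into a displacement vector (a sketch)."""

    def __init__(self):
        self.velocity = np.zeros(3)
        self.position = np.zeros(3)
        self.heading = 0.0  # simplified single-axis orientation, radians

    def update(self, gyro_z: float, accel: np.ndarray, dt: float) -> np.ndarray:
        self.heading += gyro_z * dt  # integrate angular rate once
        c, s = np.cos(self.heading), np.sin(self.heading)
        rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        world_accel = rot @ accel            # body frame -> world frame
        self.velocity += world_accel * dt    # first integration
        self.position += self.velocity * dt  # second integration
        return self.position

    def relative_to(self, feature_position: np.ndarray) -> np.ndarray:
        """Displacement relative to one classified anatomical feature,
        as displayed per claims 4-5 and 13-14."""
        return self.position - feature_position
```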
19. A kit for an endoscopic therapeutic procedure, comprising:
an endoscopic imaging unit comprising:
a housing;
an imaging coupler for receiving imaging information from an imaging assembly of an endoscope, the imaging assembly having a field of view (FoV) comprising at least a portion of an end effector and a portion of a region of interest (ROI);
a display integrated into the housing;
an image processing unit;
a motion sensor configured to detect a motion of the housing during a time series;
a detection processing unit; and
instructions to perform the following method:
operatively coupling the imaging coupler of the imaging unit to an observation port of the endoscope,
receiving the imaging information from the imaging assembly,
processing the received imaging information into pixel values representing an image of the time series,
displaying the image in real-time on the display,
detecting motion of the housing during the time series,
classifying at least one anatomical feature in each image of the time series based on an artificial intelligence classifier,
determining a confidence metric of the classification,
determining a motion vector based on the detected motion, and
displaying, concurrently with the corresponding image, the classification of the at least one anatomical feature, the determined confidence metric, and the determined motion vector.
20. The kit according to claim 19 , wherein:
the step of detecting motion further includes generating a gyroscopic signal and an acceleration signal associated with the motion of the housing, the method further including determining a displacement vector based on at least the gyroscopic signal and the acceleration signal; and
the artificial intelligence classifier is a convolutional neural network configured to compare each image to an anatomical model.
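Claims 6, 15, and 20 recite a convolutional neural network that compares each image to an anatomical model, with the confidence metric based on that comparison (claims 7 and 16). The PyTorch sketch below shows one such classifier; the architecture, class count, and the use of the top softmax probability as the confidence metric are assumptions for illustration only, not the specification's design.

```python
import torch
import torch.nn as nn

class AnatomyClassifier(nn.Module):
    """A deliberately small CNN standing in for the claimed classifier."""

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

def classify_with_confidence(model: nn.Module, frame: torch.Tensor):
    """Return (class index, confidence); frame is a (3, H, W) tensor."""
    with torch.no_grad():
        probs = torch.softmax(model(frame.unsqueeze(0)), dim=1)[0]
        conf, idx = probs.max(dim=0)
    return int(idx), float(conf)
```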
21. A method comprising:
receiving, by a processor, imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity;
performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity;
determining, by the processor, a confidence metric for the classification while the endoscope images the cavity;
determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and
requesting, by the processor, a display to simultaneously present at least two of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
22. The method of claim 21 , wherein the display simultaneously presents at least three of the imagery, the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
23. The method of claim 22 , wherein the display simultaneously presents the imagery, the classification, the confidence metric, and the motion vector while the endoscope images the cavity.
24. The method of claim 21 , wherein the display simultaneously presents the imagery and at least two of the classification, the confidence metric, or the motion vector while the endoscope images the cavity.
25. The method of claim 21 , wherein the display simultaneously presents the imagery and the classification and at least one of the confidence metric or the motion vector while the endoscope images the cavity.
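Claims 21-25 turn on simultaneously presenting the imagery, the classification, the confidence metric, and the motion vector. One conventional way to realize that limitation is to burn the annotations into each live frame before display; the OpenCV sketch below does so, with layout constants and function names that are ours rather than the specification's.

```python
import cv2
import numpy as np

def compose_overlay(frame: np.ndarray, label: str, confidence: float,
                    motion: np.ndarray) -> np.ndarray:
    """Draw classification, confidence, and motion vector onto the frame."""
    out = frame.copy()
    cv2.putText(out, f"{label} ({confidence:.0%})", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    h, w = out.shape[:2]
    center = (w // 2, h // 2)
    tip = (int(center[0] + 40 * motion[0]), int(center[1] + 40 * motion[1]))
    cv2.arrowedLine(out, center, tip, (0, 0, 255), 2)  # motion vector arrow
    return out
```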
26. A method comprising:
receiving, by a processor, imagery from an endoscope imaging a cavity, wherein the imagery depicts an anatomical feature within the cavity;
performing, by the processor, a classification for the anatomical feature while the endoscope images the cavity;
determining, by the processor, a confidence metric for the classification while the endoscope images the cavity;
determining, by the processor, a motion vector for the endoscope imaging the cavity while the endoscope images the cavity; and
taking, by the processor, an action based on the classification, the confidence metric, and the motion vector.
27. The method of claim 26 , wherein the action includes deploying an end effector within the cavity being imaged by the endoscope.
28. The method of claim 26 , wherein the action is with respect to the anatomical feature within the cavity.
29. The method of claim 26 , wherein the action is not with respect to the anatomical feature within the cavity.
30. The method of claim 26 , wherein the cavity is a mammalian cavity.
31. The method of claim 26 , wherein the cavity is an inanimate cavity.
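Claims 26-31 add taking an action based on the classification, the confidence metric, and the motion vector together, up to deploying an end effector within the imaged cavity. A hedged sketch of such gating logic follows; the thresholds, labels, and action names are invented for illustration and nothing here is the claimed decision rule.

```python
import numpy as np

CONF_FLOOR = 0.90   # assumed minimum confidence before acting
MOTION_CEIL = 0.5   # assumed maximum tolerated motion magnitude (a.u.)

def decide_action(label: str, confidence: float, motion: np.ndarray,
                  target: str = "treatment_site") -> str:
    """Gate an action on all three inputs, as claim 26 requires."""
    speed = float(np.linalg.norm(motion))
    if label == target and confidence >= CONF_FLOOR and speed <= MOTION_CEIL:
        return "deploy_end_effector"   # cf. claim 27
    if speed > MOTION_CEIL:
        return "hold_steady"           # scope moving too fast to act safely
    return "continue_imaging"
```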
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/572,332 US20230218146A1 (en) | 2022-01-10 | 2022-01-10 | Systems, apparatuses, and methods for endoscopy |
PCT/US2023/010457 WO2023133339A1 (en) | 2022-01-10 | 2023-01-10 | Systems, apparatuses, and methods for endoscopy |
US18/103,022 US11864730B2 (en) | 2022-01-10 | 2023-01-30 | Systems, apparatuses, and methods for endoscopy |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/572,332 US20230218146A1 (en) | 2022-01-10 | 2022-01-10 | Systems, apparatuses, and methods for endoscopy |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/103,022 Continuation US11864730B2 (en) | 2022-01-10 | 2023-01-30 | Systems, apparatuses, and methods for endoscopy |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230218146A1 true US20230218146A1 (en) | 2023-07-13 |
Family
ID=85221850
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/572,332 Pending US20230218146A1 (en) | 2022-01-10 | 2022-01-10 | Systems, apparatuses, and methods for endoscopy |
US18/103,022 Active US11864730B2 (en) | 2022-01-10 | 2023-01-30 | Systems, apparatuses, and methods for endoscopy |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/103,022 Active US11864730B2 (en) | 2022-01-10 | 2023-01-30 | Systems, apparatuses, and methods for endoscopy |
Country Status (2)
Country | Link |
---|---|
US (2) | US20230218146A1 (en) |
WO (1) | WO2023133339A1 (en) |
Family Cites Families (75)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5861210U (en) | 1981-10-22 | 1983-04-25 | 富士写真光機株式会社 | Endoscope camera adapter |
US5311859A (en) | 1992-09-11 | 1994-05-17 | Welch Allyn, Inc. | Add-on video camera arrangement for optical laparoscope |
US5822546A (en) | 1996-03-08 | 1998-10-13 | George; Stanley W. | Hand held docking station with deployable light source, rechargeable battery pack and recessed grip, for connecting to a palm top computer |
US5808813A (en) | 1996-10-30 | 1998-09-15 | Smith & Nephew, Inc. | Optical coupler |
US6657654B2 (en) | 1998-04-29 | 2003-12-02 | International Business Machines Corporation | Camera for use with personal digital assistants with high speed communication link |
US20020103420A1 (en) | 2001-01-26 | 2002-08-01 | George Coleman | Endoscope with alterable viewing angle |
US6692431B2 (en) | 2001-09-07 | 2004-02-17 | Smith & Nephew, Inc. | Endoscopic system with a solid-state light source |
US8723936B2 (en) | 2002-03-12 | 2014-05-13 | Karl Storz Imaging, Inc. | Wireless camera coupling with rotatable coupling |
ATE543426T1 (en) | 2002-03-22 | 2012-02-15 | Ethicon Endo Surgery Inc | INTEGRATED VISUALIZATION SYSTEM |
US6646866B2 (en) | 2002-03-27 | 2003-11-11 | Chi-Lie Kao | Protective case for a tablet personal computer |
AUPS219002A0 (en) | 2002-05-08 | 2002-06-06 | Lion Eye Institute, The | Digital hand-held imaging device |
US6952343B2 (en) | 2002-06-11 | 2005-10-04 | Fujitsu Limited | Functional expansion apparatus and method for attaching electronic apparatus to the functional expansion apparatus |
JP3869324B2 (en) * | 2002-06-26 | 2007-01-17 | オリンパス株式会社 | Image processing device for fluorescence observation |
CN101264001B (en) * | 2003-04-25 | 2010-11-10 | 奥林巴斯株式会社 | Image display apparatus |
NL1024115C2 (en) | 2003-08-15 | 2004-10-01 | P J M J Beheer B V | Device is for coupling endoscope to mobile picture telephone and is recorded by a camera contained in housing of telephone |
EP1709474A4 (en) | 2003-09-26 | 2010-01-06 | Tidal Photonics Inc | Apparatus and methods relating to color imaging endoscope systems |
EP1645219B1 (en) | 2004-02-16 | 2016-11-09 | Olympus Corporation | Endoscope system |
DE102004009384B4 (en) | 2004-02-26 | 2005-12-22 | Olympus Winter & Ibe Gmbh | Video endoscopic system |
US20070268280A1 (en) * | 2004-08-23 | 2007-11-22 | Manabu Fujita | Image Display Apparatus, Image Display Method, and Image Display Program |
US8858425B2 (en) | 2004-09-24 | 2014-10-14 | Vivid Medical, Inc. | Disposable endoscope and portable display |
US8029439B2 (en) | 2005-01-28 | 2011-10-04 | Stryker Corporation | Disposable attachable light source unit for an endoscope |
JP4418400B2 (en) * | 2005-05-20 | 2010-02-17 | オリンパスメディカルシステムズ株式会社 | Image display device |
CA2509590A1 (en) | 2005-06-06 | 2006-12-06 | Solar International Products Inc. | Portable imaging apparatus |
WO2007023771A1 (en) * | 2005-08-22 | 2007-03-01 | Olympus Corporation | Image display device |
US20100145146A1 (en) | 2005-12-28 | 2010-06-10 | Envisionier Medical Technologies, Inc. | Endoscopic digital recording system with removable screen and storage device |
US7849250B2 (en) | 2006-10-31 | 2010-12-07 | Sonosite, Inc. | Docking station with hierarchal battery management for use with portable medical equipment |
US20080183910A1 (en) | 2006-12-28 | 2008-07-31 | Natoli Joseph D | Personal medical device (PMD) docking station |
JP2009050321A (en) * | 2007-08-23 | 2009-03-12 | Olympus Corp | Image processor |
US8367235B2 (en) | 2008-01-18 | 2013-02-05 | Mophie, Inc. | Battery pack, holster, and extendible processing and interface platform for mobile devices |
WO2009135255A1 (en) | 2008-05-07 | 2009-11-12 | Signostics Pty Ltd | Docking system for medical diagnostic scanning using a handheld device |
WO2009137114A2 (en) | 2008-05-09 | 2009-11-12 | Ipowerup, Inc. | Portable and universal hybrid-charging apparatus for portable electronic devices |
CN101721199B (en) * | 2008-10-14 | 2012-08-22 | 奥林巴斯医疗株式会社 | Image display device and image display method |
US7782610B2 (en) | 2008-11-17 | 2010-08-24 | Incase Designs Corp. | Portable electronic device case with battery |
CN102421350B (en) * | 2009-03-11 | 2014-12-17 | 奥林巴斯医疗株式会社 | Image processing system, external device therefor, and image processing method therefor |
US20120162401A1 (en) * | 2009-04-20 | 2012-06-28 | Envisionier Medical Technologies, Inc. | Imaging system |
US20100279418A1 (en) | 2009-05-04 | 2010-11-04 | Loren Robert Larson | Glucose meter adaptable for use with handheld devices, and associated communication network |
US20110015496A1 (en) | 2009-07-14 | 2011-01-20 | Sherman Lawrence M | Portable medical device |
US9258394B2 (en) | 2009-11-12 | 2016-02-09 | Arun Sobti & Associates, Llc | Apparatus and method for integrating computing devices |
US20110195753A1 (en) | 2010-02-11 | 2011-08-11 | Jason Mock | Smartphone Case with LEDS |
JP5242852B2 (en) * | 2010-09-28 | 2013-07-24 | オリンパスメディカルシステムズ株式会社 | Image display device and capsule endoscope system |
US8711552B2 (en) | 2010-10-06 | 2014-04-29 | Compal Electronics Inc. | Modular system having expandable form factor |
US10142448B2 (en) | 2011-03-04 | 2018-11-27 | Blackberry Limited | Separable mobile device having a control module and a docking station module |
WO2012154578A1 (en) | 2011-05-06 | 2012-11-15 | The Trustees Of The University Of Pennsylvania | Ped - endoscope image and diagnosis capture system |
US20150073285A1 (en) | 2011-05-16 | 2015-03-12 | Alivecor, Inc. | Universal ecg electrode module for smartphone |
US20120320340A1 (en) | 2011-06-18 | 2012-12-20 | Intuitive Medical Technologies, Llc | Smart-phone adapter for ophthalmoscope |
US20130083185A1 (en) | 2011-09-30 | 2013-04-04 | Intuitive Medical Technologies, Llc | Optical adapter for ophthalmological imaging apparatus |
EP3584799B1 (en) | 2011-10-13 | 2022-11-09 | Masimo Corporation | Medical monitoring hub |
US10029079B2 (en) | 2011-10-18 | 2018-07-24 | Treble Innovations | Endoscopic peripheral |
US20130102359A1 (en) | 2011-10-20 | 2013-04-25 | Tien-Hwa Ho | Smart phone-combinable otologic inspection device |
AU2012335072B2 (en) | 2011-11-09 | 2016-09-08 | Welch Allyn, Inc. | Digital-based medical devices |
US20130281155A1 (en) | 2012-03-22 | 2013-10-24 | Kyocera Corporation | System, electronic device, and charger |
DE102012206413A1 (en) | 2012-04-18 | 2013-10-24 | Karl Storz Gmbh & Co. Kg | Rotary device and method for rotating an endoscope |
USD710856S1 (en) | 2012-05-23 | 2014-08-12 | Isaac S. Daniel | Slideable cover which includes biometric verification means for a tablet device |
US9310300B2 (en) | 2012-08-03 | 2016-04-12 | Ingeneron Incorporated | Compact portable apparatus for optical assay |
US20140073969A1 (en) | 2012-09-12 | 2014-03-13 | Neurosky, Inc. | Mobile cardiac health monitoring |
US9107573B2 (en) | 2012-10-17 | 2015-08-18 | Karl Storz Endovision, Inc. | Detachable shaft flexible endoscope |
US20140140049A1 (en) | 2012-11-16 | 2014-05-22 | Manuel Cotelo | Electronic device case with illuminated magnifier |
US9451874B2 (en) | 2012-11-16 | 2016-09-27 | Clearwater Clinical Limited | Adapter to couple a mobile phone to an endoscope |
US9642563B2 (en) | 2012-12-18 | 2017-05-09 | Crawford Capital Investments, Llc | Glucose monitoring device in a protective smartphone case |
US20140195180A1 (en) | 2013-01-04 | 2014-07-10 | 1 Oak Technologies, LLC | Electronic device power management |
US20140200054A1 (en) | 2013-01-14 | 2014-07-17 | Fraden Corp. | Sensing case for a mobile communication device |
US20140249405A1 (en) | 2013-03-01 | 2014-09-04 | Igis Inc. | Image system for percutaneous instrument guidence |
US20140364711A1 (en) | 2013-03-27 | 2014-12-11 | AkibaH Health Corporation | All-in-one analyte sensor in a detachable external mobile device case |
US9092300B2 (en) | 2013-04-18 | 2015-07-28 | Ottr Products, Llc | Peripheral device and method for updating firmware thereof |
US9075906B2 (en) | 2013-06-28 | 2015-07-07 | Elwha Llc | Medical support system including medical equipment case |
US8928746B1 (en) | 2013-10-18 | 2015-01-06 | Stevrin & Partners | Endoscope having disposable illumination and camera module |
US20170099479A1 (en) | 2014-05-20 | 2017-04-06 | University Of Washington Through Its Center For Commercialization | Systems and methods for mediated-reality surgical visualization |
WO2015191954A1 (en) | 2014-06-12 | 2015-12-17 | Endoluxe Inc. | Encasement platform for smartdevice for attachment to endoscope |
JP6234333B2 (en) | 2014-06-25 | 2017-11-22 | オリンパス株式会社 | Imaging system and imaging method |
US11330963B2 (en) | 2015-11-16 | 2022-05-17 | Lazurite Holdings Llc | Wireless medical imaging system |
US10499792B2 (en) | 2016-03-25 | 2019-12-10 | University Of Washington | Phone adapter for flexible laryngoscope and rigid endoscopes |
US10051166B2 (en) | 2016-04-27 | 2018-08-14 | Karl Storz Imaging, Inc. | Light device and system for providing light to optical scopes |
JP2021527522A (en) * | 2018-06-21 | 2021-10-14 | プロセプト バイオロボティクス コーポレイション | Surgical Robot Artificial Intelligence for Surgical Surgery |
WO2020257533A1 (en) * | 2019-06-19 | 2020-12-24 | Neotract, Inc. | Devices and methods for targeting implant deployment in tissue |
US20230014490A1 (en) * | 2020-03-21 | 2023-01-19 | Smart Medical Systems Ltd. | Artificial intelligence detection system for mechanically-enhanced topography |
2022
- 2022-01-10 US US17/572,332 patent/US20230218146A1/en active Pending
2023
- 2023-01-10 WO PCT/US2023/010457 patent/WO2023133339A1/en unknown
- 2023-01-30 US US18/103,022 patent/US11864730B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US11864730B2 (en) | 2024-01-09 |
WO2023133339A1 (en) | 2023-07-13 |
US20230248211A1 (en) | 2023-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11600385B2 (en) | Medical image processing device, endoscope system, diagnosis support method, and program | |
EP3845194B1 (en) | Analyzing surgical trends by a surgical system and providing user recommandations | |
JP4994737B2 (en) | Medical image processing apparatus and medical image processing method | |
US20070015989A1 (en) | Endoscope Image Recognition System and Method | |
Gulati et al. | The future of endoscopy: Advances in endoscopic image innovations | |
EP3936026B1 (en) | Medical image processing device, processor device, endoscopic system, medical image processing method, and program | |
US20210361142A1 (en) | Image recording device, image recording method, and recording medium | |
EP3994702B1 (en) | Surgery support system, surgery support method, information processing apparatus, and information processing program | |
EP4091532A1 (en) | Medical image processing device, endoscope system, diagnosis assistance method, and program | |
WO2014069608A1 (en) | Examination information management device and examination information management system | |
Fonolla et al. | Automatic image and text-based description for colorectal polyps using BASIC classification | |
US11864730B2 (en) | Systems, apparatuses, and methods for endoscopy | |
US20220338717A1 (en) | Endoscopic examination support device, endoscopic examination support method, and endoscopic examination support program | |
US20230122835A1 (en) | Methods and systems for generating clarified and enhanced intraoperative imaging data | |
US20230122179A1 (en) | Procedure guidance for safety | |
EP4290529A1 (en) | Method for training artificial neural network having use for detecting prostate cancer from turp pathological images, and computing system performing same | |
US20210241457A1 (en) | Endoscope system, and image processing apparatus and image processing method used in endoscope system | |
Gettman et al. | Initial experimental evaluation of wireless capsule endoscopes in the bladder: implications for capsule cystoscopy | |
US20230306592A1 (en) | Image processing device, medical diagnosis device, endoscope device, and image processing method | |
WO2024004013A1 (en) | Program, information processing method, and information processing device | |
US20230000319A1 (en) | Method and apparatus for biometric tissue imaging | |
Boini et al. | Scoping review: autonomous endoscopic navigation | |
WO2023181353A1 (en) | Image processing device, image processing method, and storage medium | |
WO2023162216A1 (en) | Image processing device, image processing method, and storage medium | |
Swamy et al. | Design and Development of Innovative Integrated Technology for Endoscopic Surgeries |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ENDOLUXE INC., PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PATEL, NEAL;ZHAO, PHILIP;BREAM, DEVON;REEL/FRAME:058730/0117 Effective date: 20220110 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |