CN115708658A - Panoramic endoscope and image processing method thereof - Google Patents


Publication number
CN115708658A
Authority
CN
China
Prior art keywords
endoscope
camera
panoramic
polyp
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110970203.3A
Other languages
Chinese (zh)
Inventor
杜武华
徐健玮
佴广金
钱大宏
黄显峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Weigao Hongrui Medical Technology Co Ltd
Original Assignee
Shandong Weigao Hongrui Medical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Weigao Hongrui Medical Technology Co Ltd filed Critical Shandong Weigao Hongrui Medical Technology Co Ltd
Priority to CN202110970203.3A priority Critical patent/CN115708658A/en
Priority to PCT/CN2022/102948 priority patent/WO2023024701A1/en
Publication of CN115708658A publication Critical patent/CN115708658A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/005: Flexible endoscopes
    • A61B1/04: ... combined with photographic or television appliances
    • A61B1/05: ... characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B1/06: ... with illuminating arrangements
    • A61B1/07: ... with illuminating arrangements using light-conductive means, e.g. optical fibres
    • A61B1/31: ... for the rectum, e.g. proctoscopes, sigmoidoscopes, colonoscopes

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The application discloses a panoramic endoscope comprising a main camera arranged on the front face of the endoscope tip and a plurality of auxiliary cameras arranged on the side wall of the tip, wherein the field of view of the main camera and the fields of view of the auxiliary cameras together cover a spherical field of view. By pairing the auxiliary cameras with the main camera to form a spherical field of view, the panoramic endoscope observes the regions directly ahead of, beside, and behind the lens simultaneously, overcoming the visual blind zones caused by occlusion from the intestinal wall and reducing missed diagnoses of polyps. Polyps appearing behind the lens can be captured in time by an auxiliary camera, so the operator does not need to manipulate the snake bone to bend the endoscope lens backward for observation, which simplifies insertion and withdrawal of the endoscope.

Description

Panoramic endoscope and image processing method thereof
Technical Field
The invention belongs to the technical field of medical detection, and particularly relates to a panoramic endoscope and a medical image identification and monitoring system.
Background
Endoscopy refers to the examination and screening of the interior of the human body, within a hollow organ or cavity, by a medical device (an endoscope) for medical purposes. Unlike most other medical imaging devices, endoscopes are inserted directly into the organ and typically use optical cameras operating at or near visible frequencies (or, for example, infrared) to produce images and video frames from inside the organ. Typically, endoscopy is used to examine the organ under test for evidence of suspicious local regions such as polyps, tumors, or cancer cells.
Colorectal cancer is the second most lethal cancer in China and the third most lethal in the world. Early detection and resection of colorectal polyps is critical to reducing mortality in patients with colorectal cancer. Colonoscopy is the endoscopic examination of the large intestine (large bowel) and the distal part of the small bowel with an optical camera, usually a CCD or CMOS camera, carried on an optical fiber or flexible tube passed through the anus. As the gold standard for colorectal cancer screening, colonoscopy can reduce the risk of death from colorectal cancer through early detection of tumors and removal of precancerous lesions. It provides a visual diagnosis (e.g., of ulcers and polyps) and an opportunity for biopsy or resection of suspected colorectal cancer lesions.
A major public-health interest of colonoscopy is the removal of polyps as small as 1 millimeter (mm) or less. Once removed, polyps can be studied under a microscope to determine whether they are precancerous. Polyps take 15 years or less to become cancerous. For every 1.0% increase in the Adenoma Detection Rate (ADR), the risk of colorectal cancer is reduced by 3.0%; conversely, a missed adenoma may allow the tumor to develop further and delay treatment. The miss rate of polyp detection is estimated at 4% to 12%, although recent clinical findings show it can be as high as 25%. A missed colorectal cancer may progress to metastatic colon cancer, for which survival is below 10%. The adenoma detection rate depends heavily on the quality of bowel preparation and on the physician's experience, and is therefore highly subjective. Endoscopy thus faces two problems: (1) a polyp appears within the endoscope's field of view but is not noticed by the physician; (2) a polyp never appears in the endoscope's field of view at all.
Since the introduction of deep neural networks, many studies on artificial-intelligence-assisted endoscopic analysis have been reported. However, these focus primarily on assisting the polyp detection task and address only problem (1). Current computer-aided polyp detection systems can only help physicians find polyps that already appear in the endoscope's field of view; they cannot compensate for blind zones caused by limits of the physician's operating experience and the complex intestinal environment, so some polyps never enter the field of view and are missed. How to achieve blind-zone quality control of endoscopy through a new hardware design and a computer-aided system is therefore a problem worth considering.
Disclosure of Invention
An object of the present application is to provide a panoramic endoscope that helps the physician find, in a timely manner, polyps that would otherwise not appear in the field of view because of occlusion by intestinal wall folds.
The application discloses a panoramic endoscope comprising:
a main camera arranged on the front face of the tip of the endoscope; and
a plurality of auxiliary cameras arranged on the side wall of the tip;
wherein the field of view of the main camera and the fields of view of the plurality of auxiliary cameras together cover a spherical field of view.
In a preferred embodiment, the endoscope further comprises:
a main illumination lens arranged on the front face of the tip, connected to the light source through a light guide bundle, to provide illumination for the main camera; and
auxiliary illumination lenses arranged on the side wall of the tip, connected to the light source through light guide bundles, to provide illumination for the auxiliary cameras.
In a preferred example, the main camera has a higher pixel count than the auxiliary cameras.
In a preferred embodiment, the view angle of the spherical view is not less than 330 degrees.
The application also discloses a panoramic endoscope image processing method, comprising:
synchronously acquiring images from the main camera and the plurality of auxiliary cameras of the panoramic endoscope described above;
performing polyp detection separately on the images synchronously acquired from the main camera and the plurality of auxiliary cameras; and
if a polyp is detected, issuing a polyp alarm and outputting prompt information identifying the camera whose image contains the detected polyp.
In a preferred embodiment, the step of performing polyp detection on the images synchronously acquired from the main camera and the plurality of auxiliary cameras includes center enhancement, denoising, and color rendering to improve image quality and recognizability.
In a preferred embodiment, when a polyp is detected, the alert mode includes at least one of: highlighting, framing, circling, flashing, or audio prompting on an external monitor screen.
In a preferred example, the images collected by the main camera and the auxiliary cameras are captured in white light mode.
In a preferred embodiment, the captured image is input to a trained object detection model to detect whether polyps are present in real time;
the object detection model comprises a first encoder and a detector;
the first encoder is configured to perform feature extraction on an image obtained in a white light mode to obtain a feature map;
the detector is configured to regress the input feature map to obtain coordinates of a location where a polyp is located.
In a preferred embodiment, polyp detection is performed by a convolutional neural network, and the extracted target features include at least one of: vessel color features, duct features, edge features.
The panoramic endoscope and the medical image identification and monitoring system have at least the following technical effects:
1. The multiple auxiliary cameras cooperate with the main camera to form a spherical field of view, observing the regions directly ahead of, beside, and behind the lens simultaneously, thereby overcoming visual blind zones caused by intestinal wall occlusion and reducing missed polyp diagnoses;
2. Polyps appearing behind the lens can be captured in time by an auxiliary camera without the operator manipulating the snake bone to bend the endoscope lens backward for observation, simplifying insertion and withdrawal of the endoscope.
This specification describes many technical features distributed across various technical solutions; listing every possible combination of features (i.e., every technical solution) would make the description excessively long. To avoid this, the technical features disclosed in the summary above, in the embodiments and examples below, and in the drawings may be freely combined with one another to form new technical solutions (all of which should be regarded as described in this specification), unless such a combination is technically infeasible. For example, if one example discloses features A+B+C and another example discloses features A+B+D+E, where C and D are equivalent technical means for the same purpose (so only one of them would be used, never both) and feature E can technically be combined with feature C, then the solution A+B+C+D should not be regarded as described (because it is technically infeasible), whereas the solution A+B+C+E should be regarded as described.
Drawings
FIG. 1 is a schematic view of a panoramic endoscope according to the present application;
FIG. 2 is a schematic view of a light source of a panoramic endoscope according to one embodiment of the present application;
FIG. 3 is a schematic diagram of the operation of a panoramic endoscope according to an embodiment of the present application;
FIG. 4 is a perspective schematic view of a panoramic endoscope according to an embodiment of the present application;
FIG. 5 is a flow diagram of a medical image identification monitoring system according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a panoramic endoscope including a structured light pathway according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an object detection model of a lesion absolute size measurement method under a panoramic endoscope according to the present application;
fig. 8 is a schematic structural diagram of a U-NET network based on an encoder-decoder structure according to the focus absolute size measurement method under the panoramic endoscope of the present application;
fig. 9 is a schematic view of a lesion size calculation model of a lesion absolute size measuring method under a panoramic endoscope according to the present application.
Description of reference numerals:
1-panoramic endoscope
10-tip
110-main camera
111-main illumination lens
112-main light guide bundle
1121-main light guide bundle interface
120-first auxiliary camera
121-first auxiliary illumination lens
122-secondary light guide bundle
1221-secondary light guide bundle interface
130-second auxiliary camera
131-second auxiliary illumination lens
140-third auxiliary camera
141-third auxiliary illumination lens
2-light box
201-light source
300-structured light pathway
31-structured light projector
32-optical fiber
33-coupler
34-structured light generator
Detailed Description
In the following description, numerous technical details are set forth in order to provide a better understanding of the present application. However, it will be understood by those skilled in the art that the technical solutions claimed in the present application may be implemented without these technical details and with various changes and modifications based on the following embodiments.
Terms
As used herein, "panoramic endoscope," "endoscope," are used interchangeably to refer to a panoramic endoscope of the present invention.
Spherical field of view: the spherical field of view referred to in this application is a spherical or nearly spherical field of view centered on a point near the main camera and auxiliary cameras at the tip 10 of the panoramic endoscope. Since a sphere can be composed of an infinite number of spherical sectors, the spherical field of view of this application can also be pieced together from a specified number of spherical sectors.
A first embodiment of the present invention relates to a panoramic endoscope 1, as shown in fig. 1. The panoramic endoscope 1 has a bendable tip 10, optionally formed from a snake bone. A plurality of cameras are arranged on the tip 10: specifically, one main camera 110, whose shooting direction faces the advancing direction of the panoramic endoscope 1, and a plurality of auxiliary cameras arranged evenly around the side wall of the tip, optionally 2 to 6 of them, preferably 3 (as shown in fig. 1): a first auxiliary camera 120, a second auxiliary camera 130, and a third auxiliary camera 140.
As shown in fig. 4, the field of view of the main camera and the fields of view of the plurality of auxiliary cameras collectively cover one spherical field of view. Referring to the left half of fig. 4, in one embodiment the main camera alone covers a spherical sector with an apex angle of 140 degrees (the main camera can be regarded as the sphere center, where the apex angle is located); with three auxiliary cameras present, the field of view of each auxiliary camera (e.g., the first auxiliary camera 120) covers a spherical sector with an apex angle of 120 degrees, so that the combined fields of view of the main camera and the auxiliary cameras can cover a spherical field of view of not less than 330 degrees.
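As a rough plausibility check on the coverage claim above, the solid angle of a spherical sector with full apex angle θ is 2π(1 − cos(θ/2)) steradians. The sketch below (a simplifying illustration that ignores the overlap between sectors and the physical offset between the cameras, neither of which the patent quantifies) shows that the nominal sum of one 140-degree main sector and three 120-degree side sectors exceeds the full sphere:

```python
import math

def cap_solid_angle(apex_deg):
    """Solid angle (steradians) of a spherical sector whose full apex angle
    at the sphere centre is apex_deg: Omega = 2*pi*(1 - cos(apex/2))."""
    return 2 * math.pi * (1 - math.cos(math.radians(apex_deg) / 2))

main = cap_solid_angle(140)      # forward-facing main camera, ~4.13 sr
subs = 3 * cap_solid_angle(120)  # three side cameras, ~9.42 sr in total
sphere = 4 * math.pi             # full sphere, ~12.57 sr

print(main + subs >= sphere)  # prints True: nominal sum exceeds the sphere
```

Since the nominal total exceeds 4π, full spherical coverage is geometrically possible provided the sector overlaps are arranged to fill the gaps.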
A main illumination lens 111 is provided around the main camera 110, and one or more auxiliary illumination lenses are provided around each auxiliary camera to provide side illumination, such as the first auxiliary illumination lens 121, the second auxiliary illumination lens 131, and the third auxiliary illumination lens 141 of fig. 2. The main and auxiliary illumination lenses are connected, through a main light guide bundle and secondary light guide bundles, to the main and secondary light guide bundle interfaces of the light box. Optionally, the light source in the light box can be a xenon lamp, a laser, or an LED, with adjustable light intensity and switchable illumination spectra. The main light guide bundle interface can deliver strong light and switch to NBI or multiple spectra; the secondary light guide bundles carry weaker light, provide only auxiliary illumination, and do not need spectrum switching.
In one embodiment, the secondary light beams are generated by splitting the main light beam. The splitting means may be a beam splitter, which distributes the light of the main beam to the secondary light guide bundle links in proportion to the required optical power.
The main camera and the auxiliary cameras differ in image quality. The main camera 110 is a high-definition camera, optionally of 2 megapixels, with a resolution of 1920 × 1080; the three auxiliary cameras are small standard-definition cameras, optionally of 0.1 to 2 megapixels, preferably 0.16 megapixels, with a resolution of 480 × 320. As technology develops, cameras will become ever smaller and cheaper, and the resolutions of both the main and auxiliary cameras can be further increased; however, the resolution of the main camera is typically higher than that of the auxiliary cameras.
As shown in fig. 3, in use the tip 10 is inserted into the body, and the camera set is electrically connected to an external processor, so that a plurality of images spanning a spherical viewing angle of not less than 330° are acquired in real time and transmitted to the external image processor.
Another embodiment of the present invention relates to a panoramic endoscope image processing method.
In this embodiment, the panoramic endoscope of the present invention is used to acquire images during a colon examination and to identify the locations of different types of suspicious tissue, in this case polyps. Detectable polyps include large and medium polyps, small polyps and, most importantly, flat polyps. Beyond polyps, the system can identify other suspicious tissue anywhere in the colon, including on the side of a colon fold. Detection of polyps or other suspicious tissue is performed online during actual colonoscopy, sigmoidoscopy, and similar procedures. In one embodiment, the plurality of images are acquired from at least one of the following locations in the patient's body: abdominal cavity, esophagus, stomach, nasal cavity, trachea, bronchi, uterine cavity, or vagina.
Step 501 of the method: synchronously acquire images with the main camera and the plurality of auxiliary cameras of the panoramic endoscope.
Step 502: polyp detection is performed on images synchronously acquired from the main camera and the plurality of sub-cameras respectively. In some embodiments, the images synchronously acquired by the main camera and the multiple auxiliary cameras are further subjected to one or more of the following pre-processing: a histogram improvement algorithm; adaptively enhancing contrast, brightness and color normalization of the image frame according to predefined criteria; improving the super-resolution of the image frame; the unbalanced scaling of the luminance and color channels in the image frame to obtain a dominant color frequency, wherein each color channel is individually equalized and controlled to eliminate noise enhancement; applying signal noise measurements and reducing noise or filtering frames accordingly; verifying that the image is in focus and filtering unfocused frames; or any combination thereof.
Step 503: if a polyp is detected, issue a polyp alarm and output prompt information identifying the camera whose image contains the detected polyp. In one embodiment, when a polyp is detected in an image captured by one of the auxiliary cameras (e.g., the first auxiliary camera 120), prompt information for the first auxiliary camera 120 is output to the medical operator via an output device (e.g., a display). When a polyp is detected, the alert mode includes at least one of: highlighting, framing, circling, flashing, or an audio prompt on an external monitor screen. In subsequent interaction with the I/O device, the operator can steer the main camera to the alerted area to obtain a higher-resolution image for observation.
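The per-camera detection and alert flow of steps 502 and 503 can be sketched as follows. The camera names and the `Detection` record are illustrative placeholders, and `detect` stands in for the trained polyp detector described later:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    camera: str   # e.g. "main", "sub-1", "sub-2", "sub-3" (illustrative names)
    bbox: tuple   # (x, y, w, h) in that camera's frame

def route_alerts(frames, detect):
    """Run the detector on each camera's frame and collect alert records
    naming the camera in which a polyp was found."""
    alerts = []
    for camera, frame in frames.items():
        for bbox in detect(frame):
            alerts.append(Detection(camera, bbox))
    return alerts
```

Each returned record tells the operator which camera saw the polyp, so the main camera can then be steered toward that region.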
Optionally, in one embodiment, the convolutional neural network model comprises three models, trained on a qualified-image library, an anatomical-site library, and a site-feature library, used respectively to determine whether an endoscopic image is qualified, to judge the anatomical site, and to identify site features and determine the extent of a cancer. The model is a ResNet-50, developed in Python and packaged as a RESTful API (a REST-style network interface) that other modules can call.
The panoramic endoscope captures images in white light mode, and the captured images are input to the trained target detection model to detect in real time whether polyps appear.
If polyps are detected, the polyp regions are automatically segmented in real time using a deep neural network based on an encoder-decoder structure.
In one embodiment, the panoramic endoscope also has a structured light pathway 300, i.e., another pathway than the ordinary illumination light pathway, which can be used to measure lesion size under the endoscope. As shown in FIG. 6, the structured light pathway 300 includes a structured light generator 34, a coupler 33, an optical fiber 32, and a structured light projector 31.
The structured light generator 34 may be chosen to produce visible light at wavelengths between 400 and 700 nm, or light whose passband width is below 5% of the center wavelength; the actual choice can be determined experimentally so as to minimize reflections and obtain the clearest structured-light image.
The coupler 33 is configured to couple the light source 201 to the optical fiber 32; the structured light generator 34 is configured to be cut in and out between the coupler 33 and the light source 201; one end of the optical fiber 32 is connected to the coupler 33 and the other end to the structured light projector 31; the main illumination lens 111 and the structured light projector 31 are disposed at the tip of the endoscope. The structured light passes through the coupler 33 and the optical fiber 32 and is projected onto the tissue surface by the structured light projector 31. After being projected onto the surface of the measured object, the structured light is modulated by the object's height; the modulated structured light is collected by the camera system and transmitted to the computer, where analysis and calculation yield the three-dimensional data of the measured object.
Thus, the illumination of the panoramic endoscope has the following two modes: a structured light mode and a white light mode.
When switched to structured light mode, the illumination path of the illumination lens is blocked, and the structured light generator 34 cuts in between the coupler 33 and the light source 201 to generate structured light;
when switched to white light mode, both the structured light pathway 300 and the illumination-lens pathway provide ordinary illumination light.
The under-endoscope lesion absolute-size measurement system is then switched to structured light mode to perform structured-light imaging.
The acquired structured light image is then analyzed using an image processing system to calculate polyp size in conjunction with the polyp segmentation results.
Optionally, in one embodiment, the object detection model includes a first encoder and a detector; the first encoder is configured to perform feature extraction on an image obtained in the white light mode to obtain a feature map; the extracted target features include at least one of: vessel color features, duct features, edge features. The detector is configured to regress the input feature map, resulting in coordinates of where the polyp is located.
Optionally, in one embodiment, the deep neural network based on the encoder-decoder structure comprises:
a second encoder for the input image, comprising convolutional, activation, and pooling layers;
a decoder for the output result, comprising convolutional, activation, concatenation, and upsampling layers;
and cross-layer feature fusion between the second encoder and the decoder, which fuses together feature maps whose size difference is smaller than a preset threshold.
Optionally, in one embodiment, in the target detection model shown in fig. 7, an image x captured under white light is taken as input, the encoder performs feature extraction, and the resulting feature map is regressed by the detector to obtain the coordinates y of the polyp's location. The encoder and the detector are composed of deep neural networks comprising convolutional, downsampling, upsampling, pooling, batch normalization, and activation layers, among others. With this target detection model, when the system detects the presence of a polyp, it prompts the physician by drawing a red frame on the monitor around the rectangular region where the polyp is located and sounding an alarm. The system then performs a fine segmentation of the polyp region.
The fine segmentation of polyp regions is based on a deep neural network with an encoder-decoder architecture. The left half is the encoder, comprising convolutional, activation, and pooling layers. The right half is the decoder, comprising convolutional, activation, concatenation, and upsampling layers. Between the encoder and the decoder, feature maps of similar size are fused together directly through cross-layer feature fusion, which better extracts context information and global information. Fig. 8 shows a U-NET network based on this encoder-decoder architecture.
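The cross-layer feature fusion described above, concatenating an encoder feature map with an upsampled decoder feature map of matching spatial size, can be illustrated with plain NumPy in a channel-first (C, H, W) layout. A real implementation would use a deep-learning framework; this only shows the tensor bookkeeping:

```python
import numpy as np

def upsample2x(feat):
    """Nearest-neighbour 2x upsampling on the decoder path ((C, H, W) layout)."""
    return feat.repeat(2, axis=1).repeat(2, axis=2)

def fuse_skip(encoder_feat, decoder_feat):
    """U-NET-style cross-layer feature fusion: feature maps of equal spatial
    size are concatenated along the channel axis."""
    assert encoder_feat.shape[1:] == decoder_feat.shape[1:], "spatial sizes must match"
    return np.concatenate([encoder_feat, decoder_feat], axis=0)  # (C1+C2, H, W)
```

The fused map then passes through further convolution and activation layers on the decoder path.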
Once the lesion is segmented, the system switches automatically or manually to structured-light imaging mode, while the rear end of the illumination path is closed by a shielding lens to block the illumination light source. The structured light is projected onto the tissue surface through the structured light pathway, and the deformation of the structured-light image is collected by the endoscope camera and transmitted to the image processing center. There, the segmentation result is combined with the deformation of the structured-light image, the three-dimensional contour of the lesion is calculated by the phase-shift method, and the size of the lesion is given.
Optionally, in one embodiment, FIG. 9 shows the lesion size calculation model. P is the projector optical center and C is the camera optical center. O is the intersection of the camera optical axis and the projector optical axis. A horizontal plane passing through point O is taken as the reference X axis in the calculation. L1 and L2 are the distances from the camera optical center and the projector optical center, respectively, to the X axis. d is the distance from the camera optical center to the projector optical center along the X-axis direction. The A-B-O plane is an imaginary plane parallel to the line PC connecting the projector optical center and the camera optical center. For a point Q on the object surface, its height Z relative to the reference plane can be solved from the triangular relation:
Z = L·AB / (AB + d)
where L is the perpendicular distance from the line PC to the imaginary plane ABO, obtained from the similar triangles Q-A-B and Q-P-C.
Since the plane-to-plane projection is a linear mapping, the projected fringe pattern also has a fixed period p on the imaginary plane ABO, so AB and the phase difference Δφ between points A and B have a linear relationship: AB = (p/2π)·Δφ.
Then, the absolute phase φ_Q of point Q is calculated as follows.
When a sinusoidal fringe image is projected onto a three-dimensional diffuse reflective surface, the intensity at a point Q(x, y) observed by the camera can be represented as:
I(x, y) = A(x, y) + B(x, y)·cos φ(x, y)
where A(x, y) is the background light intensity, B(x, y)/A(x, y) represents the contrast of the grating fringes, and φ(x, y) is the phase value. For the N-step phase-shift method, the projection phase of each step of the sinusoidal grating differs from that of the previous step by 2π/N, i.e. the following equations:
I_i(x, y) = A(x, y) + B(x, y)·cos(φ(x, y) + 2π(i − 1)/N)
wherein i = 1, 2, …, N. This system contains N equations in three unknowns (A, B and φ) and can be solved when N ≥ 3, giving:
φ(x, y) = arctan[ −Σ_i I_i(x, y)·sin(2π(i − 1)/N) / Σ_i I_i(x, y)·cos(2π(i − 1)/N) ]
B(x, y) = (2/N)·√[ (Σ_i I_i(x, y)·sin(2π(i − 1)/N))² + (Σ_i I_i(x, y)·cos(2π(i − 1)/N))² ]
with the sums taken over i = 1, …, N.
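The N-step solution can be verified numerically. This sketch simulates a 4-step measurement of a known phase and recovers φ and B with the closed-form sums; the phase shift is taken as 2π(i − 1)/N, matching the equations above (a sign convention, so the arctangent uses the negated sine sum).

```python
import numpy as np

def retrieve_phase(intensities):
    """Recover wrapped phase and modulation from N phase-shifted images.

    intensities has shape (N, ...), with I_i = A + B*cos(phi + 2*pi*(i-1)/N).
    Returns (phi wrapped to (-pi, pi], B).
    """
    n = intensities.shape[0]
    delta = 2 * np.pi * np.arange(n) / n              # shifts 2*pi*(i-1)/N
    s = np.tensordot(np.sin(delta), intensities, axes=(0, 0))
    c = np.tensordot(np.cos(delta), intensities, axes=(0, 0))
    phi = np.arctan2(-s, c)                           # s = -B*sin(phi)*N/2
    b = 2.0 / n * np.sqrt(s ** 2 + c ** 2)            # c =  B*cos(phi)*N/2
    return phi, b

# simulate a 4-step measurement of a known phase at one pixel
a_bg, b_mod, phi_true = 0.5, 0.3, 1.2
n_steps = 4
imgs = np.array([a_bg + b_mod * np.cos(phi_true + 2 * np.pi * i / n_steps)
                 for i in range(n_steps)])
phi_est, b_est = retrieve_phase(imgs)                 # recovers 1.2 and 0.3
```

Because the sums over a full period cancel the background term A, the recovery is exact for noiseless data whenever N ≥ 3.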
By phase unwrapping, the absolute phase value Φ(x, y) of point Q can be obtained.
Substituting Φ(x, y), via the linear relationship above, into the height formula yields the height Z of point Q above the virtual plane ABO.
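A minimal sketch of this unwrap-and-substitute step, assuming the linear phase-to-displacement mapping AB = p·Φ/(2π) and the triangulation formula Z = L·AB/(AB + d); the period p and the d, L values are illustrative, not calibrated values from the patent.

```python
import numpy as np

def phase_to_height(phi_wrapped, period, d, L):
    """Unwrap a wrapped-phase profile, convert absolute phase to the
    fringe displacement AB on the virtual plane, then triangulate."""
    phi_abs = np.unwrap(phi_wrapped)        # absolute phase (adds 2*pi jumps)
    ab = period * phi_abs / (2 * np.pi)     # linear phase -> displacement
    return L * ab / (d + ab)                # Z = L*AB / (AB + d)

# a smoothly increasing true phase, observed only modulo 2*pi
phi_true = np.linspace(0.0, 3 * np.pi, 50)
wrapped = np.angle(np.exp(1j * phi_true))   # wrap into (-pi, pi]
z = phase_to_height(wrapped, period=2.0, d=50.0, L=100.0)
```

Unwrapping succeeds here because the per-sample phase step stays below π; a surface with steep discontinuities would need a more robust unwrapping strategy.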
By calibrating the camera parameters, the absolute real-world coordinates (X, Y, Z) of point Q can be computed from Z. Computing the absolute coordinates (X, Y, Z) of every point on the acquired structured-light image and combining them with the polyp segmentation result yields the three-dimensional size of the polyp.
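Once Z is known per pixel, absolute coordinates follow from standard pinhole back-projection with the calibrated intrinsics. This is a generic sketch with assumed intrinsic values (fx, fy, cx, cy), not the patent's specific calibration procedure.

```python
def backproject(u, v, z, fx, fy, cx, cy):
    """Pinhole back-projection: recover camera-frame (X, Y, Z) from a
    pixel (u, v) and its depth z, given intrinsics (focal lengths in
    pixels, principal point)."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return x, y, z

# size of a polyp spanning pixels (100, 120) to (140, 120) at depth 20 mm
fx = fy = 500.0            # assumed focal length in pixels
cx = cy = 128.0            # assumed principal point
x0, y0, _ = backproject(100, 120, 20.0, fx, fy, cx, cy)
x1, y1, _ = backproject(140, 120, 20.0, fx, fy, cx, cy)
width_mm = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5   # metric width
```

Applying this to every pixel inside the segmentation mask gives the metric extents of the lesion, which is how a pixel-level mask becomes a three-dimensional size estimate.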
All documents mentioned in this application are incorporated herein by reference in their entirety, as if each were individually incorporated by reference. Furthermore, it should be understood that, after reading the above disclosure, those skilled in the art may make various changes or modifications to the present application, and such equivalent forms likewise fall within the scope defined by the appended claims.

Claims (10)

1. A panoramic endoscope, characterized by comprising:
a main camera disposed directly in front of a tip of the endoscope; and
a plurality of auxiliary cameras disposed on a side wall of the tip;
wherein the field of view of the main camera and the fields of view of the plurality of auxiliary cameras together cover a spherical field of view.
2. The panoramic endoscope of claim 1, further comprising:
a main illumination lens disposed directly in front of the tip, connected to a light source through a light guide bundle and configured to provide illumination for the main camera; and
an auxiliary illumination lens disposed on the side wall of the tip, connected to the light source through a light guide bundle and configured to provide illumination for the auxiliary cameras.
3. The panoramic endoscope of claim 1, wherein the main camera has a higher pixel count than the auxiliary cameras.
4. The panoramic endoscope of claim 1, wherein the field of view angle of the spherical field of view is not less than 330 degrees.
5. A panoramic endoscope image processing method, characterized by comprising:
synchronously acquiring images from a main camera and a plurality of auxiliary cameras of a panoramic endoscope as recited in any one of claims 1 to 4;
performing polyp detection on the images synchronously acquired from the main camera and the plurality of auxiliary cameras, respectively; and
if a polyp is detected, issuing a polyp alarm and outputting prompt information indicating the camera from whose image the polyp was detected.
6. The panoramic endoscope image processing method of claim 5, wherein the step of performing polyp detection on the images synchronously acquired from the main camera and the plurality of auxiliary cameras comprises: performing center enhancement, denoising and color rendering to improve image quality and recognizability.
7. The panoramic endoscope image processing method of claim 5, wherein, when a polyp is detected, the alert mode comprises at least one of: highlighting, framing, circling or flashing on an external monitor screen, or an audio prompt.
8. The panoramic endoscope image processing method of claim 5, wherein the images acquired by the main camera and the auxiliary cameras are captured in a white light mode.
9. The panoramic endoscope image processing method of claim 5, wherein a captured image is input into a trained object detection model to detect in real time whether a polyp is present;
the object detection model comprises a first encoder and a detector;
the first encoder is configured to perform feature extraction on an image obtained in a white light mode to obtain a feature map;
the detector is configured to regress the input feature map to obtain coordinates of a location where a polyp is located.
10. The panoramic endoscope image processing method of claim 5, wherein the polyp detection is performed by a convolutional neural network, and the extracted target features comprise at least one of: blood vessel color features, duct features, and edge features.
CN202110970203.3A 2021-08-23 2021-08-23 Panoramic endoscope and image processing method thereof Pending CN115708658A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110970203.3A CN115708658A (en) 2021-08-23 2021-08-23 Panoramic endoscope and image processing method thereof
PCT/CN2022/102948 WO2023024701A1 (en) 2021-08-23 2022-06-30 Panoramic endoscope and image processing method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110970203.3A CN115708658A (en) 2021-08-23 2021-08-23 Panoramic endoscope and image processing method thereof

Publications (1)

Publication Number Publication Date
CN115708658A true CN115708658A (en) 2023-02-24

Family

ID=85230329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110970203.3A Pending CN115708658A (en) 2021-08-23 2021-08-23 Panoramic endoscope and image processing method thereof

Country Status (2)

Country Link
CN (1) CN115708658A (en)
WO (1) WO2023024701A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116935051A (en) * 2023-07-20 2023-10-24 深圳大学 Polyp segmentation network method, system, electronic equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010279539A (en) * 2009-06-04 2010-12-16 Fujifilm Corp Diagnosis supporting apparatus, method, and program
US9706908B2 (en) * 2010-10-28 2017-07-18 Endochoice, Inc. Image capture and video processing systems and methods for multiple viewing element endoscopes
JP6054874B2 (en) * 2010-12-09 2016-12-27 エンドチョイス イノベーション センター リミテッド Flexible electronic circuit board for multi-camera endoscope
WO2014061023A1 (en) * 2012-10-18 2014-04-24 Endochoice Innovation Center Ltd. Multi-camera endoscope
CN104856635A (en) * 2015-05-14 2015-08-26 珠海视新医用科技有限公司 Tip end structure of panoramic endoscope
WO2017044987A2 (en) * 2015-09-10 2017-03-16 Nitesh Ratnakar Novel 360-degree panoramic view formed for endoscope adapted thereto with multiple cameras, and applications thereof to reduce polyp miss rate and facilitate targeted polyp removal

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091366A (en) * 2023-04-07 2023-05-09 成都华域天府数字科技有限公司 Multi-dimensional shooting operation video and method for eliminating moire
CN116091366B (en) * 2023-04-07 2023-08-22 成都华域天府数字科技有限公司 Multi-dimensional shooting operation video and method for eliminating moire
CN117243700A (en) * 2023-11-20 2023-12-19 北京云力境安科技有限公司 Method and related device for detecting endoscope conveying length
CN117243700B (en) * 2023-11-20 2024-03-08 北京云力境安科技有限公司 Method and related device for detecting endoscope conveying length

Also Published As

Publication number Publication date
WO2023024701A1 (en) 2023-03-02

Similar Documents

Publication Publication Date Title
US7744528B2 (en) Methods and devices for endoscopic imaging
Lin et al. Dual-modality endoscopic probe for tissue surface shape reconstruction and hyperspectral imaging enabled by deep neural networks
WO2023024701A1 (en) Panoramic endoscope and image processing method thereof
JP5865606B2 (en) Endoscope apparatus and method for operating endoscope apparatus
JP6599317B2 (en) Imaging probe
JP2012511361A5 (en) Apparatus and image processing unit for improved infrared image processing and functional analysis of blood vessel structures such as blood vessels
CN113573654A (en) AI system for detecting and determining lesion size
US20040220478A1 (en) Method and devices for imaging and biopsy
JP2017534322A (en) Diagnostic mapping method and system for bladder
US11423318B2 (en) System and methods for aggregating features in video frames to improve accuracy of AI detection algorithms
JP2005514144A (en) Apparatus and method for spectroscopic examination of the colon
JP2010279539A (en) Diagnosis supporting apparatus, method, and program
CN112105284B (en) Image processing device, endoscope system, and image processing method
JP6132901B2 (en) Endoscope device
KR20140108066A (en) Endoscope system and control method thereof
US20100262000A1 (en) Methods and devices for endoscopic imaging
KR20160067869A (en) Optical speculum
JP2023087014A (en) Endoscope system and method for operating endoscope system
JPWO2014148184A1 (en) Endoscope system
CN109068035B (en) Intelligent micro-camera array endoscopic imaging system
US20030016856A1 (en) Method and apparatus for image processing and display
Bernal et al. Building up the future of colonoscopy–a synergy between clinicians and computer scientists
US20150073210A1 (en) Imaging system, method and distal attachment for multidirectional field of view endoscopy
WO2020008920A1 (en) Medical observation system, medical observation device, and medical observation device driving method
US20170055815A1 (en) Imaging system, method and distal attachment for multidirectional field of view endoscopy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination