CN111803013A - Endoscope imaging method and endoscope imaging system - Google Patents


Info

Publication number
CN111803013A
CN111803013A (application CN202010704320.0A)
Authority
CN
China
Prior art keywords
image
fluorescence
light
sensor
light source
Prior art date
Legal status
Pending
Application number
CN202010704320.0A
Other languages
Chinese (zh)
Inventor
陆汇海 (Lu Huihai)
Current Assignee
Shenzhen Bosheng Medical Technology Co ltd
Original Assignee
Shenzhen Bosheng Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Bosheng Medical Technology Co ltd filed Critical Shenzhen Bosheng Medical Technology Co ltd
Priority to CN202010704320.0A priority Critical patent/CN111803013A/en
Publication of CN111803013A publication Critical patent/CN111803013A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/043 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances for fluorescence imaging
    • A61B 1/00163 Optical arrangements
    • A61B 1/00186 Optical arrangements with imaging filters

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The invention discloses an endoscope imaging method and an endoscope imaging system. The endoscope imaging method comprises the following steps: illumination, visible light and fluorescence collection, image signal collection, and image processing. Visible light reflected by the human subject and excited fluorescence are collected by two sensors located at the front end of the endoscope tube to generate corresponding image signals, which are transmitted to the rear end for processing to obtain the final endoscope image. Compared with the prior art, in which visible light and fluorescence are relayed to the rear end through an optical path for collection and processing, this arrangement avoids the loss of accuracy caused by optical-path transmission, improves imaging accuracy, and saves the cost and space of the relay optics. In addition, because the visible light and the fluorescence travel along independent optical paths, they can be focused and collected separately, ensuring consistent sharpness between the visible-light image and the fluorescence image. The endoscope imaging system forms a binocular stereoscopic vision system, from which three-dimensional information can be obtained to compensate the fluorescence brightness of regions at different depths.

Description

Endoscope imaging method and endoscope imaging system
Technical Field
The invention relates to the technical field of medical instruments, in particular to an endoscope imaging method and an endoscope imaging system.
Background
In minimally invasive surgery, an excitation-light endoscope (fluorescence endoscope) can accurately locate cancer so that cancerous tissue can be resected more precisely. The main principle is as follows: after a fluorescent agent is sprayed onto or injected into the target site of a living body, excitation light from a light source device is irradiated onto the subject and the fluorescence emitted from the cancerous tissue is captured, enabling qualitative diagnosis of the presence of cancer, its degree of malignancy, and the like.
The prior art mainly uses single-camera time-division imaging or dual-camera beam-splitting imaging. The beam-splitting technique separates the white light and the fluorescence at the handle end and uses two or more sensors (CCD or CMOS) to capture the white-light image and the fluorescence monochrome image respectively. One implementation pairs a color sensor with a single fluorescence monochrome sensor; another uses three monochrome sensors for the R, G and B channels plus an additional fluorescence monochrome sensor.
The prior art requires sophisticated light-splitting and filtering techniques to prevent white light from contaminating the fluorescence monochrome image and excitation light from contaminating the white-light color image. Whether time-division or beam-splitting is used, the visible light, the returned fluorescence and the excitation light all share one set of front-end optics; because visible light and fluorescence have different wavelengths and therefore different focal lengths, the sharpness of the visible-light image and of the fluorescence monochrome image cannot both be maintained.
Disclosure of Invention
The invention provides an endoscope imaging method and an endoscope imaging system with a simple optical path and clear imaging.
According to a first aspect, there is provided in one embodiment an endoscopic imaging method comprising the steps of:
controlling the first light source and the second light source to emit white light and excitation light respectively, the white light and the excitation light being fused into mixed light that irradiates the human subject;
acquiring visible light reflected by the human subject through a first sensor located at the front end of the endoscope, the first sensor converting the visible light into a first image signal; meanwhile, acquiring fluorescence excited from the human subject through a second sensor located at the front end of the endoscope, the second sensor converting the fluorescence into a second image signal;
and acquiring the first image signal and the second image signal, converting the first image signal into a white light color image and the second image signal into a fluorescent monochrome image, and performing image processing on the white light color image and the fluorescent monochrome image to output an endoscope image.
Further, the image processing comprises the following steps:
converting the white light color image into an RGB image;
converting the RGB image into a gray image;
carrying out image normalization on the gray information of the fluorescent monochrome image and the gray image, and then obtaining, with a stereo matching algorithm, a stereo disparity map matching the fluorescent monochrome image to the white light color image;
reconstructing the fluorescent monochromatic image into a coordinate system of the white light color image according to the stereo disparity map to obtain a reconstructed fluorescent monochromatic image;
and superposing the reconstructed fluorescence monochromatic image on the RGB image to form the endoscope image.
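As a minimal illustration of the final superposition step, the following numpy sketch pseudo-colors the reconstructed fluorescence monochrome image and blends it over the RGB image. The green tint, the alpha weighting, and the function name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def overlay_fluorescence(rgb, fluor, tint=(0, 255, 0), alpha=0.8):
    """Superimpose a fluorescence monochrome image (0..255) on an RGB image.

    Pixels with strong fluorescence are blended toward `tint`; pixels with
    no fluorescence keep the original white-light color.
    """
    rgb = rgb.astype(np.float64)
    w = (np.asarray(fluor, np.float64) / 255.0)[..., None] * alpha
    t = np.asarray(tint, np.float64)
    out = rgb * (1.0 - w) + t * w
    return np.clip(out, 0, 255).astype(np.uint8)
```

In practice the blend weight and tint would be tuned so that the pseudo-colored fluorescence remains visible over bright tissue without hiding anatomical detail.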
Further, in the image processing step, after the stereo disparity map is computed, the distance between the object in the human body and the distal end face of the camera is calculated from the disparity map to obtain depth information for the different fluorescence regions, and the fluorescence brightness of the more distant fluorescence regions is then compensated.
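Converting a disparity map into distance follows the standard pinhole stereo relation Z = f·B/d (focal length in pixels, baseline between the two front-end assemblies, disparity in pixels). The sketch below is standard stereo geometry, not text from the patent; the parameter names are assumptions.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    # Pinhole stereo: Z = f * B / d. Zero or negative disparity carries
    # no depth estimate and is mapped to infinity.
    d = np.asarray(disparity_px, dtype=np.float64)
    z = np.full(d.shape, np.inf)
    valid = d > 0
    z[valid] = focal_px * baseline_mm / d[valid]
    return z
```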
Further, the compensation of the fluorescence brightness comprises the following steps:
calculating the average fluorescence intensity of each fluorescence region and finding the fluorescence region with the highest average;
and multiplying each of the remaining fluorescence regions by a corresponding gain coefficient so that its fluorescence brightness matches that of the region with the highest average.
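The two steps above (find the brightest region's mean, then scale every other region to match it) can be sketched as follows; representing each segmented region as a flat array of intensities is an assumption for illustration.

```python
import numpy as np

def compensate_regions(regions):
    """Scale each fluorescence region so its mean matches the brightest one.

    `regions` is a list of arrays of fluorescence intensities, one per
    segmented region; the gain for region i is max_mean / mean_i.
    """
    means = [float(np.mean(r)) for r in regions]
    target = max(means)
    return [np.asarray(r, np.float64) * (target / m)
            for r, m in zip(regions, means)]
```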
Further, the compensation of the fluorescence brightness comprises the following steps:
designing in advance a fluorescence compensation gain curve as a function of the distance between the subject and the distal end face of the camera;
and after obtaining depth information for the different fluorescence regions, performing fluorescence compensation on each region according to the compensation gain curve.
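A pre-designed gain curve of this kind can be stored as sampled (distance, gain) pairs and interpolated per region. The sample values below are purely illustrative assumptions; a real curve would be calibrated to how captured fluorescence falls off with distance for the specific optics.

```python
import numpy as np

# Hypothetical sampled compensation curve: gain rises with distance
# because less fluorescence reaches the sensor from farther regions.
CURVE_DIST_MM = np.array([10.0, 20.0, 40.0, 80.0])
CURVE_GAIN = np.array([1.0, 1.5, 2.5, 4.0])

def gain_for_distance(distance_mm):
    # Piecewise-linear interpolation of the compensation gain curve.
    return np.interp(distance_mm, CURVE_DIST_MM, CURVE_GAIN)
```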
According to a second aspect, there is provided in an embodiment an endoscopic imaging system comprising:
a handle having a threading channel;
the lens tube is connected with the front end of the handle;
the light source assembly comprises a first light source, a second light source and a light guide beam, wherein the first light source is used for emitting white light, the second light source is used for emitting exciting light, one end of the light guide beam penetrates through the lens tube and extends to the front end in the lens tube, and the other end of the light guide beam is connected with the first light source and the second light source;
the first front end assembly is arranged at the front end in the lens tube and comprises a first lens group and a first sensor, the first lens group and the first sensor are sequentially arranged from front to back, the first lens group is used for acquiring visible light reflected by a human body object, and the first sensor is used for acquiring the visible light filtered by the first lens group and generating a first image signal;
the second front end component and the first front end component are arranged at the front end in the lens tube side by side, the second front end component comprises a second lens group and a second sensor which are sequentially arranged from front to back, the second lens group is used for acquiring fluorescence excited by a human body object, and the second sensor is used for acquiring the fluorescence filtered by the second lens group and generating a second image signal;
the control device is connected with the first light source, the second light source, the first sensor and the second sensor; the control device is used for controlling the first light source to emit white light and the second light source to emit excitation light, collecting the first image signal generated by the first sensor and the second image signal generated by the second sensor, converting the first image signal into a white light color image and the second image signal into a fluorescent monochrome image, and performing image processing on the two images to output the endoscope image.
Further, the control device comprises a controller, an image acquisition module and an image processing module. The controller controls the first light source to emit white light and the second light source to emit excitation light; the image acquisition module collects the first image signal generated by the first sensor and the second image signal generated by the second sensor; the image processing module converts the first image signal into a white light color image and the second image signal into a fluorescent monochrome image, and then performs image processing on the two images to output the endoscope image.
Furthermore, the control device further comprises a sensor driving module, the input end of the sensor driving module is connected with the controller, the output end of the sensor driving module is respectively connected with the first sensor and the second sensor, and the sensor driving module is used for outputting the sensor setting information obtained by calculation of the controller to the first sensor and the second sensor.
Further, the light source assembly further comprises a light source control module, an input end of the light source control module is connected with the controller, an output end of the light source control module is connected with the first light source and the second light source, and the light source control module is used for receiving a control signal of the controller and controlling light intensity of light emitted by the first light source and the second light source.
Furthermore, the first lens group comprises a fluorescence cut-off filter lens, and the second lens group comprises a visible light cut-off filter lens.
According to the endoscope imaging method and endoscope imaging system of the above embodiments, visible light reflected by the human subject and excited fluorescence are collected by two sensors located at the front end of the endoscope tube to generate corresponding image signals, which are then transmitted to the rear end and processed to obtain the final endoscope image. Compared with the prior art, in which visible light and fluorescence are relayed through an optical path from the front end of the endoscope tube to the rear end for collection and processing, this arrangement avoids the loss of accuracy caused by optical-path transmission, improves imaging accuracy, and saves the cost and space of the relay optics. In addition, because the visible light and the fluorescence travel along independent optical paths, they can be focused and collected separately, ensuring consistent sharpness between the visible-light image and the fluorescence image. The endoscope imaging system forms a binocular stereoscopic vision system, from which three-dimensional information can be obtained to compensate the fluorescence brightness of regions at different depths.
Drawings
FIG. 1 is a schematic diagram of an endoscopic imaging system in one embodiment;
FIG. 2 is a flow chart of a method of endoscopic imaging in one embodiment;
FIG. 3 is a flow diagram of image processing for an endoscopic imaging method in one embodiment.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings, in which like elements in different embodiments are given like reference numerals. In the following description, numerous details are set forth to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of these features may be omitted, or replaced with other elements, materials or methods, in different instances. Certain operations related to the present application are not shown or described in detail, to avoid obscuring the core of the application with excessive description; a detailed account of such operations is unnecessary, since those skilled in the art can fully understand them from the specification and general knowledge in the art.
Furthermore, the features, operations or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Likewise, the steps or actions in the method descriptions may be reordered or combined in ways apparent to one of ordinary skill in the art. The sequences in the specification and drawings therefore describe particular embodiments only and do not imply a required order, unless an order is explicitly stated to be necessary.
Ordinal labels such as "first" and "second" are used herein only to distinguish the objects described and carry no sequential or technical meaning. The terms "connected" and "coupled", unless otherwise indicated, include both direct and indirect connection (coupling). In this text, the front end is the end close to the human subject and the rear end is the end far from it.
Example one:
the embodiment provides an endoscope imaging system which is a binocular endoscope and is mainly used for detecting canceration of a human body.
As shown in FIG. 1, the present endoscopic imaging system generally includes a handle 10, a scope tube 20, a light source assembly 30, a first front end assembly 40, a second front end assembly 50, and a control device 60, and other portions of the endoscopic imaging system are not referred to in this application and will not be described in detail.
The scope tube 20 is a rigid or flexible endoscope tube. The rear end of the scope tube 20 is connected to the front end of the handle 10, and a threading channel communicating with the scope tube 20 is provided in the handle 10. The handle 10 is held by the physician during operation, and the front end of the scope tube 20 extends into the human body.
The light source assembly 30 includes a first light source 31, a second light source 32 and a light guide beam 33. The first light source 31 and the second light source 32 are located behind the handle 10, for example installed in the control device 60 or in a separate standalone device. The two ends of the light guide beam 33 are an input end and an output end, each of which splits into two branch beams: the two branches at the input end connect to the first light source 31 and the second light source 32 respectively, while the two branches at the output end pass through the handle 10, extend to the front position in the scope tube 20, and are arranged symmetrically. The first light source 31 is configured to emit white light (visible light), the second light source 32 is configured to emit excitation light, and the white light from the first light source 31 and the excitation light from the second light source 32 are fused into mixed light in the light guide beam 33.
A fusion light path 34 can also be provided in front of the first light source 31 and the second light source 32. The fusion light path 34 has two input ends and one output end, and contains refractive elements and other structures inside so that the two light paths are merged into a single output path. The output end of the fusion light path 34 is connected with the input end of the light guide beam 33, and the two input ends of the fusion light path 34 are aligned with the exit paths of the first light source 31 and the second light source 32; the fusion light path 34 thus fuses the white light emitted by the first light source 31 and the excitation light emitted by the second light source 32 into mixed light, which then irradiates the human subject. The output end of the light guide beam 33 is split into two branch beams so that the mixed light irradiates the human subject uniformly.
The first front end assembly 40 and the second front end assembly 50 are mounted side by side at the front position in the scope tube 20, located centrally within the tube, with the output ends of the light guide beam 33 at the edge. The first front end assembly 40 collects visible light reflected by the human subject, and the second front end assembly 50 collects fluorescence excited from the human subject.
The first front end assembly 40 includes a first lens group 41 and a first sensor 42, arranged in order from front to rear and aligned on the optical axis. The first lens group 41 has a fluorescence cut-off filter 411 at its rear end; the filter 411 removes the fluorescence from the mixed light and passes the filtered visible light (white light) to the first sensor 42. The first sensor 42 is a color sensor that acquires the visible light and generates the corresponding first image signal.
The second front end assembly 50 includes a second lens group 51 and a second sensor 52, arranged in order from front to rear and aligned on the optical axis. The second lens group 51 has a visible light cut-off filter 511 at its rear end; the filter 511 removes the visible light from the mixed light and passes the filtered fluorescence to the second sensor 52. The second sensor 52 is a monochrome sensor that acquires the fluorescence and generates the corresponding second image signal. The first lens group 41 and the second lens group 51 are focusing lens groups configured to focus in the visible-light and fluorescence bands respectively; apart from the filters, their lenses may be identical. In other embodiments, the fluorescence cut-off filter 411 and the visible light cut-off filter 511 are filter films coated on the front or rear lens of the respective lens group.
In this embodiment, the control device 60 is installed in the apparatus behind the handle 10, the control device 60 is a main body part of the endoscopic imaging system, and the control device 60 has control and processing functions. The control device 60 includes a controller 61, an image capturing module 62 and an image processing module 63, and the controller 61 is connected to the image capturing module 62, the image processing module 63, the first light source 31, the second light source 32, the first sensor 42 and the second sensor 52, respectively. The controller 61 is a control center for controlling the entire endoscopic imaging work.
The light source assembly 30 further includes a light source control module 35; its input end is connected to the controller 61, and its two output ends are connected to the first light source 31 and the second light source 32 respectively. The light source control module 35 receives control signals from the controller 61 and regulates the intensity of the light emitted by the first light source 31 and the second light source 32.
The control device 60 further comprises a sensor drive module 64. Its input end is connected with the controller 61; one output end is connected with the first sensor 42 in the scope tube 20 by a cable 43 through the handle 10, and the other output end is connected with the second sensor 52 in the scope tube 20 by a cable 53 through the handle 10. The sensor drive module 64 outputs the sensor setting information calculated by the controller 61 to the first sensor 42 and the second sensor 52.
The control device 60 further comprises an input device 65, such as a keyboard or a touch screen. The input device 65 is connected with the controller 61 and is used for inputting operation instructions and parameters to the controller 61; the controller 61 responds to the input instructions to control the other components.
The input end of the image acquisition module 62 is connected with the first sensor 42 in the scope tube 20 by a cable 44 through the handle 10, over which it acquires the first image signal generated by the first sensor 42; it is likewise connected with the second sensor 52 by a cable 54, over which it acquires the second image signal generated by the second sensor 52. The output end of the image acquisition module 62 is connected to the image processing module 63, which obtains the first and second image signals from the image acquisition module 62, converts the first image signal into a white light color image and the second image signal into a fluorescent monochrome image, and performs image processing on the two images to output the endoscope image.
The image processing module 63 is connected to the display 70, and the image processing module 63 outputs the endoscope image generated by calculation to the display 70 for display.
In the endoscopic imaging system of this embodiment, the first front end assembly 40 and the second front end assembly 50 are mounted at the front end of the scope tube 20, where they collect visible light and fluorescence respectively, convert them into the corresponding image signals, and transmit the signals to the image processing module 63 behind the handle 10 for processing. Compared with the prior art, in which visible light and fluorescence are relayed through an optical path from the front end of the endoscope tube to the rear end for collection and processing, this arrangement avoids the loss of accuracy caused by optical-path transmission, improves imaging accuracy, and saves the cost and space of the relay optics. In addition, because the visible light and the fluorescence travel along independent optical paths, they can be focused and collected separately, ensuring consistent sharpness between the visible-light image and the fluorescence image. The system also forms a binocular stereoscopic vision system, from which three-dimensional information can be obtained to compensate the fluorescence brightness of regions at different depths.
Example two:
the present embodiment provides an endoscopic imaging method, which is implemented based on the endoscopic imaging system in the above embodiments.
As shown in fig. 2, the endoscopic imaging method of the present embodiment includes the steps of:
s10, illumination;
the light source control module 35 controls the first light source 31 to emit white light and the second light source 32 to emit excitation light; the white light and the excitation light are fused into mixed light in the fusion light path and irradiated through the light guide beam 33 onto the tissue (subject) in the human body; the tissue reflects the white light (visible light), and cancerous regions, when irradiated by the excitation light, are excited and emit fluorescence;
s20, collecting visible light and fluorescence;
visible light is collected by a first front end assembly 40 located at the front end within the tube 20, and fluorescence is collected by a second front end assembly 50 located at the front end within the tube 20.
The first lens group 41 transmits the incoming mixture of visible light reflected by the human subject and excited fluorescence to the fluorescence cut-off filter 411; the filter 411 blocks the fluorescence and passes the visible light to the first sensor 42, which converts it into the first image signal.
Meanwhile, the second lens group 51 transmits the same mixture of reflected visible light and excited fluorescence to the visible light cut-off filter 511; the filter 511 blocks the visible light and passes the fluorescence to the second sensor 52, which converts it into the second image signal.
S30, collecting image signals;
the first image signal and the second image signal are acquired by the image acquisition module 62, and are transmitted to the image processing module 63.
And S40, image processing.
After acquiring the first image signal and the second image signal, the image processing module 63 converts the first image signal into a white light color image and the second image signal into a fluorescent monochrome image, and performs image processing on the two images to output the endoscope image.
As shown in fig. 3, the image processing of the white light color image and the fluorescent monochromatic image specifically includes the following sub-steps:
s41, converting the white light color image into an RGB image;
Because the white light color image is a Bayer-pattern image, its white balance is adjusted first; after white-balance adjustment it is converted into an RGB image, and the RGB image is then color-corrected.
White-balance adjustment scales the ratios of the three color channels R, G and B so that the R, G and B component values of white or gray content in the image are equal.
After the white balance of the white light color image has been adjusted, the missing R, G, B component values at each pixel are filled in by an interpolation algorithm: G and B components are interpolated at the R pixels of the Bayer pattern, R and B components at the G pixels, and R and G components at the B pixels. Finally, a three-channel RGB color image is output.
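The interpolation step can be illustrated with a simple bilinear demosaic of an RGGB Bayer mosaic. This is a textbook sketch, not the patent's algorithm; real pipelines typically use edge-aware interpolation. It assumes even image dimensions so every 3×3 neighborhood contains samples of each color.

```python
import numpy as np

def _conv3(a, k):
    # 3x3 'same' convolution with zero padding, pure numpy.
    p = np.pad(a, 1)
    h, w = a.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + h, j:j + w]
    return out

def demosaic_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic into an H x W x 3 image.

    Each missing component is the weighted average of the neighboring
    samples of that color, matching the R/G/B interpolation step above.
    """
    h, w = raw.shape
    masks = np.zeros((h, w, 3))
    masks[0::2, 0::2, 0] = 1  # R sites
    masks[0::2, 1::2, 1] = 1  # G sites on R rows
    masks[1::2, 0::2, 1] = 1  # G sites on B rows
    masks[1::2, 1::2, 2] = 1  # B sites
    k = np.array([[1.0, 2.0, 1.0], [2.0, 4.0, 2.0], [1.0, 2.0, 1.0]])
    out = np.empty((h, w, 3))
    for c in range(3):
        out[..., c] = _conv3(masks[..., c] * raw, k) / _conv3(masks[..., c], k)
    return out
```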
In the color correction process, the color reproduction of the image is corrected with a 3×3 color correction matrix M:

    [R_out]       [R_in]
    [G_out] = M · [G_in]
    [B_out]       [B_in]

where M is the color correction matrix, (R_in, G_in, B_in)ᵀ is the input pixel and (R_out, G_out, B_out)ᵀ is the corrected output.
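Applying the color correction matrix amounts to one matrix–vector product per pixel. A minimal sketch, with a made-up matrix M whose rows sum to 1 so neutral grays are preserved (a real M would be calibrated against a color chart for the sensor):

```python
import numpy as np

# Illustrative 3x3 color correction matrix (rows sum to 1 so neutral
# grays are unchanged); a real M is calibrated for the sensor.
M = np.array([[ 1.20, -0.15, -0.05],
              [-0.10,  1.25, -0.15],
              [-0.05, -0.20,  1.25]])

def color_correct(rgb, M):
    """Apply [R' G' B']^T = M [R G B]^T to every pixel of an HxWx3 image."""
    return rgb @ M.T

corrected = color_correct(np.full((2, 2, 3), 0.5), M)  # neutral gray stays 0.5
```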
S42, converting the RGB image into a grayscale image;
The color-corrected RGB image is converted into a grayscale image, where the gray value can be obtained in one of the following ways:
The first method: take the average of R, G, B, i.e. (R + G + B)/3;
The second method: use the weighting specified in ITU-R BT.709, Y = 0.2126 R + 0.7152 G + 0.0722 B;
The third method: take the L channel of the L*a*b* color space.
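The first two conversions can be sketched directly; the function and parameter names are illustrative:

```python
import numpy as np

def to_gray(rgb, method="bt709"):
    """Grayscale conversion: plain channel average, or the ITU-R BT.709
    luma weighting Y = 0.2126 R + 0.7152 G + 0.0722 B."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    if method == "mean":
        return (r + g + b) / 3.0
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

g709 = to_gray(np.array([[[0.0, 1.0, 0.0]]]))        # pure green pixel
gmean = to_gray(np.ones((1, 1, 3)), method="mean")   # white pixel
```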
S43, normalizing the image;
Image normalization is applied to the gray information of the fluorescent monochrome image and the grayscale image.
The fluorescent monochrome image is preprocessed before normalization; the preprocessing includes operations such as denoising, smoothing, region segmentation and morphology. The key step is region segmentation, which extracts the fluorescence regions from the background region.
During image normalization, the fluorescence regions extracted by segmentation and the grayscale image are used together.
The image normalization can be computed in two ways:
The first method:
Let U = {U1, U2, …, Un} be the fluorescence regions segmented on the fluorescence image, with n the number of fluorescence regions. The mean of the fluorescence image is

    Ū = (1/n) · Σᵢ₌₁ⁿ Ūᵢ

where Ūᵢ is the area average of the i-th fluorescence region. Let Ḡ be the average of the grayscale image, and set

    W = Ḡ / Ū.

The normalized fluorescence image is then Ut = {W·U1, W·U2, …, W·Un}.
The second method:
Similar to the first method; the difference is how the grayscale average is computed. Let X_U be the set of coordinate points corresponding to U, and N_U the sum of the areas of all fluorescence regions (i.e. the total number of pixels corresponding to U). The average of the grayscale image is then calculated as

    Ḡ = (1/N_U) · Σ_{x ∈ X_U} G(x).
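The first normalization method can be sketched as follows, assuming the segmented fluorescence regions are given as boolean masks (the function and variable names are illustrative):

```python
import numpy as np

def normalize_fluorescence(fluo, gray, region_masks):
    """First normalization method, sketched: scale the segmented
    fluorescence regions by W = (grayscale mean) / (mean of the
    per-region averages)."""
    region_means = [fluo[m].mean() for m in region_masks]  # area average per region
    w = gray.mean() / np.mean(region_means)                # global gain W
    out = fluo.copy()
    for m in region_masks:
        out[m] = w * fluo[m]
    return out

fluo = np.array([[2.0, 2.0], [0.0, 0.0]])   # one bright fluorescence region
gray = np.full((2, 2), 4.0)                 # grayscale image mean = 4
out = normalize_fluorescence(fluo, gray, [fluo > 0])
```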
S44, calculating a stereo disparity map;
A stereo matching algorithm is used to obtain the stereo disparity map that matches the fluorescent monochrome image to the white light color image.
The binocular camera module must first be stereo-rectified (binocular correction) before the following algorithms can be used. After rectification, the left and right images of the photographed subject lie at the same horizontal position, i.e. share the same y coordinate value, so the matching algorithm only needs to search horizontally, which reduces the amount of computation. Binocular rectification is a mature algorithm and is not described in detail here. During binocular calibration, the visible light cut filter in the fluorescence camera can temporarily be replaced with a fluorescence cut filter so that white light images can be used for calibration.
Let the first lens (white light camera) be the left camera and the second lens (fluorescence camera) be the right camera. The stereo matching algorithm has the following two modes:
The first method: simple block matching algorithm
The following operation is performed for every pixel in the fluorescence regions:
Let Xi be a pixel coordinate in the current fluorescence image, and take its neighboring 3×3 or 5×5 region, denoted Yi. On the grayscale image, starting from coordinate Xi, search to the right for the gray block most similar to Yi (block matching) and record the coordinate Xm of the matching block on the grayscale image. Then dXi = Xm − Xi is the disparity corresponding to fluorescence pixel Xi.
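The simple block matching of the first method might look like this sketch: a sum-of-absolute-differences (SAD) cost over a small patch, searched rightward along the rectified row. The function name, the SAD cost and the search limit are assumptions; the patent only specifies "most similar gray block":

```python
import numpy as np

def block_match_disparity(fluo, gray, x, y, half=1, max_d=16):
    """For fluorescence pixel (y, x), take its (2*half+1)^2 patch Yi and
    search rightward along the same rectified row of the grayscale image
    for the most similar patch (SAD cost); return dXi = Xm - Xi."""
    patch = fluo[y - half:y + half + 1, x - half:x + half + 1]
    best_d, best_cost = 0, np.inf
    for d in range(max_d):
        xm = x + d
        if xm + half >= gray.shape[1]:
            break
        cand = gray[y - half:y + half + 1, xm - half:xm + half + 1]
        cost = np.abs(patch - cand).sum()  # sum of absolute differences
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

fluo = np.zeros((5, 12)); fluo[2, 3] = 1.0   # one bright fluorescence pixel
gray = np.zeros((5, 12)); gray[2, 6] = 1.0   # same feature, shifted right by 3
d = block_match_disparity(fluo, gray, x=3, y=2)
```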
The second method: SGM (Semi-Global Matching)
Although the first method is simple and fast, it produces more noise and less accurate disparity values. In practical applications, the SGM algorithm is used more often.
Ref: H. Hirschmuller, "Stereo Processing by Semiglobal Matching and Mutual Information," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 2, pp. 328-341, Feb. 2008, doi: 10.1109/TPAMI.2007.1166.
The purpose of both methods is to obtain the disparity value of each fluorescence pixel with respect to the grayscale image.
S45, reconstructing the fluorescent monochrome image;
According to the stereo disparity map, the fluorescent monochrome image is reconstructed into the coordinate system of the white light color image, yielding the reconstructed fluorescent monochrome image.
The reconstruction can be achieved in two ways:
The first method: pixel-based
For any fluorescence pixel Xi, obtain its corresponding disparity dXi; its coordinate on the grayscale (white light) map is Xm = Xi + dXi.
The second method: region-based
For any fluorescence region Ui, compute the average disparity over all of its pixels,

    d̄ᵢ = (1/|Uᵢ|) · Σ_{Xj ∈ Uᵢ} dXj,

then shift all pixels of Ui to the right by d̄ᵢ, which reconstructs Ui into the coordinate system of the white light map.
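The region-based reconstruction can be sketched as a rightward shift of each region by its mean disparity; the names, the rounding of the mean disparity and the clipping at the image border are illustrative choices:

```python
import numpy as np

def reconstruct_region(fluo, mask, disparity):
    """Region-based reconstruction, sketched: shift all pixels of one
    fluorescence region right by the region's mean disparity, mapping
    it into the white-light image's coordinate system."""
    d = int(round(disparity[mask].mean()))       # mean disparity of the region
    out = np.zeros_like(fluo)
    ys, xs = np.nonzero(mask)
    out[ys, np.clip(xs + d, 0, fluo.shape[1] - 1)] = fluo[ys, xs]
    return out

fluo = np.zeros((3, 6)); fluo[1, 1] = 0.8        # one-pixel fluorescence region
disp = np.full((3, 6), 2.0)                      # uniform disparity of 2 px
out = reconstruct_region(fluo, fluo > 0, disp)   # region moves 2 px right
```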
S46, image superposition;
The reconstructed fluorescent monochrome image is superposed on the color-corrected RGB image to form the endoscope image.
During superposition the RGB image also undergoes image enhancement and image parameter adjustment. The enhancement includes common processing such as sharpening, denoising and edge enhancement; the parameter adjustment covers parameters such as brightness, contrast and saturation. The enhanced and adjusted RGB image has higher image quality, so the subsequent superposition yields a clearer endoscope image.
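One way to sketch the superposition is an alpha blend of the reconstructed fluorescence image as a pseudo-color (here green) layer over the corrected RGB image; the color, the alpha value and the blending formula are illustrative assumptions, since the patent does not specify the overlay operation:

```python
import numpy as np

def overlay_fluorescence(rgb, fluo, color=(0.0, 1.0, 0.0), alpha=0.6):
    """Blend the reconstructed fluorescence image onto the corrected RGB
    image as a pseudo-color layer; green and alpha=0.6 are illustrative."""
    f = np.clip(fluo, 0.0, 1.0)[..., None]       # H x W x 1 blend weight
    layer = np.asarray(color) * f                # pseudo-color layer
    return np.clip(rgb * (1.0 - alpha * f) + alpha * layer, 0.0, 1.0)

rgb = np.zeros((2, 2, 3))                        # dark white-light image
out = overlay_fluorescence(rgb, np.ones((2, 2)))
```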
In the endoscope imaging method provided by this embodiment, visible light and fluorescence are collected by the binocular front end and each can be focused and collected separately, so the sharpness of the visible light image is consistent with that of the fluorescence image. Stereo disparity is computed between the fluorescence image and the white light image, and the fluorescence image is reconstructed into the white light coordinate system, so a clear and accurate image is obtained.
In another embodiment, to further improve the imaging effect, fluorescence intensity compensation is also performed after step S44. The lens collects fluorescence from distant fluorescence regions more weakly, so distant regions appear dimmer after imaging; compensating the fluorescence intensity makes fluorescence regions at different distances from the lens show the same fluorescence intensity.
The principle of fluorescence intensity compensation is as follows: the distance between the photographed subject in the human body and the camera end face is calculated from the stereo disparity map, giving the depth information of the different fluorescence regions, and the fluorescence brightness of distant fluorescence regions is then compensated.
The depth information is calculated as follows: let Xi be a pixel coordinate in the current fluorescence image with corresponding disparity dXi. The depth value is

    Z = f · T / dXi

where f is the camera focal length and T is the distance between the centers of the left and right cameras (the baseline); both values are obtained from the binocular rectification result.
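The depth formula Z = f·T/dXi is a one-liner; the focal length and baseline values below are illustrative, not calibration data:

```python
def depth_from_disparity(d_pixels, f_pixels, baseline):
    """Z = f * T / d for a rectified pair: f is the focal length in
    pixels and T the camera baseline (same unit as the returned depth),
    both taken from the binocular rectification result."""
    return f_pixels * baseline / d_pixels

z_mm = depth_from_disparity(8.0, f_pixels=800.0, baseline=4.0)  # 400.0 mm
```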
The fluorescence intensity compensation specifically includes the following two modes:
the first method is as follows:
calculating the average value of the fluorescence light of each fluorescence area, and finding out the fluorescence area with the highest average value;
and multiplying the other fluorescence areas except the fluorescence area with the highest average value by a corresponding gain coefficient respectively to ensure that the fluorescence brightness of the other fluorescence areas is consistent with that of the fluorescence area with the highest average value.
The second method comprises the following steps:
designing and calculating a fluorescence compensation gain curve according to the distance between a human body object and the end face of the camera in advance;
and after depth information of different fluorescence areas is obtained, fluorescence compensation is carried out on each area according to the compensation gain curve.
Wherein the compensation curve is calculated as follows:
A fluorescence target is used, and the fluorescence camera collects fluorescence images at distances of 1 cm, 2 cm, …, 20 cm from the target. The central 32×32 image region is taken and its average calculated. With distance as the abscissa and fluorescence value as the ordinate, the attenuation curve of fluorescence intensity versus distance is plotted. Assuming 5 cm (or any distance deemed appropriate) is the optimal fluorescence distance, the attenuation curve is normalized by the fluorescence value at 5 cm, giving the compensation value of the fluorescence intensity as a function of distance. One implementation: let y_b be the optimal fluorescence value; the compensation curve is b_i = y_b / y_i, i = {1, 2, …, 20}, where y_i is the fluorescence value measured at distance i cm.
The fluorescence target is a solid-color, uniform target; to reduce noise interference, several images can be collected consecutively at each distance and averaged to obtain a mean image.
The compensation curve is discrete; in application, an interpolation algorithm can be used to obtain the compensation coefficient at the current distance.
The specific method for applying the compensation curve is as follows:
1) calculate the mean depth value of the pixels of any fluorescence region in the fluorescence image;
2) use an interpolation algorithm to obtain the corresponding compensation coefficient α from the fluorescence depth compensation curve;
3) multiply each fluorescence pixel in the current region by the compensation coefficient α.
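The curve construction (b_i = y_b / y_i) and its interpolation at a region's mean depth can be sketched as follows; the distances and measured values are made-up numbers, not calibration data:

```python
import numpy as np

# Made-up calibration data: target distances and measured fluorescence means.
distances_cm = np.array([2.0, 5.0, 10.0, 20.0])
fluo_values = np.array([40.0, 20.0, 10.0, 5.0])

def compensation_curve(distances, values, best_cm=5.0):
    """b_i = y_b / y_i: normalise the measured attenuation curve by the
    value at the chosen optimal distance (5 cm here, as in the text)."""
    yb = np.interp(best_cm, distances, values)   # y_b at the optimal distance
    return yb / values

def gain_at(depth_cm, distances, curve):
    """Interpolate the discrete compensation curve at a region's mean depth."""
    return np.interp(depth_cm, distances, curve)

curve = compensation_curve(distances_cm, fluo_values)
g = gain_at(10.0, distances_cm, curve)   # a region twice as far gets gain 2
```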
The present invention has been described above with reference to specific examples, which are intended only to aid understanding of the invention and not to limit it. A person skilled in the art may make several simple deductions, modifications or substitutions based on the idea of the invention.

Claims (10)

1. An endoscopic imaging method, comprising the steps of:
controlling a first light source and a second light source to respectively emit white light and excitation light, wherein the white light and the excitation light are fused to form mixed light to irradiate a human body;
acquiring visible light reflected by a human body through a first sensor positioned at the front end of an endoscope, wherein the first sensor converts the visible light into a first image signal; meanwhile, fluorescence generated by the excited human body is obtained through a second sensor positioned at the front end of the endoscope, and the second sensor converts the fluorescence into a second image signal;
acquiring a first image signal and a second image signal, converting the first image signal into a white light color image, converting the second image signal into a fluorescent monochrome image, and performing image processing on the white light color image and the fluorescent monochrome image to output an endoscope image.
2. An endoscopic imaging method as defined in claim 1, wherein the image processing comprises the steps of:
converting the white light color image into an RGB image;
converting the RGB image into a gray level image;
carrying out image normalization processing on the gray information of the fluorescent monochrome image and the gray image, and then obtaining a stereo parallax image of the fluorescent monochrome image matched with the white light color image by using a stereo matching algorithm;
reconstructing the fluorescent monochromatic image into a coordinate system of the white light color image to obtain a reconstructed fluorescent monochromatic image according to the stereo disparity map;
and superposing the reconstructed fluorescence monochromatic image on the RGB image to form an endoscope image.
3. An endoscopic imaging method according to claim 2, wherein in the image processing step, after the stereoscopic parallax map is calculated, the distance between the subject in the human body and the end face of the camera is calculated from the stereoscopic parallax map, and further, the depth information of different fluorescence regions is obtained, and then the fluorescence brightness of the fluorescence region at a far distance is compensated.
4. The endoscopic imaging method according to claim 3, wherein said compensation of fluorescence brightness comprises the steps of:
calculating the average value of the fluorescence light of each fluorescence area, and finding out the fluorescence area with the highest average value;
and multiplying the other fluorescence areas except the fluorescence area with the highest average value by a corresponding gain coefficient respectively to ensure that the fluorescence brightness of the other fluorescence areas is consistent with that of the fluorescence area with the highest average value.
5. The endoscopic imaging method according to claim 3, wherein said compensation of fluorescence brightness comprises the steps of:
designing and calculating a fluorescence compensation gain curve according to the distance between a human body object and the end face of the camera in advance;
and after depth information of different fluorescence areas is obtained, fluorescence compensation is carried out on each area according to the compensation gain curve.
6. An endoscopic imaging system, comprising:
a handle having a threading channel;
the endoscope tube is connected with the front end of the handle;
the light source assembly comprises a first light source, a second light source and a light guide beam, wherein the first light source is used for emitting white light, the second light source is used for emitting exciting light, one end of the light guide beam penetrates through the endoscope and extends to the front end in the endoscope, and the other end of the light guide beam is connected with the first light source and the second light source;
the first front end assembly is mounted at the front end in the lens tube and comprises a first lens group and a first sensor, the first lens group and the first sensor are sequentially arranged in front and behind, the first lens group is used for acquiring visible light reflected by a human subject, and the first sensor is used for acquiring the visible light filtered by the first lens group and generating a first image signal;
the second front end component and the first front end component are arranged at the front end in the lens tube side by side, the second front end component comprises a second lens group and a second sensor which are sequentially arranged in front and back, the second lens group is used for acquiring fluorescence excited by a human subject, and the second sensor is used for acquiring the fluorescence filtered by the second lens group and generating a second image signal;
the control device is connected with the first light source, the second light source, the first sensor and the second sensor; the control device is used for controlling the first light source to emit white light and the second light source to emit exciting light respectively, collecting first image information generated by the first sensor and second image information generated by the second sensor, converting the first image signal into a white light color image, converting the second image signal into a fluorescent monochromatic image, and outputting the endoscope image by image processing of the white light color image and the fluorescent monochromatic image.
7. An endoscopic imaging system as defined in claim 6, wherein said control means includes a controller for controlling the first light source to emit white light and the second light source to emit excitation light, respectively, an image acquisition module for acquiring first image information generated by the first sensor and second image information generated by the second sensor, and an image processing module for converting the first image signal into a white color image and the second image signal into a fluorescent monochrome image, and then subjecting the white color image and the fluorescent monochrome image to image processing to output the endoscopic image.
8. The endoscopic imaging system of claim 7, wherein the control device further comprises a sensor driver module, an input of the sensor driver module is connected to the controller, an output of the sensor driver module is connected to the first and second sensors, respectively, and the sensor driver module is configured to output sensor setting information calculated by the controller to the first and second sensors.
9. The endoscopic imaging system of claim 7, wherein the light source assembly further comprises a light source control module, an input of the light source control module is connected to the controller, an output of the light source control module is connected to the first and second light sources, and the light source control module is configured to receive a control signal from the controller and to control the intensity of light emitted by the first and second light sources.
10. The endoscopic imaging system according to claim 6, wherein said first lens group comprises a fluorescence cut filter and said second lens group comprises a visible light cut filter.
CN202010704320.0A 2020-07-21 2020-07-21 Endoscope imaging method and endoscope imaging system Pending CN111803013A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010704320.0A CN111803013A (en) 2020-07-21 2020-07-21 Endoscope imaging method and endoscope imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010704320.0A CN111803013A (en) 2020-07-21 2020-07-21 Endoscope imaging method and endoscope imaging system

Publications (1)

Publication Number Publication Date
CN111803013A true CN111803013A (en) 2020-10-23

Family

ID=72861101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010704320.0A Pending CN111803013A (en) 2020-07-21 2020-07-21 Endoscope imaging method and endoscope imaging system

Country Status (1)

Country Link
CN (1) CN111803013A (en)


Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005319212A (en) * 2004-05-11 2005-11-17 Pentax Corp Fluorescence endoscope apparatus
JP2007014633A (en) * 2005-07-08 2007-01-25 Pentax Corp Fluorescent observation device and light source device
CN102770062A (en) * 2010-03-03 2012-11-07 奥林巴斯株式会社 Fluorescence observation device
CN102781305A (en) * 2010-03-09 2012-11-14 奥林巴斯株式会社 Fluorescent endoscope device
CN102802493A (en) * 2010-03-23 2012-11-28 奥林巴斯株式会社 Fluorescence imaging device
CN103732117A (en) * 2011-12-07 2014-04-16 奥林巴斯医疗株式会社 Endoscope device
CN104619236A (en) * 2013-08-01 2015-05-13 奥林巴斯医疗株式会社 Imaging device
CN105934191A (en) * 2014-01-31 2016-09-07 奥林巴斯株式会社 Fluorescence viewer
CN107105977A (en) * 2015-01-21 2017-08-29 奥林巴斯株式会社 Endoscope apparatus
CN107440669A (en) * 2017-08-25 2017-12-08 北京数字精准医疗科技有限公司 A kind of binary channels spy imaging system
CN108095701A (en) * 2018-04-25 2018-06-01 上海凯利泰医疗科技股份有限公司 Image processing system, fluorescence endoscope illumination imaging device and imaging method
CN108577791A (en) * 2018-05-16 2018-09-28 广东欧谱曼迪科技有限公司 A kind of fluorescence navigation endoscopic system and its method for enhancing fluorescence imaging sensitivity
CN109744986A (en) * 2019-01-31 2019-05-14 广东欧谱曼迪科技有限公司 A kind of exposure feedback-type fluorescence navigation endoscopic system and image procossing self-regulating method
CN109758094A (en) * 2019-01-31 2019-05-17 广东欧谱曼迪科技有限公司 A kind of focusing feedback-type fluorescence navigation endoscopic system and image are from processing method
CN210228085U (en) * 2019-01-31 2020-04-03 广东欧谱曼迪科技有限公司 Focusing feedback type fluorescence navigation endoscope system based on image self-processing
CN212326346U (en) * 2020-07-21 2021-01-12 深圳市博盛医疗科技有限公司 Endoscope imaging system


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112022109A (en) * 2020-11-06 2020-12-04 南京诺源医疗器械有限公司 Medical fluorescence imaging image data acquisition system and acquisition method thereof
WO2022105902A1 (en) * 2020-11-20 2022-05-27 上海微创医疗机器人(集团)股份有限公司 Fluorescence endoscope system, control method and storage medium
CN112807096A (en) * 2021-02-23 2021-05-18 珠海维尔康生物科技有限公司 Novel optical design fluorescent camera and imaging method thereof
CN113229783A (en) * 2021-05-13 2021-08-10 珠海维尔康生物科技有限公司 Image acquisition system, method and device for fluorescence imaging
CN113367638A (en) * 2021-05-14 2021-09-10 广东欧谱曼迪科技有限公司 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal
CN113367638B (en) * 2021-05-14 2023-01-03 广东欧谱曼迪科技有限公司 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal
CN113421234A (en) * 2021-06-17 2021-09-21 韩从辉 Mathematical algorithm microscopic bladder endoscope imaging system and image processing method
CN113504636B (en) * 2021-08-03 2022-03-29 嘉兴智瞳科技有限公司 Imaging system with 3D imaging and fluorescence imaging functions
CN113504636A (en) * 2021-08-03 2021-10-15 嘉兴智瞳科技有限公司 Imaging system with 3D imaging and fluorescence imaging functions
CN114081424A (en) * 2021-10-08 2022-02-25 深圳迈瑞生物医疗电子股份有限公司 Endoscopic imaging system and control method thereof
CN115153399A (en) * 2022-09-05 2022-10-11 浙江华诺康科技有限公司 Endoscope system
CN115316919A (en) * 2022-09-15 2022-11-11 广东欧谱曼迪科技有限公司 Dual-camera 3D optical fluorescence endoscope camera shooting system and method and electronic equipment
CN115316919B (en) * 2022-09-15 2023-06-30 广东欧谱曼迪科技有限公司 Dual-camera 3D optical fluorescence endoscope imaging system, method and electronic equipment
CN117281451A (en) * 2023-11-14 2023-12-26 杭州显微智能科技有限公司 3D endoscope fluorescence imaging system and imaging method thereof
CN117398042A (en) * 2023-12-14 2024-01-16 深圳市博盛医疗科技有限公司 AI-assisted detection 3D endoscope system and imaging method
CN117398042B (en) * 2023-12-14 2024-03-19 深圳市博盛医疗科技有限公司 AI-assisted detection 3D endoscope system and imaging method

Similar Documents

Publication Publication Date Title
CN111803013A (en) Endoscope imaging method and endoscope imaging system
US11330237B2 (en) Medical inspection apparatus, such as a microscope or endoscope using pseudocolors
CN212326346U (en) Endoscope imaging system
US20190005641A1 (en) Vascular information acquisition device, endoscope system, and vascular information acquisition method
US7123756B2 (en) Method and apparatus for standardized fluorescence image generation
KR101854189B1 (en) Augmented stereoscopic visualization for a surgical robot
US8358821B2 (en) Image processing system, image processing method, and computer readable medium
US7539335B2 (en) Image data processor, computer program product, and electronic endoscope system
CN102361580B (en) Fluorescence observation device, fluorescence observation system, and fluorescence image processing method
WO2022257946A1 (en) Multispectral imaging system and method, and storage medium
US20190274518A1 (en) Medical observation device, such as a microscope or an endoscope, and method using a pseudo-color pattern having temporal and/or spatial modulation
US20110109761A1 (en) Image display method and apparatus
CN110772208B (en) Method, device and equipment for acquiring fluorescence image and endoscope system
US20220318969A1 (en) Method for providing an image representation by means of a surgical microscope, and surgical microscope
JP5592715B2 (en) Image processing apparatus and image processing method
EP1210907A1 (en) Fluorescent image obtaining apparatus
CN114027765B (en) Fluorescence endoscope system, control method, and storage medium
EP4201298A1 (en) Endoscope system with adaptive lighting control
WO2021095672A1 (en) Information processing device and information processing method
CN115804561A (en) Method and apparatus for video endoscopy using fluorescent light

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination