CN114529477A - Binocular endoscope with high dynamic range, system and imaging method - Google Patents


Info

Publication number
CN114529477A
CN114529477A (application number CN202210188171.6A)
Authority
CN
China
Prior art keywords
image
lens
pyramid
endoscope
binocular
Prior art date
Legal status
Pending
Application number
CN202210188171.6A
Other languages
Chinese (zh)
Inventor
王炳强
游庆虎
徐栋
詹世涛
Current Assignee
Shandong Weigao Surgical Robot Co Ltd
Original Assignee
Shandong Weigao Surgical Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Shandong Weigao Surgical Robot Co Ltd
Priority to CN202210188171.6A
Publication of CN114529477A

Classifications

    • G06T5/70
    • G06T5/90
    • A - HUMAN NECESSITIES
        • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
                    • A61B1/00002 - Operational features of endoscopes
                        • A61B1/00004 - characterised by electronic signal processing
                            • A61B1/00009 - of image signals during a use of endoscope
                                • A61B1/000095 - for image enhancement
                    • A61B1/00064 - Constructional details of the endoscope body
                        • A61B1/00071 - Insertion part of the endoscope body
                            • A61B1/0008 - characterised by distal tip features
                                • A61B1/00096 - Optical elements
                    • A61B1/00163 - Optical arrangements
                        • A61B1/00193 - adapted for stereoscopic vision
                    • A61B1/04 - combined with photographic or television appliances
                        • A61B1/05 - characterised by the image sensor, e.g. camera, being in the distal end portion
                            • A61B1/051 - Details of CCD assembly
    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T5/00 - Image enhancement or restoration
                    • G06T5/50 - by the use of more than one image, e.g. averaging, subtraction
                • G06T2207/00 - Indexing scheme for image analysis or image enhancement
                    • G06T2207/10 - Image acquisition modality
                        • G06T2207/10068 - Endoscopic image
                    • G06T2207/20 - Special algorithmic details
                        • G06T2207/20016 - Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; pyramid transform
                        • G06T2207/20172 - Image enhancement details
                            • G06T2207/20208 - High dynamic range [HDR] image processing
                        • G06T2207/20212 - Image combination
                            • G06T2207/20221 - Image fusion; image merging

Abstract

The invention provides a binocular endoscope with a high dynamic range, together with a system and an imaging method. The binocular endoscope comprises a lens tube whose head carries two lenses arranged in left-right symmetry; one of the lenses contains an optical mechanism that reduces image brightness. The lens tube also houses two CMOS image sensors, one per lens, which connect to an image processor, and a light-guide optical fiber for illumination. Because the brightness-reducing optical mechanism is built into one lens, a single exposure yields two original images of the same instant at high and low exposure, with no time delay between them. The two original images captured through the binocular lenses are then combined by image fusion into a target image of moderate brightness that satisfies the viewing requirements of the human eye.

Description

Binocular endoscope with high dynamic range, system and imaging method
Technical Field
The invention relates to the technical field of endoscopes, and in particular to a binocular endoscope with a high dynamic range, a binocular endoscope system, and an imaging method.
Background
The dynamic range (DR) of an image is the ratio of the maximum to the minimum brightness of its visible region; the ratio of the maximum to the minimum luminance in a scene is likewise called the dynamic range of the scene. A high dynamic range (HDR) image is an image that records a real scene spanning a large dynamic range. Sensor devices (CCD/CMOS) used to record images also carry a dynamic range figure of merit: the ratio of the full-well capacity of a pixel to the readout noise. The higher the dynamic range of the image sensor, the higher the scene dynamic range it can record. When the sensor's dynamic range falls below the scene's, the sensor records only part of the brightness interval and loses image information at other brightness levels, so parts of the image become too bright or too dark.
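As a numerical aside (the figures below are illustrative, not from the patent), the sensor dynamic range defined here, full-well capacity over readout noise, is commonly quoted in decibels:

```python
import math

def sensor_dynamic_range_db(full_well_e: float, read_noise_e: float) -> float:
    """Dynamic range = full-well capacity / readout noise, expressed in dB."""
    return 20.0 * math.log10(full_well_e / read_noise_e)

# Hypothetical CMOS sensor: 20,000 e- full-well capacity, 2 e- read noise.
print(round(sensor_dynamic_range_db(20000, 2), 1))  # 80.0
```

A sensor with 80 dB of dynamic range still falls short of many endoscopic scenes, which is exactly the gap the dual-exposure scheme below addresses.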
An endoscope must observe in-vivo images with a large field angle and a large depth of field, so the scene dynamic range is high. This means the endoscope must record a whole image of high dynamic range, which places heavy demands on the dynamic range of the CMOS image-sensor chip. Because the scene dynamic range is very large and exceeds that of the CMOS, the sensor cannot record all of the scene's brightness information in a single exposure, and the endoscope image becomes locally overexposed or too dark.
At present there are two main ways to improve the dynamic range of an endoscope: algorithmic and hardware implementations. The algorithmic route uses an HDR algorithm: images of different brightness are recorded by successively exposing two or more frames at high and low exposure, and these images are then synthesized or fused into a single high-quality image of moderate brightness. This works reasonably well for still images, but it often performs poorly on an endoscope platform, because endoscopy requires real-time observation and therefore fast HDR processing. First, the algorithm needs at least two successive frames per synthesized image, so the output frame rate drops by at least half. Second, during endoscope use the lens is likely to move relative to the subject, so the two or more source frames become misaligned; since the misalignment is unknown, the displaced pixels must be corrected by a complex registration algorithm, which further increases the difficulty of application.
The hardware route records high-dynamic-range scenes by adding optical hardware (such as a rotating filter) in front of a high-dynamic-range monochrome CCD sensor and then synthesizing the recorded monochrome images into a high-dynamic-range color RGB image. This requires additional optical hardware together with matched synchronization and control hardware and circuitry.
Disclosure of Invention
To solve the problems in the prior art, the present application provides a binocular endoscope with a high dynamic range, together with a system and an imaging method. Unlike a conventional HDR method, which must process adjacent frames captured by a monocular camera and therefore suffers misalignment of the images to be fused whenever the endoscope moves, the two original images here are always captured at the same instant, so no such misalignment exists. The HDR method is simple to implement: the two original images obtained through the binocular lenses are combined by image fusion, and the resulting target image has moderate brightness and satisfies the viewing requirements of the human eye.
To achieve the above object, in one aspect the present application provides a binocular endoscope with a high dynamic range, comprising a lens tube whose head carries two lenses in left-right symmetry, denoted the left eyepiece lens and the right eyepiece lens. An optical mechanism for reducing image brightness is arranged in either the left or the right eyepiece lens. Two CMOS image sensors, in one-to-one correspondence with the two lenses, are also arranged in the lens tube and connect to an image processor, and a light-guide optical fiber for illumination is likewise arranged in the endoscope body tube.
In some embodiments, the brightness of the image obtained by the lens provided with the optical mechanism is 30% to 60% of the brightness of the image obtained by the lens not provided with the optical mechanism.
In some embodiments, the lens groups in the left and right eyepiece lenses have identical structures: from the object plane to the image plane, each comprises a first lens, a second lens, a third lens and a cemented lens, with a diaphragm between the second and third lenses.
In some embodiments, the optical mechanism for reducing image brightness in the left or right eyepiece lens is a neutral filter placed between the cemented lens and the protective glass on the CMOS image sensor; both the object-side and image-side surfaces of the neutral filter are planar.
In some embodiments, the optical mechanism for reducing image brightness in the left or right eyepiece lens is a film structure; in that case the image-side surface of the cemented lens of the lens carrying the film structure is a plane, on which the film is deposited.
In some embodiments, the left eyepiece lens satisfies the following conditional expressions: 3.5 mm ≤ TTL ≤ 7 mm, where TTL is the total optical length of the left eyepiece of the binocular endoscope; 0.6 mm ≤ f ≤ 1.2 mm, where f is its effective focal length; 80° ≤ FOV ≤ 100°, where FOV is the entrance-pupil field angle of the left eyepiece lens; BFL ≥ 1 mm, where BFL is its optical back focus; and the depth of field of the left eyepiece is no less than 20 mm to 100 mm. The right eyepiece satisfies the same conditional expressions as the left eyepiece.
Another aspect of the present application provides a binocular endoscope system having a high dynamic range, including the binocular endoscope, an image processor connected with the binocular endoscope through a data transmission line, and a display connected with the image processor through a video signal transmission line.
The application also provides an imaging method based on the above binocular endoscope system with a high dynamic range, comprising the following steps:
Step 1: acquire the two images captured by the binocular endoscope as the original images to be processed.
Step 2: preprocess the two original images to improve their signal-to-noise ratio.
Step 3: compute a Laplacian pyramid for the RGB components of each image to obtain the corresponding Laplacian pyramid images.
Step 4: fuse the sub-images of the two Laplacian pyramid images at each corresponding layer to obtain a fused Laplacian pyramid image.
Step 5: recover the corresponding Gaussian pyramid image from the fused Laplacian pyramid image; the zeroth-layer sub-image of that Gaussian pyramid is the final target image.
In some embodiments, the preprocessing in step 2 comprises denoising the two original images separately to improve image quality.
In some embodiments, in step 5, for a Laplacian pyramid image with n levels and the corresponding Gaussian pyramid image, the following recursion holds:
G_n = L_n
G_k = L_k + G*_(k+1), for k = n-1, ..., 0
where L_n denotes the highest-level sub-image of the n-level Laplacian pyramid (a known quantity), L_k denotes its k-th level sub-image (also a known quantity), G_n denotes the highest-level sub-image of the Gaussian pyramid, and G*_(k+1) is the (k+1)-th level sub-image of the Gaussian pyramid after interpolation, i.e., with its resolution doubled in both the horizontal and vertical directions.
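This recursion, together with steps 3 to 5 above, can be sketched in a few lines. The block below is a minimal illustration, not the patent's implementation: it uses 2x2 block-mean reduction, nearest-neighbour expansion, and naive per-level averaging in place of whatever fusion weights an actual product would use, and it operates on a single channel rather than per-RGB-component.

```python
import numpy as np

def reduce_(g):
    """One step down the Gaussian pyramid: 2x2 block mean (halves resolution)."""
    h, w = g.shape
    return g.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def expand_(g):
    """G*_(k+1): nearest-neighbour interpolation doubling both dimensions."""
    return g.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, n):
    """L_k = G_k - expand(G_(k+1)) for k < n; the top level is L_n = G_n."""
    pyr, g = [], img.astype(float)
    for _ in range(n):
        g_next = reduce_(g)
        pyr.append(g - expand_(g_next))
        g = g_next
    pyr.append(g)
    return pyr

def collapse(pyr):
    """The recursion above: G_n = L_n; G_k = L_k + G*_(k+1); returns G_0."""
    g = pyr[-1]
    for lap in reversed(pyr[:-1]):
        g = lap + expand_(g)
    return g

def fuse(bright, dark, n=3):
    """Steps 3-5: per-level fusion (here a plain average) of two pyramids."""
    fused = [(a + b) / 2 for a, b in
             zip(laplacian_pyramid(bright, n), laplacian_pyramid(dark, n))]
    return collapse(fused)

rng = np.random.default_rng(0)
high = rng.uniform(0.0, 255.0, (16, 16))   # stand-in for the bright (left) image
low = 0.4 * high                           # stand-in for the ND-filtered right image
out = fuse(high, low)
```

Because every step here is linear, fusing x with 0.4x yields exactly 0.7x, which makes the sketch easy to sanity-check; a realistic fusion rule would weight each level by local contrast or exposure quality rather than averaging.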
In summary, in the binocular endoscope, system and imaging method of the present application, an optical mechanism for reducing image brightness is added to one objective lens, so a single exposure yields two original images of the same instant at high and low exposure, with no time delay between them. Unlike a conventional HDR method, which must process adjacent frames captured by a monocular camera and therefore suffers misalignment of the images to be fused whenever the endoscope moves, the two original images are always captured simultaneously, so no such misalignment exists. The HDR method is simple to implement: the two original images obtained through the binocular lenses are combined by image fusion, and the resulting target image has moderate brightness and satisfies the viewing requirements of the human eye.
Drawings
Fig. 1 shows a schematic structural view of a binocular endoscope in an embodiment.
Fig. 2 shows a schematic structural view of the head of the binocular endoscope in the embodiment.
Fig. 3 shows a schematic structural diagram of the left eyepiece in the embodiment.
Fig. 4 shows a schematic optical path diagram of the left eyepiece lens in the embodiment.
Fig. 5 shows a schematic structural diagram of a right eyepiece in an embodiment.
Fig. 6 shows a schematic optical path diagram of the right eyepiece lens in the embodiment.
Fig. 7 shows an imaging schematic in the case of no neutral filter in the embodiment.
Fig. 8 shows a schematic image of the case where a neutral filter is present in the embodiment.
Fig. 9 shows a schematic diagram of neutral density filters of different transmission rates in an embodiment.
Fig. 10 is a graph showing the transmittance of visible light by the neutral density filter in the example.
Fig. 11 shows a schematic view of the coating film in the example.
Fig. 12 is a schematic structural view showing a binocular endoscope system in the embodiment.
Fig. 13 shows a prior art HDR implementation.
FIG. 14 shows a schematic diagram of the pyramid layering algorithm in the embodiment.
FIG. 15 shows a schematic diagram of a pyramid fusion algorithm process, which is standard in the prior art.
FIG. 16 is a schematic diagram illustrating the pyramid fusion algorithm process in the embodiment.
Fig. 17(a) shows a schematic diagram of an original image acquired by the right eyepiece in the embodiment, and (b) shows a luminance histogram of the original image.
Fig. 18(a) shows a schematic diagram of an original image obtained by the left eyepiece in the embodiment, and (b) shows a luminance histogram of the original image.
Fig. 19(a) shows a schematic view of a target image finally obtained in the embodiment, and (b) shows a luminance histogram of the target image.
Fig. 20 is a diagram showing the display effect of an image after gamma is set to 1.5 in the display.
Fig. 21 shows a flowchart of an imaging method in an embodiment.
Reference numerals: 1 - binocular endoscope; 2 - data transmission line; 21 - first data transmission line; 22 - second data transmission line; 3 - image processor; 4 - video signal transmission line; 5 - display; 6 - lens; 61 - left eyepiece lens; 62 - right eyepiece lens; 7 - optical fiber port; 71 - first optical fiber port; 72 - second optical fiber port; 8 - CMOS image sensor; 81 - first CMOS image sensor; 82 - second CMOS image sensor; 9 - lens tube; 10 - lens group; 11 - light source interface; 12 - lens; L1 - first lens; L2 - second lens; L3 - third lens; L4 - cemented lens; ST - diaphragm; L5 - protective glass on the CMOS image sensor; F1 - neutral filter.
Detailed Description
The following further describes embodiments of the present application with reference to the drawings.
In the description of the present application, the terms "first", "second" and the like are used to distinguish similar objects, not to describe or indicate a particular order or sequence. Terms such as "upper", "lower", "front", "rear", "left", "right", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on the drawings; they are used only for convenience and simplicity of description and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the present application.
As shown in figs. 1-2, the binocular endoscope 1 with a high dynamic range according to the present invention includes a lens tube 9. Two lenses 6 are symmetrically disposed at the head of the lens tube 9 and are denoted the left eyepiece lens 61 and the right eyepiece lens 62. Two CMOS image sensors 8, denoted the first CMOS image sensor 81 and the second CMOS image sensor 82, are also disposed in the lens tube 9 in one-to-one correspondence with the two lenses 6; they are connected to the image processor 3, specifically through a first data transmission line 21 and a second data transmission line 22, respectively. Two light-guide optical fibers are also provided in the lens tube 9; they connect to a light source through a light source interface 11 on the lens tube 9 and emit light through fiber ports 7 at the end of the tube. Two fiber ports 7, denoted the first fiber port 71 and the second fiber port 72, are symmetrically arranged at the top and bottom of the tube end, in one-to-one correspondence with the two light-guide fibers.
To achieve a high dynamic range, an optical mechanism for reducing image brightness is provided in either the left eyepiece lens 61 or the right eyepiece lens 62, such that the brightness of the image obtained by the lens with the mechanism is 30% to 60% of that obtained by the lens without it. In the present embodiment, the mechanism is provided in the right eyepiece lens 62, so the image from the right eyepiece lens 62 has 30% to 60% of the brightness of the image from the left eyepiece lens 61.
As shown in fig. 3-4, in the present embodiment, the lens group 10 in the left eyepiece lens 61 includes a first lens L1, a second lens L2, a third lens L3, and a cemented lens L4 in order from an object plane S100 to an image plane S200, and a stop ST is disposed between the second lens L2 and the third lens L3.
The lens group 10 in the right eyepiece lens 62 likewise includes, in order from the object plane S100 to the image plane S200, a first lens L1, a second lens L2, a third lens L3 and a cemented lens L4, with a stop ST between the second lens L2 and the third lens L3. The optical mechanism for reducing image brightness in the right eyepiece lens 62 may be a neutral filter F1, which can be placed at any position from the object plane S100 to the protective glass L5 on the CMOS image sensor; both its object-side surface S12 and image-side surface S13 are planar. Preferably, the neutral filter F1 is placed between the cemented lens L4 and the protective glass L5, as shown in figs. 5-6. The neutral filter F1 controls the transmittance of visible light and thereby adjusts the imaging brightness. Clearly, with F1 added to the right eyepiece lens 62 and identical exposure parameters, the image from the right eyepiece lens 62 is darker overall than that from the left eyepiece lens 61. Moreover, since F1 has no optical power in the imaging path, the right eyepiece lens 62 retains the same imaging quality as the left eyepiece lens 61 without the filter, as shown in figs. 7-8, which greatly benefits the subsequent image processing (i.e., the fusion of the high- and low-exposure originals).
As shown in fig. 9, the neutral filter F1 attenuates visible light in an essentially equal proportion across wavelengths, and filters of different transmittances yield different image brightness from the right eyepiece lens 62. In the present example, a neutral filter F1 with a transmittance of 30% to 60% is used; fig. 10 shows the transmittance curve of a 30% filter.
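As a side note (simple photographic arithmetic, not stated in the patent), a transmittance T corresponds to an exposure reduction of -log2(T) stops, so the 30% to 60% range amounts to roughly 0.7 to 1.7 stops of difference between the two eyes:

```python
import math

def stops(transmittance: float) -> float:
    """Exposure reduction in photographic stops for a given ND transmittance."""
    return -math.log2(transmittance)

print(round(stops(0.30), 2))  # 1.74  (the 30% filter of fig. 10)
print(round(stops(0.60), 2))  # 0.74
```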
The optical mechanism for reducing image brightness in the right eyepiece lens 62 may instead be a film structure. In that case, among the lenses of the right eyepiece lens 62, the object-side or image-side surface of at least one lens is a plane, and a film is deposited on one such plane to form the film structure, as shown in fig. 11. Preferably, the image-side surface of the cemented lens L4 of the right eyepiece lens 62 is a plane, and the film is formed on that plane. Specifically, the film is a composite SiO2/TiO2 layer; SiO2 and TiO2 meet biocompatibility requirements and are harmless to the human body.
In the present embodiment, the first lens L1 has negative power, with a planar object-side surface S1 and a concave image-side surface S2; the first lens L1 is a glass spherical lens.
The second lens L2 has positive power, the object-side surface S3 is convex, the image-side surface S4 is convex, and the second lens L2 is a glass spherical lens.
The third lens L3 has positive power, the object-side surface S5 is concave, the image-side surface S6 is convex, and the third lens L3 is a glass spherical lens.
The cemented lens L4 has positive optical power, with the object side surface S7 being a convex surface and the image side surface S9 being a flat surface, and the cemented lens L4 being a glass spherical lens.
That is, in the present embodiment, the object-side surface S1 of the first lens L1 and the image-side surface S9 of the cemented lens L4 are both planar. Making S1 a plane allows a protective glass (such as sapphire) to be cemented directly to its outer side, protecting the lens against scratches and ensuring its long-term effectiveness. Making S9 a plane allows the film structure to be formed on it. Placing the surface to be coated on the last surface of the last lens in the right eyepiece lens 62 has a further benefit: coating can be performed either on the individual lens before assembly or on the fully assembled right eyepiece lens 62. In the latter case, the assembled lens need only be placed in the coating equipment with its last plane receiving the deposited layer. The coating is very thin, on the order of micrometers, and has essentially no influence on the imaging quality of the overall optical system. The coated surface is made planar precisely so that the optical imaging quality of the lens is identical before and after coating.
In the cemented lens L4, the power introduced by a surface is φ = n/r, where φ is the optical power, n is the refractive index of the lens, and r is the radius of curvature of the surface. If the image-side surface S9 of L4 had curvature, the effective refractive index at S9 would change slightly after coating, and the optical power would not remain exactly unchanged. By designing S9 as a plane, r = ∞, the power introduced by the surface is always 0 both before and after coating; the image quality is therefore entirely unaffected by the coating, which ensures the accuracy of the subsequent image fusion.
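A quick check of this relation (the refractive index below is a typical crown-glass value, chosen purely for illustration) confirms that a planar surface, r = ∞, contributes zero power regardless of coating:

```python
import math

def surface_power(n: float, r: float) -> float:
    """phi = n / r, per the simplified relation in the text; r = inf for a plane."""
    return n / r

print(surface_power(1.5167, math.inf))       # 0.0: a plane never adds power
print(round(surface_power(1.5167, 5.0), 4))  # a curved surface does
```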
In the present embodiment, the left eyepiece lens 61 further satisfies the following conditional expressions:
1) 3.5 mm ≤ TTL ≤ 7 mm, where TTL is the total optical length of the left eyepiece lens 61 of the binocular endoscope, i.e., the distance from the first lens to the CMOS focal plane (the physical length of the lens plus its optical back focus). Constraining TTL to between 3.5 mm and 7 mm limits the size of the endoscope's eyepiece: an overly long lens restricts the endoscope's versatility in use, and for an endoscope with a bendable head it hinders the bending function.
2) 0.6 mm ≤ f ≤ 1.2 mm, where f is the effective focal length of the left eyepiece lens 61 of the binocular endoscope.
3) 80° ≤ FOV ≤ 100°, where FOV is the entrance-pupil field angle of the left eyepiece lens 61.
4) BFL ≥ 1 mm, where BFL is the optical back focus of the left eyepiece lens: the distance from the last surface S9 of the last lens in the left eyepiece lens 61 to the image plane S200 (including the thickness of the protective glass L5 on the CMOS image sensor, 0.4 mm in this embodiment). The left eyepiece lens 61 is designed with a back focus large enough that a neutral filter F1 can be incorporated into the lens.
5) The depth of field of the left eyepiece lens 61 is no less than 20 mm to 100 mm.
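For concreteness, the five conditions can be bundled into a single check. The numeric samples below are hypothetical, since the actual prescriptions appear only as images (Tables 1 and 2):

```python
def meets_spec(ttl, f, fov, bfl, dof_near, dof_far):
    """Check the five conditional expressions for the left eyepiece lens 61.

    Interprets "depth of field not less than 20 mm - 100 mm" as: the in-focus
    range must cover at least 20 mm to 100 mm (an assumption on our part).
    """
    return (3.5 <= ttl <= 7.0            # 1) total optical length, mm
            and 0.6 <= f <= 1.2          # 2) effective focal length, mm
            and 80.0 <= fov <= 100.0     # 3) entrance-pupil field angle, degrees
            and bfl >= 1.0               # 4) optical back focus, mm
            and dof_near <= 20.0 and dof_far >= 100.0)  # 5) depth of field

# Hypothetical prescription that satisfies every condition:
print(meets_spec(ttl=5.2, f=0.9, fov=92.0, bfl=1.3, dof_near=18.0, dof_far=120.0))  # True
# Same prescription but with an over-long barrel:
print(meets_spec(ttl=7.5, f=0.9, fov=92.0, bfl=1.3, dof_near=18.0, dof_far=120.0))  # False
```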
In the present embodiment, as shown in fig. 3, the relevant parameters of the left eyepiece lens 61 and of the protective glass L5 on the CMOS image sensor (object-side surface S10 and image-side surface S11, both planar) are listed in Table 1.
TABLE 1
(Table 1 appears only as an image in the original publication.)
The right eyepiece lens 62 satisfies the same conditional expressions as the left eyepiece lens 61. When the brightness-reducing optical mechanism is a film structure, the parameters of the right eyepiece lens 62 are identical to those of the left eyepiece lens 61. When a neutral filter F1 (object-side surface S12, image-side surface S13) is added to the right eyepiece lens 62, placed between the cemented lens L4 and the protective glass L5 on the CMOS image sensor as shown in fig. 5, the relevant parameters of the right eyepiece lens 62 and the protective glass L5 are listed in Table 2.
TABLE 2
(The data of table 2 is presented as an image in the original publication.)
As shown in fig. 12, the binocular endoscope system with a high dynamic range according to the present application includes the above-described binocular endoscope 1, an image processor 3 connected to the binocular endoscope 1 through a data transmission line 2, and a display 5 connected to the image processor 3 through a video signal transmission line 4.
In a specific use process, the two eyes simultaneously observe the object to be imaged; the formed images are recorded by the two CMOS image sensors 8, which convert the optical signals into electrical signals that are transmitted to the image processor 3 through the first data transmission line 21 and the second data transmission line 22, respectively. The image processor 3 processes the raw image signals output by the CMOS image sensors 8 so that the output images meet the color and sharpness requirements of endoscopic observation; most importantly, the fusion of the binocular images of different brightness is also performed on the image processor 3. Finally, the processed image data is transmitted to the display 5 through the video signal transmission line 4 (e.g., HDMI, DVI, or SDI) for display.
As shown in fig. 13, two HDR algorithm implementations are compared. In conventional HDR technology, the captured LDR (low dynamic range) images must first be synthesized into an HDR file, which cannot be displayed directly; the HDR image file is then converted, based on the response function of the capturing camera and a tone mapping relationship, into a low-dynamic-range LDR image matching the visual response of the human eye. This process is time-consuming and requires full knowledge of the camera's internal parameters. In view of these disadvantages, the imaging method according to the present application uses an image fusion algorithm (e.g., a pyramid fusion algorithm), because image fusion does not depend on the camera's parameters, only on the luminance of the original images to be fused.
An image pyramid is a collection of sub-images of the same image at different resolutions, and image pyramid operations process the image at these different resolutions. Specifically, the Gaussian pyramid is a sequence of images whose resolution is halved layer by layer: each layer is obtained by low-pass filtering the previous layer and then downsampling it by discarding alternate rows and columns. The Laplacian pyramid is obtained from the Gaussian pyramid by the following operation:
L_i = G_i - PyrUp(PyrDown(G_i))
wherein L_i is the i-th layer sub-image of the Laplacian pyramid; G_i is the i-th layer sub-image of the Gaussian pyramid; PyrUp is the pyramid upsampling operation, which doubles the resolution of the image in both the horizontal and vertical directions by interpolation; and PyrDown is the pyramid downsampling operation, which halves the resolution in both directions by removing alternate rows and columns.
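As a concrete illustration of the PyrDown/PyrUp operations and the relation L_i = G_i - PyrUp(PyrDown(G_i)), the following NumPy sketch builds a Laplacian pyramid. It is not the patent's implementation: the low-pass filter is a fixed 3x3 kernel and PyrUp uses nearest-neighbour interpolation, chosen so the sketch stays short and the pyramid collapses back to the source image exactly.

```python
import numpy as np

def pyr_down(img):
    """PyrDown: low-pass filter with a 3x3 kernel, then drop alternate rows/columns."""
    p = np.pad(img, 1, mode="edge")
    blurred = (p[:-2, :-2] + p[:-2, 2:] + p[2:, :-2] + p[2:, 2:]
               + 2.0 * (p[1:-1, :-2] + p[1:-1, 2:] + p[:-2, 1:-1] + p[2:, 1:-1])
               + 4.0 * p[1:-1, 1:-1]) / 16.0
    return blurred[::2, ::2]

def pyr_up(img, shape):
    """PyrUp: double the resolution (nearest-neighbour here), cropped to `shape`."""
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return up[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    """Build a Laplacian pyramid: L_i = G_i - PyrUp(PyrDown(G_i)).
    The top entry stores the coarsest Gaussian layer so G_0 can be recovered exactly."""
    gauss = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels):
        gauss.append(pyr_down(gauss[-1]))
    lap = [gauss[i] - pyr_up(gauss[i + 1], gauss[i].shape) for i in range(levels)]
    lap.append(gauss[-1])
    return lap
```

Because the stored top layer is the coarsest Gaussian image, adding each L_k back to the upsampled layer above it reconstructs the original image without loss.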
In addition, exploiting the smooth brightness transitions characteristic of endoscope images, the present application uses an image pyramid layered fusion algorithm: two LDR source images with different exposures are fused directly into a high-dynamic-range HDR image for display, without computing and generating an HDR file. Compared with the conventional HDR processing chain, this image-fusion-based process is relatively simple while still increasing the dynamic range of the image.
Specifically, as shown in fig. 21, the imaging method according to the present application includes the following steps:
Step 1, acquiring the two images captured by the binocular endoscope as the original images to be processed. As shown in fig. 17-18, because of the limited dynamic range of the imaging CMOS chip, the right-eye and left-eye original images contain clearly under-exposed and over-exposed regions; a single exposure cannot yield an original image whose brightness is moderate across the whole frame.
Step 2, preprocessing the two original images to improve their signal-to-noise ratio. Specifically, the preprocessing comprises: denoising the two original images respectively to improve the image quality.
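The patent does not specify which denoising filter is used. As a hedged placeholder, the sketch below applies a simple box filter with NumPy; the function name `box_denoise` and the kernel size are illustrative assumptions, not from the source.

```python
import numpy as np

def box_denoise(img, ksize=3):
    """Placeholder denoiser: replace each pixel by the mean of its ksize x ksize
    neighbourhood (edge-padded), reducing pixel-level noise."""
    pad = ksize // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(ksize):
        for dx in range(ksize):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (ksize * ksize)
```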
Step 3, computing a Laplacian pyramid for the RGB components of each image to obtain the corresponding Laplacian pyramid image.
Step 4, fusing the sub-images of the two Laplacian pyramid images at corresponding layers to obtain a fused Laplacian pyramid image. That is, the sub-images of the two Laplacian pyramid images at the same resolution (i.e., at the same level) are accumulated and fused. In the resulting fused Laplacian pyramid image, the sub-image of each level is thus known, as shown in fig. 14.
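The "accumulate and fuse" operation in step 4 is not spelled out further in the source. One minimal reading, sketched below, is an equal-weight average of the two pyramids' sub-images at each level; the equal weights are an assumption, and any per-level weighting could be substituted.

```python
import numpy as np

def fuse_laplacian_pyramids(lap_a, lap_b):
    """Fuse two Laplacian pyramids level by level with equal weights.
    lap_a and lap_b are lists of same-shaped arrays, one array per pyramid level."""
    return [(a + b) / 2.0 for a, b in zip(lap_a, lap_b)]
```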
Step 5, restoring the corresponding Gaussian pyramid image from the fused Laplacian pyramid image; the zeroth-layer sub-image G0 of the Gaussian pyramid image is the final target image, as shown in fig. 19. The target image obtained in this way has moderate brightness and largely meets the observation requirements of the human eye. It contains essentially no very dark or very bright areas, the whole image area carries effective information, and relatively dark regions can be further improved by the gamma parameter setting of the display in a subsequent step. By contrast, when an image contains severely under-exposed regions (fig. 17a) or over-exposed regions (fig. 18a), these cannot be corrected by display parameter settings in subsequent steps.
Specifically, the fused Laplacian pyramid image consists of a series of high-frequency sub-images at different resolutions, mainly representing details such as edges and contours at each resolution. To obtain the target image, the fused Laplacian pyramid image must be restored to the Gaussian pyramid image by interpolation, yielding the zeroth-layer sub-image G0 of the Gaussian pyramid image, which is the target image.
In step 5, for a Laplacian pyramid image with pyramid level n and the corresponding Gaussian pyramid image, the following recurrence relation is given:

G_n = L_n
G_k = L_k + G*_{k+1}, k = n-1, n-2, ..., 0

wherein L_n denotes the highest-layer sub-image of the Laplacian pyramid image with n layers, a known quantity; L_k denotes the k-th layer sub-image of the Laplacian pyramid image, also a known quantity; G_n is the highest-layer sub-image of the Gaussian pyramid image; and G*_{k+1} is the (k+1)-th layer sub-image of the Gaussian pyramid image obtained by interpolation, i.e., with the resolution of the sub-image doubled in the horizontal and vertical directions.
The corresponding Gaussian pyramid image can thus be restored from the fused Laplacian pyramid image through the above formula, and its zeroth-layer sub-image G0 is the final target image.
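The recurrence of step 5 (G_n = L_n, then G_k = L_k + G*_{k+1}, with G*_{k+1} the interpolated upper layer) can be sketched as follows. Nearest-neighbour interpolation stands in for whatever interpolation an actual implementation uses.

```python
import numpy as np

def collapse_pyramid(lap):
    """Recover G_0 from a (fused) Laplacian pyramid via
    G_n = L_n and G_k = L_k + PyrUp(G_{k+1}) for k = n-1, ..., 0."""
    g = lap[-1]  # G_n = L_n
    for k in range(len(lap) - 2, -1, -1):
        up = np.repeat(np.repeat(g, 2, axis=0), 2, axis=1)  # double the resolution
        g = lap[k] + up[:lap[k].shape[0], :lap[k].shape[1]]  # G_k = L_k + G*_{k+1}
    return g  # G_0, the target image
```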
Step 6, displaying the obtained target image on the display, using the gamma parameter to optimize the brightness of the displayed image.
The gamma parameter of the display is preferably 1.5; it raises the brightness of the dark parts of the image. As shown in fig. 20, the dark parts of the displayed image are significantly improved with gamma set to 1.5.
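As an illustration of this step, the sketch below applies gamma 1.5 in software under the common convention out = in^(1/gamma), which lifts dark tones when gamma > 1. The exact transfer curve of any given display is an assumption here; the patent only states the gamma value.

```python
import numpy as np

def apply_gamma(img, gamma=1.5):
    """Brighten dark regions of an 8-bit image: normalize to [0, 1],
    raise to 1/gamma, and rescale back to 0-255."""
    x = np.clip(img.astype(np.float64), 0, 255) / 255.0
    return np.round(x ** (1.0 / gamma) * 255.0).astype(np.uint8)
```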
In the conventional pyramid image fusion algorithm, the contrast, saturation, and exposure of the source images must be evaluated, and weight values for these three parameters must be computed for the pixels of each color channel; this complicated process hinders real-time HDR, as shown in fig. 15. The imaging method of the present application is tailored to the endoscope use scene: because endoscope image transitions are smooth, the contrast and saturation parameters require no extra attention, and because pyramid fusion itself ultimately yields moderate image exposure, exposure requires no initial evaluation either. The complex computations of source-image contrast, saturation, exposure, and weight parameters are therefore omitted, greatly reducing the algorithm's running time while still matching the effect of the standard pyramid image fusion algorithm, as shown in fig. 16. This makes it practical to port the improved pyramid image fusion algorithm to an endoscope hardware platform.
According to the binocular endoscope, binocular endoscope system, and imaging method with high dynamic range of the present application, an optical mechanism for reducing image brightness is added to one of the two lenses, so that two original images with high and low exposure at the same shooting moment are obtained in a single exposure, with no time delay between them. Unlike conventional HDR methods, the original images are always captured at the same instant, which avoids the image misalignment that arises when a conventional HDR method fuses adjacent frames captured by a single lens while the endoscope is moving. The HDR method is simple to implement: the two original images obtained by the binocular endoscope are combined by image fusion into a target image of moderate brightness that meets the observation requirements of the human eye.
In addition, because the optical mechanism for reducing image brightness adopts an optical design with a film-layer structure or a neutral filter, the two eyes have identical imaging quality and differ only in transmittance, so the two HDR source images are always synchronized and of identical quality. The HDR image synthesis therefore does not have to deal with complications such as image misalignment caused by motion.
According to the imaging method of the present application, image pyramid layered fusion processes the images at different resolution scales, so that the brightness fusion is more thorough and better matches human visual perception. The pyramid fusion algorithm does not need to generate an HDR image file during processing, and it also minimizes the color deviation that can occur in conventional tone mapping algorithms.
The above description covers only preferred embodiments of the present application, but the scope of the present application is not limited thereto; any substitution or change of the technical solution and its concept by a person skilled in the art within the technical scope of the present application shall be covered by the scope of the present application.

Claims (10)

1. A binocular endoscope having a high dynamic range, comprising: the head of the lens tube is bilaterally and symmetrically provided with two lenses which are respectively marked as a left eyepiece lens and a right eyepiece lens, the left eyepiece lens or the right eyepiece lens is internally provided with an optical mechanism for reducing the image brightness, the lens tube is also internally provided with two CMOS image sensors, the two CMOS image sensors correspond to the two lenses one by one, and the CMOS image sensors are connected with an image processor; and a light guide optical fiber for illumination is also arranged in the endoscope body tube.
2. The binocular endoscope having a high dynamic range of claim 1, wherein: the brightness of the image obtained by the lens provided with the optical mechanism is 30-60% of the brightness of the image obtained by the lens not provided with the optical mechanism.
3. The binocular endoscope having a high dynamic range of claim 1, wherein: the structure of the lens group in the left eyepiece lens and the structure of the lens group in the right eyepiece lens are the same, the lens group sequentially comprises a first lens, a second lens, a third lens and a cemented lens from an object plane to an image plane, and a diaphragm is arranged between the second lens and the third lens.
4. The binocular endoscope having a high dynamic range of claim 3, wherein: the optical mechanism for reducing the image brightness, which is arranged in the left eyepiece lens or the right eyepiece lens, is a neutral filter, the neutral filter is arranged at a position between the cemented lens and the protective glass on the CMOS image sensor, and the object side surface and the image side surface of the neutral filter are both planes.
5. The binocular endoscope having a high dynamic range of claim 3, wherein: the optical mechanism for reducing the image brightness, which is disposed in the left eyepiece lens or the right eyepiece lens, is a film structure, in which case the image side surface of the cemented lens of the lens provided with the film structure is a plane on which a film is formed to form the film structure.
6. The binocular endoscope having a high dynamic range of claim 3, wherein: the left eyepiece meets the following conditional expressions: 3.5mm ≤ TTL ≤ 7mm, wherein TTL is the total optical length of the left eyepiece of the binocular endoscope; 0.6mm ≤ f ≤ 1.2mm, wherein f represents the effective focal length of the left eyepiece of the binocular endoscope; 80 degrees ≤ FOV ≤ 100 degrees, wherein FOV represents the entrance pupil field angle of the left eyepiece; BFL ≥ 1mm, wherein BFL represents the optical back focus of the left eyepiece; the depth of field of the left eyepiece covers at least the range from 20mm to 100mm; and the conditional expressions satisfied by the right eyepiece are consistent with those satisfied by the left eyepiece.
7. A binocular endoscope system having a high dynamic range, characterized by: comprising the binocular endoscope of any one of claims 1 to 6, an image processor connected to the binocular endoscope through a data transmission line, and a display connected to the image processor through a video signal transmission line.
8. An imaging method based on the binocular endoscope system with high dynamic range of claim 7, characterized in that: the method comprises the following steps:
step 1, acquiring two images shot by a binocular endoscope, and using the two images as original images to be processed;
step 2, preprocessing the two original images to improve the signal-to-noise ratio of the two preprocessed images;
step 3, calculating a Laplacian pyramid for the RGB components of each image to obtain a corresponding Laplacian pyramid image;
step 4, fusing the subimages of the two Laplace pyramid images at the corresponding layers to obtain a fused Laplace pyramid image;
and 5, restoring a corresponding pyramid-of-gaussians image through the fused Laplace pyramid images, wherein the zeroth-layer sub-image of the pyramid-of-gaussians image is the final target image.
9. The imaging method according to claim 8, characterized in that: in the step 2, the pretreatment includes: and denoising the two original images respectively to improve the image quality.
10. The imaging method according to claim 8 or 9, characterized in that: in step 5, for the laplacian pyramid image and the gaussian pyramid image with the pyramid level n, the following recursion relationship is given:
Figure FDA0003523530340000021
wherein L isnDenotes the highest level subimage of the Laplacian pyramid image with n layers, which is a known quantity, LkA k-th layer sub-image, which is a known quantity, G, representing a Laplace pyramid imagenSub-image of the highest layer of the Gaussian pyramid image, G* k+1The (k +1) th layer sub-image of the gaussian pyramid image is obtained by means of interpolation, namely, the resolution of the sub-image is increased by two times in the horizontal direction and the vertical direction.
CN202210188171.6A 2022-02-28 2022-02-28 Binocular endoscope with high dynamic range, system and imaging method Pending CN114529477A (en)


Publications (1)

Publication Number: CN114529477A, Publication Date: 2022-05-24


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117314754A (en) * 2023-11-28 2023-12-29 深圳因赛德思医疗科技有限公司 Double-shot hyperspectral image imaging method and system and double-shot hyperspectral endoscope

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117314754A (en) * 2023-11-28 2023-12-29 深圳因赛德思医疗科技有限公司 Double-shot hyperspectral image imaging method and system and double-shot hyperspectral endoscope
CN117314754B (en) * 2023-11-28 2024-03-19 深圳因赛德思医疗科技有限公司 Double-shot hyperspectral image imaging method and system and double-shot hyperspectral endoscope


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination