CN118250542A - Three-wafer endoscope imaging method and imaging device - Google Patents


Info

Publication number: CN118250542A
Application number: CN202410659521.1A
Authority: CN (China)
Prior art keywords: image, exposure, imaging, light, channel
Inventor: 陆汇海
Assignee (original and current): Shenzhen Bosheng Medical Technology Co ltd
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Application filed by Shenzhen Bosheng Medical Technology Co ltd

Classifications

    • H04N 23/555: Constructional details for picking up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N 23/56: Cameras or camera modules provided with illuminating means
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/951: Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Endoscopes (AREA)

Abstract

The application provides a three-wafer endoscope imaging method and imaging device. The imaging device comprises a light source assembly, an optical lens assembly, a light-splitting element, first, second, and third image sensors, first and second exposure controllers, and an image processor. The light source assembly has a white-light mode and a fluorescence mode; the optical lens assembly receives imaging light from the area to be observed; and, in fluorescence mode, the light-splitting element divides the imaging light three ways, forming a first imaging beam, a second imaging beam, and a fluorescence imaging beam. Through the cooperation of the light-splitting element, the exposure controllers, and the image processor, the endoscope achieves double-exposure HDR in both modes, with uniform full-field brightness, high color fidelity, good resolution, and a higher signal-to-noise ratio, making procedures smoother and more efficient and extending double-exposure HDR technology to endoscopes.

Description

Three-wafer endoscope imaging method and imaging device
Technical Field
The application relates to the technical field of endoscope imaging, in particular to a three-wafer endoscope imaging method and an imaging device.
Background
Double-exposure HDR fuses a high-exposure image with a low-exposure image, so that brightness is uniform across the field of view, with neither under-exposed nor over-exposed areas. One disadvantage of double-exposure HDR is that it requires two frames, one short and one long, which lowers the effective frame rate and is prone to motion blur.
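As background, the two-frame fusion described above can be sketched as follows. The function name, the saturation threshold, and the simple clipping-mask strategy are illustrative assumptions, not the patent's method:

```python
import numpy as np

def fuse_double_exposure(short, long, ratio, sat=0.95):
    """Naive double-exposure HDR fusion (illustrative sketch).

    `short` and `long` are normalized frames in [0, 1]; `ratio` is
    long_exposure_time / short_exposure_time. Scaling the short frame
    by `ratio` puts both frames in the same radiometric units; the
    short frame then substitutes wherever the long frame clips."""
    return np.where(long >= sat, short * ratio, long)
```

This also shows why the effective frame rate drops: each fused output frame consumes one short and one long capture.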
Because an endoscope must acquire and display images in real time, double-exposure HDR cannot be applied to it directly. Yet, owing to the complexity of tissue structures in the body, especially when a scene contains both a distant view and a close view, overexposure of some portion of the picture is hard to avoid. When far and near views appear in the same scene, meeting the illumination requirement of the far view overexposes the near view, while meeting that of the near view leaves the far view too dark to display clearly.
To address this, the prior art applies different exposure strategies to images acquired by two image sensors and then fuses the two exposed images, realizing double-exposure HDR in the endoscope's white-light mode. In fluorescence mode, however, one of the two paths must be used to acquire the fluorescence signal, so double-exposure HDR cannot be realized.
Disclosure of Invention
The application provides a three-wafer endoscope imaging method and an imaging device, which can realize double exposure HDR functions in a fluorescence mode and a white light mode.
According to a first aspect, the present application provides a three-wafer endoscopic imaging device comprising:
A light source assembly having a fluorescent mode for providing illumination to an area to be observed;
an optical lens assembly for receiving imaging light from an area to be observed;
The light splitting element is used for splitting the imaging light into three in a fluorescence mode to respectively form a first imaging light, a second imaging light and a fluorescence imaging light;
the first image sensor is arranged on the optical path of the first imaging light ray and is used for receiving the first imaging light ray and performing photoelectric conversion to form a first image;
the second image sensor is arranged on the optical path of the second imaging light ray and is used for receiving the second imaging light ray and performing photoelectric conversion to form a second image;
The third image sensor is arranged on the optical path of the fluorescence imaging light ray and is used for receiving the fluorescence imaging light ray and performing photoelectric conversion to form a fluorescence image;
the first exposure controller is used for exposing the first image to obtain a first exposure image;
the second exposure controller is used for exposing the second image to obtain a second exposure image;
and an image processor for fusing the first exposure image and the second exposure image into a high-dynamic white light image, fusing that image with the fluorescence image, and outputting the result as an endoscopic image.
In an embodiment, the light source assembly further includes a white light mode, and in the white light mode, the light splitting element is configured to split the imaging light into two parts, so as to form a first imaging light and a second imaging light respectively.
In one embodiment, the light source assembly includes a white light source for providing white light illumination to the region to be observed in a white light mode and a fluorescence excitation light source for providing laser illumination to the region to be observed in a fluorescence mode.
In one embodiment, the first image sensor is a Bayer color sensor, the second image sensor is a monochrome sensor, and the third image sensor is either a Bayer color sensor or a monochrome sensor.
In one embodiment, the endoscope further comprises a display assembly electrically connected to the image processor for displaying the endoscopic image.
In one embodiment, the first exposure controller is electrically connected to the first image sensor and the image processor, respectively, and is configured to expose the first image with a first exposure policy to obtain a first exposure image; the second exposure controller is respectively and electrically connected with the second image sensor and the image processor and is used for exposing the second image with a second exposure strategy to obtain a second exposure image, and the image processor is used for fusing the first exposure image and the second exposure image into a high dynamic white light image; the third image sensor is electrically connected with the image processor and is used for receiving the fluorescence imaging light rays and performing photoelectric conversion to form a fluorescence image, and the image processor can process and fuse the high-dynamic white light image and the fluorescence image and output the high-dynamic white light image and the fluorescence image into an endoscope image.
According to a second aspect, the present application provides a three-wafer endoscopic imaging method comprising the steps of:
In a fluorescence mode, dividing imaging light of a region to be observed into three through a light splitting element to respectively form first imaging light, second imaging light and fluorescence imaging light;
Receiving a first control command, performing photoelectric conversion on first imaging light rays by using a first imaging sensor to form a first image, and performing first exposure on the first image by adopting a first exposure strategy to obtain a first exposure image;
Receiving a second control command, performing photoelectric conversion on second imaging light rays by using a second imaging sensor to form a second image, and performing second exposure on the second image by adopting a second exposure strategy to obtain a second exposure image;
Photoelectric conversion is carried out on fluorescent imaging light rays by using a third imaging sensor to form a fluorescent image;
And receiving a third control command, and fusing the first exposure image, the second exposure image and the fluorescent image by adopting an algorithm to output an endoscope image.
In one embodiment, the first exposure strategy is to split the first image into an R channel image, a G channel image and a B channel image, calculate exposure parameters of the first exposure strategy according to optimal exposure values of the R channel image and the B channel image, and expose the first image to form a first exposure image, wherein the first exposure image is an RGB channel image; the second exposure strategy is to split the second image into an R channel image, a G channel image and a B channel image, calculate exposure parameters of the second exposure strategy according to the optimal exposure value of the G channel image, and expose the second image to form a second exposure image, wherein the second exposure image is the G channel image; the method for fusing and outputting the endoscope image by adopting the algorithm to the first exposure image, the second exposure image and the fluorescent image comprises the following steps: replacing the G-channel image in the first exposure image with the second exposure image; demosaicing the replaced three channel images to form a high-dynamic white light image; and fusing and outputting the high dynamic white light image and the fluorescent image into an endoscope image.
In one embodiment, the method for calculating the exposure parameters of the first exposure strategy by using the optimal exposure values of the R-channel image and the B-channel image is as follows: dividing a first image into a plurality of areas, carrying out regional photometry on the areas at the same time, respectively obtaining optimal exposure values of the R channel image and the B channel image in a mode of optimal brightness and no overexposure, and taking the optimal exposure values of the R channel image or the B channel image as exposure parameters or taking a weighted average of the optimal exposure values of the R channel image and the B channel image as exposure parameters; the method for calculating the exposure parameters of the second exposure strategy by using the optimal exposure value of the G channel image comprises the following steps: respectively acquiring the optimal exposure value of the G channel image by one of a first photometry, a second photometry and a third photometry, and taking the optimal exposure value of the G channel image as an exposure parameter of a second exposure strategy; the first photometry is to divide the second image into a plurality of areas, and simultaneously photometry a plurality of areas, so that the brightness of each pixel of the areas of the second image is at least partially in an overexposure mode; the second photometry is to divide a second image into a central area and an edge area, and perform photometry on the central area so that the brightness of each pixel in the central area is in an optimal brightness mode, wherein the central area is an area to be observed carefully; and the third photometry is to determine the instrument area in the second image by utilizing an image recognition and segmentation algorithm, and perform photometry on the residual area excluding the instrument area, so that the brightness of each pixel of the residual area is in an optimal brightness mode.
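The zonal metering and weighted averaging described above can be sketched as follows. The function names, the 8x8 zone grid, the brightness target, the headroom limit, and the equal channel weights are all assumptions for illustration; the patent does not fix these values:

```python
import numpy as np

def zone_means(ch, n=8):
    """Mean brightness of an n x n grid of metering zones."""
    h, w = ch.shape
    ch = ch[: h - h % n, : w - w % n]   # crop so the grid divides evenly
    return ch.reshape(n, ch.shape[0] // n, n, ch.shape[1] // n).mean(axis=(1, 3))

def optimal_exposure(ch, target=0.45, headroom=0.95):
    """Exposure scale that reaches `target` mean brightness but never
    pushes the brightest zone past `headroom` (the no-overexposure rule)."""
    zm = zone_means(np.asarray(ch, dtype=np.float64))
    return min(target / zm.mean(), headroom / zm.max())

def first_strategy_exposure(r_ch, b_ch, w_r=0.5, w_b=0.5):
    """Weighted average of the R-channel and B-channel optimal exposures."""
    return w_r * optimal_exposure(r_ch) + w_b * optimal_exposure(b_ch)
```

Per the text, either single-channel optimum or this weighted average may serve as the first strategy's exposure parameter.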
In one embodiment, the method further includes splitting the imaging light of the region to be observed into two by the light splitting element in a white light mode to form a first imaging light and a second imaging light respectively;
Receiving a first control command, performing photoelectric conversion on first imaging light rays by using a first imaging sensor to form a first image, and performing first exposure on the first image by adopting a first exposure strategy to obtain a first exposure image;
Receiving a second control command, performing photoelectric conversion on second imaging light rays by using a second imaging sensor to form a second image, and performing second exposure on the second image by adopting a second exposure strategy to obtain a second exposure image;
and receiving a third control command, and fusing and outputting the endoscope image by adopting an algorithm on the first exposure image and the second exposure image.
According to the endoscope imaging device in the above embodiments, the light-splitting element and the three image sensors, each placed on the optical path of one beam after splitting, cooperate with the exposure controllers and the image processor. In white-light mode, two white-light imaging beams are formed and fused to provide the double-exposure HDR function; in fluorescence mode, the light-splitting element separates white light from fluorescence to form distinct images that are finally fused, so double-exposure HDR is achieved there as well. Imaging in both modes therefore has uniform full-field brightness, high color fidelity, good resolution, and a higher signal-to-noise ratio, making the whole procedure smoother and more efficient, and extending double-exposure HDR technology to endoscopes.
Drawings
FIG. 1 is a schematic diagram of an imaging device;
FIG. 2 is a schematic diagram of a spectroscopic element;
FIG. 3 is a flow chart of a three wafer endoscopic imaging method;
fig. 4 is a schematic view of area division during calculation of exposure parameters of the first exposure strategy.
Reference numerals: 100, light source assembly; 200, optical lens assembly; 300, light-splitting element; 400, first image sensor; 500, second image sensor; 600, third image sensor; 700, first exposure controller; 800, second exposure controller; 900, image processor; 1000, display assembly.
Detailed Description
The application will be described in further detail below with reference to the drawings by means of specific embodiments, with like elements in different embodiments sharing like reference numbers. In the following embodiments, numerous specific details are set forth to provide a better understanding of the application. However, one skilled in the art will readily recognize that some features may be omitted, or replaced by other elements, materials, or methods, in different situations. In some instances, operations related to the application are not shown or described in the specification, so as not to obscure its core; a detailed description of such operations is unnecessary for those skilled in the art, who can fully understand them from the description and their general knowledge.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments, and the operational steps involved in the embodiments may be sequentially exchanged or adjusted in a manner apparent to those skilled in the art. Accordingly, the description and drawings are merely for clarity of describing certain embodiments and are not necessarily intended to imply a required composition and/or order.
The numbering of the components itself, e.g. "first", "second", etc., is used herein merely to distinguish between the described objects and does not have any sequential or technical meaning. The term "coupled" as used herein includes both direct and indirect coupling (coupling), unless otherwise indicated.
Hereinafter, some terms in the present application will be explained. It should be noted that these explanations are for the convenience of those skilled in the art, and do not limit the scope of the present application.
1. Pixel
A pixel is the minimum unit constituting the imaging region of an image sensor.
2. Image fusion
Image fusion is an image-processing technique in which image data about the same target, acquired through multiple source channels, is processed with specific algorithms so as to extract the useful information in each channel as fully as possible, finally synthesizing a single image of higher quality (in brightness, sharpness, and color); the fused image has higher resolution than the originals.
3. Demosaicing
Demosaicing (also de-mosaicing, demosaicking, or debayering) is a digital image-processing algorithm that reconstructs a full-color image from the incomplete color samples output by a photosensitive element covered with a color filter array (CFA).
4. Color channel
The channel that holds the image color information is called a color channel. Each image has one or more color channels, and the default number of color channels in an image depends on its color mode, i.e. the color mode of an image will determine its number of color channels. For example, an RGB image has 3 channels including an R channel, a B channel, and a G channel.
5. Photometry (photometry mode)
The metering mode is the method a camera uses to measure the light reflected by the subject. Common modes include center-weighted metering, average (global) metering, and spot metering.
Center-weighted metering measures light with emphasis on a particular part of the picture; average metering averages the measurement over the whole scene; spot metering measures light in a region covering about 1%-5% of the viewfinder.
Referring to fig. 1 and 2, the present application provides a three-wafer endoscope imaging apparatus comprising a light source assembly 100, an optical lens assembly 200, a light-splitting element 300, a first image sensor 400, a second image sensor 500, a third image sensor 600, a first exposure controller 700, a second exposure controller 800, and an image processor 900. The light source assembly 100 has a white-light mode and a fluorescence mode for illuminating the region to be observed, and the optical lens assembly 200 receives imaging light from that region. In white-light mode, the light-splitting element 300 splits the imaging light in two, forming a first imaging beam and a second imaging beam; in fluorescence mode, it splits the light in three, forming a first imaging beam, a second imaging beam, and a fluorescence imaging beam. The first image sensor 400 sits on the optical path of the first imaging beam, receiving it and photoelectrically converting it into a first image; the second image sensor 500 sits on the path of the second imaging beam and converts it into a second image; the third image sensor 600 sits on the path of the fluorescence imaging beam and converts it into a fluorescence image. The first exposure controller 700 exposes the first image to obtain a first exposure image, and the second exposure controller 800 exposes the second image to obtain a second exposure image. The image processor 900 fuses the first and second exposure images into a high-dynamic white light image, fuses that image with the fluorescence image, and outputs the result as an endoscopic image.
Further, the light source assembly 100 includes a white light source for providing white light illumination to the region to be observed in a white light mode and a fluorescent excitation light source for providing laser illumination to the region to be observed in a fluorescent mode.
The light-splitting element 300 may be a beam-splitter (BS) prism or a beam-splitter plate. A beam-splitter prism is made by coating one or more thin films (beam-splitting films) on a prism surface; a beam-splitter plate, by coating such films on one surface of a glass plate. Both exploit the difference between the film's transmittance and reflectance for incident light to split the imaging light transmitted by the optical lens assembly 200.
So that image quality is unaffected when the endoscope switches between fluorescence and white-light modes, at least one beam-split path must collect the high-dynamic white light image while another collects the fluorescence image. Such designs, however, tend to increase the structural complexity of the endoscope and also stand in the way of the double-exposure HDR function.
The light-splitting element 300 of the present application is a prism that splits the beam three ways. In fluorescence mode, the imaging light is divided into a first imaging beam, a second imaging beam, and a fluorescence imaging beam, with the first image sensor 400, second image sensor 500, and third image sensor 600 placed on the three light paths respectively. The first and second image sensors cooperate to realize the double-exposure HDR function, yielding a high-dynamic white light image that is sharp and uniformly bright across the field; fusing that image with the fluorescence image lets double-exposure HDR improve image quality in fluorescence mode as well. In white-light mode, with no fluorescence imaging light to interfere, the device operates as a dual-wafer endoscopic imaging system.
Further, the first image sensor 400 is a Bayer color sensor, the second image sensor 500 is a monochrome sensor, and the third image sensor 600 is either a Bayer color sensor or a monochrome sensor. To ensure that the high-dynamic white light image achieves HDR, the pixel sizes of the first image sensor 400 and the second image sensor 500 are identical, and the two are aligned to sub-pixel accuracy during installation. When the third image sensor 600 is also a monochrome sensor, its pixel size matches that of the first and second image sensors, and it too is aligned to sub-pixel accuracy.
In another embodiment, the first image sensor 400 and the second image sensor 500 are each Bayer (Bayer) color sensors, and the third image sensor 600 is one of a Bayer color sensor or a monochrome sensor.
It will be appreciated that the first imaging beam, the second imaging beam, and the fluorescence imaging beam all carry the same information, identical to that of the imaging light transmitted from the optical lens assembly 200, including color information such as R (red), G (green), and B (blue). The sum of their intensities is equal, or approximately equal, to the intensity of the imaging light transmitted by the optical lens assembly 200.
More specifically, the first and second imaging beams are both visible light, and the fluorescence imaging beam is fluorescent light; the images of the first and second beams are finally fused into the high-dynamic white light image, while the fluorescent light is imaged as the fluorescence image.
In one embodiment, the display assembly 1000 is further included, and the display assembly 1000 is electrically connected to the image processor 900 for displaying the endoscopic image. The display assembly 1000 includes a display screen capable of displaying endoscopic images in real time for viewing and manipulation by medical personnel.
Referring to fig. 3, the present application also provides an imaging method using the three-wafer endoscopic imaging device, comprising the steps of:
s101: the imaging light is split into three by the spectroscopic element 300.
Specifically, in the fluorescence mode, the imaging light of the region to be observed is split into three by the light splitting element 300, so as to form a first imaging light, a second imaging light and a fluorescence imaging light respectively.
S102: and receiving a first control command, and exposing the first imaging light to form a first exposure image.
Specifically, a first control command is received, a first imaging sensor is used for performing photoelectric conversion on first imaging light to form a first image, and a first exposure strategy is adopted for performing first exposure on the first image to obtain a first exposure image.
S103: and receiving a second control command, and exposing the second imaging light to form a second exposure image.
Receiving a second control command, performing photoelectric conversion on second imaging light rays by using a second imaging sensor to form a second image, and performing second exposure on the second image by adopting a second exposure strategy to obtain a second exposure image;
S104: and performing photoelectric conversion on the fluorescence imaging light rays by using a third imaging sensor to form a fluorescence image.
S105: and receiving a third control command, and fusing and outputting the endoscope image by adopting an algorithm on the first exposure image, the second exposure image and the fluorescent image.
The control command may be automatically generated by a control module of the endoscope, or may be manually input.
Because the G channel of a Bayer-format color sensor reaches saturation easily, keying exposure to G tends to leave the R and B channels poorly exposed. The color sensor therefore uses the R or B channel as the standard for controlling exposure; that is, the exposure parameter at which the R-channel or B-channel image reaches optimal brightness becomes the exposure parameter of the first exposure strategy, which expands the data range and signal-to-noise ratio of the R and B channels. For example, on a 10-bit Bayer color sensor (maximum value 1023), R/B values typically fall in the range 0-600; after keying exposure to R or B, their usable range can be greatly increased, to roughly 100-600.
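The range arithmetic behind this paragraph, using the text's own illustrative figures (1023 full scale; R/B peaking near 600 when exposure is keyed to G), can be laid out as:

```python
FULL_SCALE = (1 << 10) - 1   # 10-bit sensor: maximum count is 1023
rb_peak_g_keyed = 600        # typical R/B ceiling when exposure is keyed to G

# Keying auto-exposure to R or B instead leaves roughly this much extra
# gain available before R/B themselves approach full scale:
extra_gain = FULL_SCALE / rb_peak_g_keyed
print(f"{extra_gain:.2f}x")  # about 1.7x more usable R/B range
```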
The application creatively exposes three sensors separately. Two are used to image and expose the high-dynamic white light image: one, a Bayer color sensor, keys its exposure to the R or B channel; the other, a monochrome sensor, senses the G channel. This both expands the value range and signal-to-noise ratio of the R and B channels and improves the resolution and signal-to-noise ratio of the G channel (G being a full-resolution image), finally yielding more accurate color reproduction, better resolution, and a higher signal-to-noise ratio, and so improving the quality of the high-dynamic white light image. The third image sensor 600 then senses the fluorescence; the resulting fluorescence image is fused with the high-dynamic white light image and output as the endoscopic image, so that double-exposure HDR is achieved in fluorescence mode as well. The endoscope thus realizes double-exposure HDR in both white-light and fluorescence modes.
The fusion of the fluorescent image and the high dynamic white light image can adopt a pixel-level fusion method, which is the prior art and is not described herein.
In one embodiment, the first exposure strategy splits the first image into an R-channel image, a G-channel image and a B-channel image, calculates the exposure parameters of the first exposure strategy from the optimal exposure values of the R-channel and B-channel images, and exposes the first image to form a first exposure image, the first exposure image being an RGB-channel image. The second exposure strategy splits the second image into an R-channel image, a G-channel image and a B-channel image, calculates the exposure parameters of the second exposure strategy from the optimal exposure value of the G-channel image, and exposes the second image to form a second exposure image, the second exposure image being a G-channel image.
Specifically, the optimal exposure value is determined by subjective evaluation: the image quality after exposure is assessed by the human eye, and the exposure parameter corresponding to the best image quality is taken as the optimal exposure value.
In one embodiment, the algorithm for fusing the first exposure image, the second exposure image and the fluorescence image into the output endoscope image is as follows: replace the G-channel image in the first exposure image with the second exposure image; demosaic the three replaced channel images to form a high-dynamic white-light image; and fuse the high-dynamic white-light image with the fluorescence image and output the result as the endoscope image.
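The fusion steps above can be sketched in simplified form. This assumes the first exposure has already been demosaiced to an RGB array (the patent replaces G in the mosaic before demosaicing); the function names, the 10-bit clipping range, and the simple additive green-tint fluorescence overlay are illustrative assumptions, since the patent defers to existing pixel-level fusion methods.

```python
import numpy as np

def fuse_high_dynamic(first_rgb, second_g, gain_r):
    """Replace the G channel of the Bayer-derived image with the
    full-resolution monochrome exposure, scaled by the gain-compensation
    coefficient, to form the high-dynamic white-light image."""
    hdr = first_rgb.astype(np.float32).copy()
    hdr[..., 1] = np.clip(second_g.astype(np.float32) * gain_r, 0, 1023)
    return hdr

def overlay_fluorescence(white, fluor, alpha=0.4):
    """Toy pixel-level fusion: add a weighted fluorescence signal into the
    green channel of the white-light image (illustrative only)."""
    out = white.copy()
    out[..., 1] = np.clip(out[..., 1] + alpha * fluor, 0, 1023)
    return out
```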
When the first exposure image and the second exposure image are fused into the high-dynamic white-light image, gain compensation must be applied to the data of the three replaced channel images. The compensation coefficient r of the gain compensation is calculated as:
r = (e_rb* ag_rb) / (e_g * ag_g),
where e denotes an exposure parameter and ag an analog gain parameter: e_rb is the exposure parameter of the first exposure strategy, ag_rb is the analog gain parameter used during the first exposure strategy, e_g is the exposure parameter of the second exposure strategy, and ag_g is the analog gain parameter of the second exposure strategy.
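The coefficient is a direct ratio of the two exposure/gain products, so it reduces to a one-line helper (the function name is ours):

```python
def gain_compensation(e_rb, ag_rb, e_g, ag_g):
    """Compensation coefficient r = (e_rb * ag_rb) / (e_g * ag_g),
    applied to the substituted G-channel data so that both exposures
    sit on a common brightness scale."""
    return (e_rb * ag_rb) / (e_g * ag_g)
```

For example, if the first strategy used twice the exposure time and twice the analog gain of the second, the G data would be multiplied by r = 4 before the channels are demosaiced together.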
Specifically, the analog gain parameter corresponds to the voltage required to display the highest brightness; under different lighting conditions, the image brightness can be adjusted by adjusting the analog gain.
In one embodiment, the method for calculating the exposure parameters of the first exposure strategy from the optimal exposure values of the R-channel and B-channel images is: divide the first image into a plurality of areas and meter the areas zone by zone at the same time; obtain the optimal exposure values of the R-channel image and the B-channel image in an optimal-brightness, no-overexposure mode; and take the optimal exposure value of the R-channel image or the B-channel image, or a weighted average of the two optimal exposure values, as the exposure parameter.
Referring to fig. 4, the region to be observed is divided into 15 sub-regions, numbered 0-14, all of which participate in light metering, so that the brightness of every pixel across sub-regions 0-14 is in a globally optimal-brightness, no-overexposure mode. The exposure parameters in this mode are used as the exposure parameters of the first exposure strategy, and the first image is exposed to obtain the first exposure image. This metering mode is also called the global metering mode, and the globally optimal-brightness mode corresponds to a balanced histogram distribution over the whole first exposure image. Histogram equalization effectively enhances local contrast without affecting overall contrast and expands the commonly used brightness range; it matches the sensory perception of the human eye and effectively improves the user experience.
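The zoned global metering described above can be sketched as follows. The 3x5 zone grid (matching the 15 sub-regions of fig. 4), the target level, and the saturation threshold are all illustrative assumptions; the patent does not specify these numbers.

```python
import numpy as np

def zone_means(img, rows=3, cols=5):
    """Split the image into rows*cols metering zones (15 sub-regions as in
    fig. 4) and return the mean brightness of each zone."""
    h, w = img.shape
    zones = []
    for r in range(rows):
        for c in range(cols):
            zones.append(img[r * h // rows:(r + 1) * h // rows,
                             c * w // cols:(c + 1) * w // cols].mean())
    return np.array(zones)

def global_optimal_exposure(img, target=400.0, sat=1000.0):
    """Exposure scale driving the global mean toward the target, capped so
    the brightest zone stays below saturation -- i.e. 'optimal brightness
    and no overexposure'."""
    zones = zone_means(img)
    scale = target / max(zones.mean(), 1e-6)
    return min(scale, sat / max(zones.max(), 1e-6))
```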
In one embodiment, the method for calculating the exposure parameter of the second exposure strategy from the optimal exposure value of the G-channel image is: acquire the optimal exposure value of the G-channel image by one of a first photometry, a second photometry and a third photometry, and take that optimal exposure value as the exposure parameter of the second exposure strategy.
In one embodiment, the first photometry divides the second image into a plurality of areas and meters them simultaneously, allowing the brightness of the pixels of the second image to be at least partially overexposed. For example, as shown in fig. 4, the image is divided into 15 areas and sub-regions 0-14 all participate in metering, so that the brightness of each pixel is balanced and optimal while a certain proportion of the image area (e.g. 5-10%) is allowed to overexpose, ensuring that no dark areas remain in the image.
In one embodiment, the second photometry divides the second image into a central area and an edge area and meters the central area, so that the brightness of each pixel in the central area is at optimal brightness; the central area is the area to be observed carefully. The second photometry is also called the center-metering mode: it treats the central area as the area to be observed and guarantees the image brightness of that area during metering. As shown in fig. 4, the second image is divided into 15 sub-regions according to the first image-division method, with sub-region 0 as the area to be observed; the parameters that guarantee the brightness of area 0 are taken as the exposure parameters of the second exposure strategy, finally yielding a second exposure image whose center is at optimal brightness.
In one embodiment, the third photometry determines the instrument area in the second image using an image recognition and segmentation algorithm, and meters the remaining area excluding the instrument area, so that the brightness of each pixel in the remaining area is at optimal brightness.
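The three photometry modes for the G channel can be sketched side by side. This is a simplified illustration under assumed numbers: the target level, the saturation threshold, the 7.5% overexposure tolerance (mid-point of the 5-10% range mentioned above), the fixed central crop, and the function names are all ours; the patent does not specify a particular metering formula or segmentation algorithm.

```python
import numpy as np

def meter_overexposure_tolerant(img, target=400.0, sat=1000.0, tol=0.075):
    """First photometry: global metering that tolerates a small overexposed
    fraction (~5-10% of the image) so that no dark areas remain."""
    scale = target / max(np.mean(img), 1e-6)
    while np.mean(img * scale >= sat) > tol:
        scale *= 0.95  # back off until the overexposed fraction is tolerable
    return scale

def meter_center(img, target=400.0):
    """Second photometry: meter only a central region (the area to be
    observed carefully), ignoring the edges."""
    h, w = img.shape
    center = img[h // 4:3 * h // 4, w // 4:3 * w // 4]
    return target / max(center.mean(), 1e-6)

def meter_excluding_instrument(img, instrument_mask, target=400.0):
    """Third photometry: exclude the (segmented) instrument region and
    meter only the remaining tissue area."""
    rest = img[~instrument_mask]
    return target / max(rest.mean(), 1e-6)
```

In each case the returned scale plays the role of the exposure parameter of the second exposure strategy.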
In one embodiment, the method further includes, in the white-light mode, splitting the imaging light of the region to be observed into two by the light splitting element 300 to form a first imaging light and a second imaging light; receiving a first control command, performing photoelectric conversion on the first imaging light with the first imaging sensor to form a first image, and performing a first exposure on the first image with the first exposure strategy to obtain a first exposure image; receiving a second control command, performing photoelectric conversion on the second imaging light with the second imaging sensor to form a second image, and performing a second exposure on the second image with the second exposure strategy to obtain a second exposure image; and receiving a third control command, and fusing the first exposure image and the second exposure image with an algorithm to output the endoscope image.
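The command-driven white-light flow above reduces to a simple orchestration, shown here as a control-flow sketch; `expose1`, `expose2` and `fuse` are placeholders for the patent's first/second exposure strategies and fusion algorithm, not real APIs.

```python
def white_light_pipeline(first_raw, second_raw, expose1, expose2, fuse):
    """White-light mode: two sensors, two exposure strategies, one fused
    output, each step triggered by its control command."""
    first_exposure = expose1(first_raw)    # first control command
    second_exposure = expose2(second_raw)  # second control command
    return fuse(first_exposure, second_exposure)  # third control command
```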
The foregoing description of the application has been presented for purposes of illustration and description, and is not intended to be limiting. Several simple deductions, modifications or substitutions may also be made by a person skilled in the art to which the application pertains, based on the idea of the application.

Claims (10)

1. A three-wafer endoscopic imaging device, comprising:
A light source assembly having a fluorescent mode for providing illumination to an area to be observed;
An optical lens assembly for receiving imaging light from the region to be observed;
The light splitting element is used for splitting the imaging light into three in the fluorescence mode to form a first imaging light, a second imaging light and a fluorescence imaging light respectively;
the first image sensor is arranged on the optical path of the first imaging light ray and is used for receiving the first imaging light ray and performing photoelectric conversion to form a first image;
the second image sensor is arranged on the optical path of the second imaging light ray and is used for receiving the second imaging light ray and performing photoelectric conversion to form a second image;
The third image sensor is arranged on the optical path of the fluorescence imaging light ray and is used for receiving the fluorescence imaging light ray and performing photoelectric conversion to form a fluorescence image;
the first exposure controller is used for exposing the first image to obtain a first exposure image;
the second exposure controller is used for exposing the second image to obtain a second exposure image;
And an image processor for fusing the first exposure image and the second exposure image into a high dynamic white light image, and for processing and fusing the high dynamic white light image with the fluorescence image and outputting the result as an endoscope image.
2. The three-wafer endoscopic imaging device of claim 1, wherein the light source assembly further comprises a white light mode in which the light splitting element is configured to split the imaging light into two, forming a first imaging light and a second imaging light, respectively.
3. The three-wafer endoscopic imaging device of claim 2, wherein the light source assembly comprises a white light source for providing white light illumination to the region to be observed and a fluorescence excitation light source for providing laser illumination to the region to be observed in a white light mode.
4. The three-wafer endoscopic imaging device of claim 3, wherein the first image sensor is a bayer color sensor, the second image sensor is a monochrome sensor, and the third image sensor is one of a bayer color sensor and a monochrome sensor;
Or, the first image sensor and the second image sensor are both bayer color sensors, and the third image sensor is one of bayer color sensors and monochrome sensors.
5. The three-wafer endoscopic imaging device of claim 1, further comprising a display assembly electrically connected to the image processor for displaying the endoscopic image.
6. The three-wafer endoscopic imaging device of claim 1, wherein the first exposure controller is electrically connected to the first image sensor, the image processor, respectively, for exposing the first image with a first exposure strategy to obtain a first exposure image; the second exposure controller is respectively and electrically connected with the second image sensor and the image processor and is used for exposing the second image with a second exposure strategy to obtain a second exposure image, and the image processor is used for processing and fusing the first exposure image and the second exposure image into the high dynamic white light image;
The third image sensor is electrically connected with the image processor and is used for receiving the fluorescence imaging light rays and performing photoelectric conversion to form the fluorescence image, and the image processor can process and fuse the high-dynamic white light image and the fluorescence image and output the high-dynamic white light image and the fluorescence image as the endoscope image.
7. A three-wafer endoscopic imaging method, characterized in that the imaging method is applied to the imaging device according to any one of claims 1 to 6, comprising the steps of:
In a fluorescence mode, dividing imaging light of a region to be observed into three through a light splitting element to respectively form first imaging light, second imaging light and fluorescence imaging light;
Receiving a first control command, performing photoelectric conversion on the first imaging light by using a first imaging sensor to form a first image, and performing first exposure on the first image by adopting a first exposure strategy to obtain a first exposure image;
Receiving a second control command, performing photoelectric conversion on the second imaging light by using a second imaging sensor to form a second image, and performing second exposure on the second image by adopting a second exposure strategy to obtain a second exposure image;
Photoelectric conversion is carried out on the fluorescence imaging light rays by using a third imaging sensor to form a fluorescence image;
And receiving a third control command, and fusing the first exposure image, the second exposure image and the fluorescent image by adopting an algorithm to output an endoscope image.
8. The three-wafer endoscopic imaging method of claim 7, wherein the first exposure strategy is to split the first image into an R-channel image, a G-channel image, and a B-channel image, calculate exposure parameters of the first exposure strategy with optimal exposure values of the R-channel image and the B-channel image, and expose the first image to form a first exposure image, the first exposure image being an RGB-channel image; the second exposure strategy is to split the second image into an R channel image, a G channel image and a B channel image, calculate exposure parameters of the second exposure strategy according to the optimal exposure value of the G channel image, and expose the second image to form a second exposure image, wherein the second exposure image is the G channel image;
The method for fusing and outputting the endoscope image by adopting the algorithm to the first exposure image, the second exposure image and the fluorescent image comprises the following steps: replacing the G-channel image in the first exposure image with the second exposure image; demosaicing the replaced three channel images to form a high-dynamic white light image; and fusing and outputting the high dynamic white light image and the fluorescent image into an endoscope image.
9. The three-wafer endoscopic imaging method of claim 8, wherein the method of calculating the exposure parameters of the first exposure strategy with the optimal exposure values of the R-channel image and the B-channel image is: dividing the first image into a plurality of areas, carrying out regional photometry on the areas at the same time, respectively obtaining optimal exposure values of the R channel image and the B channel image in a mode of optimal brightness and no overexposure, and taking the optimal exposure values of the R channel image or the B channel image as exposure parameters or taking a weighted average value of the optimal exposure values of the R channel image and the B channel image as exposure parameters;
The method for calculating the exposure parameters of the second exposure strategy by using the optimal exposure value of the G channel image comprises the following steps: respectively acquiring the optimal exposure value of the G channel image by one of a first photometry, a second photometry and a third photometry, and taking the optimal exposure value of the G channel image as an exposure parameter of a second exposure strategy;
the first photometry is to divide the second image into a plurality of areas, and simultaneously photometry a plurality of areas, so that the brightness of each pixel of the areas of the second image is at least partially in an overexposure mode;
The second photometry is to divide a second image into a central area and an edge area, and perform photometry on the central area so that the brightness of each pixel in the central area is in an optimal brightness mode, wherein the central area is an area to be observed carefully;
And the third photometry is to determine the instrument area in the second image by utilizing an image recognition and segmentation algorithm, and perform photometry on the residual area excluding the instrument area, so that the brightness of each pixel of the residual area is in an optimal brightness mode.
10. The three-wafer endoscopic imaging method as defined in claim 7, further comprising dividing imaging light of the region to be observed into two by a light splitting element in a white light mode to form a first imaging light and a second imaging light, respectively;
receiving a first control command, performing photoelectric conversion on the first imaging light by using a first imaging sensor to form a first image, and performing first exposure on the first image by adopting a first exposure strategy to obtain a first exposure image;
Receiving a second control command, performing photoelectric conversion on the second imaging light by using a second imaging sensor to form a second image, and performing second exposure on the second image by adopting a second exposure strategy to obtain a second exposure image;
and receiving a third control command, and fusing and outputting the endoscope image by adopting an algorithm on the first exposure image and the second exposure image.
CN202410659521.1A 2024-05-27 2024-05-27 Three-wafer endoscope imaging method and imaging device Pending CN118250542A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410659521.1A CN118250542A (en) 2024-05-27 2024-05-27 Three-wafer endoscope imaging method and imaging device


Publications (1)

Publication Number Publication Date
CN118250542A true CN118250542A (en) 2024-06-25

Family

ID=91562764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410659521.1A Pending CN118250542A (en) 2024-05-27 2024-05-27 Three-wafer endoscope imaging method and imaging device

Country Status (1)

Country Link
CN (1) CN118250542A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1874499A (en) * 2006-05-12 2006-12-06 北京理工大学 High dynamic equipment for reconstructing image in high resolution
US20180234603A1 (en) * 2017-02-10 2018-08-16 Novadaq Technologies Inc. Open-field handheld fluorescence imaging systems and methods
CN109744986A (en) * 2019-01-31 2019-05-14 广东欧谱曼迪科技有限公司 A kind of exposure feedback-type fluorescence navigation endoscopic system and image procossing self-regulating method
CN114468950A (en) * 2021-12-30 2022-05-13 浙江大学 Mixed illumination autofluorescence laparoscope
US20230121217A1 (en) * 2021-10-18 2023-04-20 Arthrex, Inc. Surgical camera system with high dynamic range and fluorescent imaging
CN116471466A (en) * 2023-06-12 2023-07-21 深圳市博盛医疗科技有限公司 Dual-wafer endoscope imaging method and imaging device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination