CN114795080B - Endoscope exposure control method and endoscope - Google Patents

Endoscope exposure control method and endoscope

Info

Publication number
CN114795080B
CN114795080B (application CN202210422525.9A)
Authority
CN
China
Prior art keywords
value
exposure
brightness
calibration
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210422525.9A
Other languages
Chinese (zh)
Other versions
CN114795080A (en)
Inventor
杨戴天杙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ankon Technologies Co Ltd
Original Assignee
Ankon Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ankon Technologies Co Ltd filed Critical Ankon Technologies Co Ltd
Priority to CN202210422525.9A priority Critical patent/CN114795080B/en
Publication of CN114795080A publication Critical patent/CN114795080A/en
Priority to PCT/CN2023/089558 priority patent/WO2023202675A1/en
Application granted granted Critical
Publication of CN114795080B publication Critical patent/CN114795080B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163Optical arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045Control thereof
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/06Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B1/0661Endoscope light sources
    • A61B1/0684Endoscope light sources using light emitting diodes [LED]

Abstract

The invention discloses an endoscope exposure control method and an endoscope, wherein the method comprises the following steps: driving n light sources to illuminate with n preset exposure values respectively, and acquiring a detection image; dividing the detection image into m regions, and calculating m detection brightness values corresponding to the regions; calculating m standard brightness values according to the preset illumination radiation degree of each light source on each region and the preset exposure values; calibrating the standard brightness values with the detected brightness values, and correspondingly generating m calibration brightness equations; approximating the calibration brightness equations to a preset target brightness value, and updating and storing the calibration exposure values; and driving the n light sources to illuminate correspondingly according to the respective calibration exposure values, and acquiring a calibration image. By converting the brightness values into equations whose variables are the exposure values, the endoscope exposure control method provided by the invention realizes anisotropic exposure control and avoids the "yin-yang image" phenomenon and unclear shooting.

Description

Endoscope exposure control method and endoscope
Technical Field
The present invention relates to the field of image processing, and in particular, to an endoscope exposure control method and an endoscope.
Background
In the prior art, an imaging device and a matching illumination device are usually arranged at the end of an endoscope, and the endoscope is placed inside the patient so that it can move along with the digestive system during detection. In this process the endoscope is squeezed by the inner wall of the digestive tract, and the imaging device and the illumination device inevitably abut against the inner wall, so that the captured images suffer from the "yin-yang image" phenomenon (one part underexposed and the other part overexposed) and fail to show the pathological condition clearly. At the same time, because the light sources in the illumination device are divergent, a single light source affects the brightness data of the whole image. The problem to be solved is therefore how to avoid uneven exposure and unclear imaging of lesions while taking the divergence characteristics of the light sources into account.
Disclosure of Invention
An object of the present invention is to provide an endoscope exposure control method that solves the prior-art problems of unevenly exposed endoscope images and unclear imaging of the detected lesions.
Another object of the present invention is to provide an endoscope.
In order to achieve one of the above objects, an embodiment of the present invention provides an endoscope exposure control method comprising: driving n light sources to illuminate with n preset exposure values respectively, and acquiring a detection image, where n ≥ 2; dividing the detection image into m regions and calculating m detection brightness values corresponding to the regions, where m ≥ 2; calculating m standard brightness values according to the preset illumination radiation degree of each light source on each region and the preset exposure values; calibrating the standard brightness values with the detected brightness values to correspondingly generate m calibration brightness equations, each containing a calibration exposure value to be determined; approximating the calibration brightness equations to a preset target brightness value, and updating and storing the calibration exposure values; and driving the n light sources to illuminate correspondingly according to the respective calibration exposure values, and acquiring a calibration image.
As a further improvement of an embodiment of the present invention, the preset exposure values are configured to vary linearly and satisfy $e_j = k_j \times e_0$, where $j = 1, 2, \ldots, n$, $e_j$ is the preset exposure value of the jth light source, $k_j$ is the line correlation coefficient of the jth light source, and $e_0$ is the exposure precision value.
As a further improvement of an embodiment of the present invention, the method further includes: driving one of the n light sources to illuminate with the exposure precision value as the preset exposure value while keeping the other (n-1) light sources in an off state; acquiring L area calibration images at L preset distance values, where L ≥ 2; dividing each of the L area calibration images into m regions, analyzing the brightness value of each region, and calculating the mean of the brightness contribution values of the light source to each region over the L distance values, to obtain m brightness contribution means; and iteratively driving the other (n-1) light sources to illuminate independently with the exposure precision value as the preset exposure value, so that n × m brightness contribution means are calculated. The method then specifically includes: generating m standard brightness values according to the brightness contribution means and the preset exposure values.
As a further improvement of an embodiment of the present invention, the brightness contribution mean satisfies $\bar{w}_{ji} = \frac{1}{L}\sum_{l=1}^{L} W_{ji}(Z(l) \mid e_0)$, where $i = 1, 2, \ldots, m$, $j = 1, 2, \ldots, n$, $l = 1, 2, \ldots, L$, $\bar{w}_{ji}$ is the brightness contribution mean of the jth light source to the ith region, $Z(l)$ is the lth distance value, and $W_{ji}(Z(l) \mid e_0)$ is the brightness contribution value of the jth light source to the ith region at the distance $Z(l)$.
As a further improvement of an embodiment of the present invention, the method specifically includes: calculating the m standard brightness values according to the brightness contribution means and the line correlation coefficients corresponding to the preset exposure values, the standard brightness value satisfying $f_{iT} = \sum_{j=1}^{n} k_j \bar{w}_{ji}$, where $f_{iT}$ is the standard brightness value; and calculating a calibration parameter from the quotient of the detected brightness value and the standard brightness value and calibrating the standard brightness value with the calibration parameter, thereby correspondingly generating the m calibration brightness equations. The calibration parameter satisfies $h_i = f_i / f_{iT}$, where $h_i$ is the calibration parameter and $f_i$ is the detected brightness value; the calibration brightness equation is $h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji}$, where $k'_j$ is the calibration correlation coefficient corresponding to the calibration exposure value.
As a further improvement of an embodiment of the present invention, the method specifically includes: approximating the calibration brightness equation to the preset target brightness value with an optimization method, the optimization method being configured to perform calibration according to $\min \sum_{i=1}^{m}\left(h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji} - f_{i0}\right)^2$, where $f_{i0}$ is the target brightness value; and updating the calibration correlation coefficients and calculating the calibration exposure values from the updated calibration correlation coefficients and the exposure precision value.
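For readers who prefer to see the above relations together, the summary can be read as one linear least-squares problem in the calibration correlation coefficients. The following is a minimal sketch, assuming the reconstructed formulas $f_{iT} = \sum_j k_j \bar{w}_{ji}$, $h_i = f_i / f_{iT}$ and the squared-error objective above; the function name, the array layout and the non-negativity clip are illustrative assumptions rather than part of the patent.

```python
import numpy as np

def solve_calibration(w_mean, k_preset, f_det, f_target):
    """w_mean: (n, m) brightness contribution means of light source j to region i at e0.
    k_preset: (n,) line correlation coefficients of the preset exposure values.
    f_det: (m,) detected brightness values of the m regions.
    f_target: scalar target brightness value f_i0.
    Returns the calibration correlation coefficients k'_j as an (n,) array."""
    f_std = w_mean.T @ k_preset              # standard brightness f_iT = sum_j k_j * w_ji
    h = f_det / f_std                        # calibration parameters h_i = f_i / f_iT
    A = h[:, None] * w_mean.T                # row i of A holds h_i * w_ji; unknowns are k'_j
    b = np.full(A.shape[0], float(f_target))
    k_cal, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(k_cal, 0.0, None)         # exposure multiples cannot be negative

# The calibration exposure value of each light source then follows as e'_j = k'_j * e0.
```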
As a further improvement of an embodiment of the present invention, the method further includes: driving one of the n light sources to illuminate with the exposure precision value as the preset exposure value while keeping the other (n-1) light sources in an off state; acquiring L area calibration images at L preset distance values, where L ≥ 2; dividing each of the L area calibration images into m regions, analyzing the brightness value of each region, and calculating a weighted average of the brightness contribution values of the light source to each region over the L distance values according to preset weights corresponding to the L distances, to obtain m brightness contribution weighted values; and iteratively driving the other (n-1) light sources to illuminate independently with the exposure precision value as the preset exposure value, so that n × m brightness contribution weighted values are calculated. The method then specifically includes: generating m standard brightness values according to the brightness contribution weighted values and the preset exposure values.
As a further improvement of an embodiment of the present invention, the brightness contribution weighted value satisfies $\tilde{w}_{ji} = \sum_{l=1}^{L} \alpha_l W_{ji}(Z(l) \mid e_0)$, where $i = 1, 2, \ldots, m$, $j = 1, 2, \ldots, n$, $l = 1, 2, \ldots, L$, $\tilde{w}_{ji}$ is the brightness contribution weighted value of the jth light source to the ith region, $Z(l)$ is the lth distance value, $W_{ji}(Z(l) \mid e_0)$ is the brightness contribution value of the jth light source to the ith region at the distance $Z(l)$, and $\alpha_l$ is the preset weight corresponding to the lth distance, satisfying $\sum_{l=1}^{L} \alpha_l = 1$.
As a further improvement of an embodiment of the present invention, the number m of regions is configured as 9, and the area of the region located at the center of the detection image is larger than the areas of the other regions in the detection image; the number n of light sources is configured as 4; the preset exposure value and the calibration exposure value are configured as the product of the exposure time and the exposure gain of the endoscope. The method further comprises: adjusting at least one of the exposure time and the exposure gain, driving the n light sources to illuminate correspondingly according to the respective calibration exposure values, and acquiring a calibration image.
In order to achieve one of the above objects, an embodiment of the present invention provides an endoscope on which the endoscope exposure control method according to any one of the above aspects is deployed to perform exposure control.
Compared with the prior art, the endoscope exposure control method provided by the invention divides the detection image, acquired in a single shooting step, into a plurality of regions, calculates standard brightness values from the illumination radiation degrees of the plurality of light sources on the plurality of regions, and then approximates the calibrated standard brightness values to the target brightness value, so that calibration exposure values are obtained and applied to the endoscope. In this way, the brightness values in the detection image are converted into equations whose variables are the light-source exposure values, exposure values conforming to the target brightness value are calculated, and the exposure value calculated for each light source can differ from the others. Anisotropic exposure control is thereby realized, and problems such as the "yin-yang image" phenomenon and unclear imaging of lesions are prevented while the illumination radiation intensities of different light sources on different regions are taken into account.
Drawings
Fig. 1 is a schematic view of an endoscope according to an embodiment of the present invention.
Fig. 2 is a schematic structural view of an endoscope according to an embodiment of the present invention.
Fig. 3 is a schematic view of trigger level waveforms of a part of components in an endoscope according to an embodiment of the present invention.
Fig. 4 is a schematic diagram showing steps of an endoscope exposure control method according to an embodiment of the present invention.
Fig. 5 is a schematic view of a structure of a detection image generated when an endoscope exposure control method is performed in an embodiment of the present invention.
Fig. 6 is a schematic view showing a part of steps of an endoscope exposure control method according to another embodiment of the present invention.
Fig. 7 is a schematic view showing another part of steps of an endoscope exposure control method according to another embodiment of the present invention.
Fig. 8 is a schematic view of an endoscope in a case where an endoscope exposure control method is performed in another embodiment of the present invention.
Fig. 9 is a schematic view showing part of steps of a first example of an endoscope exposure control method in another embodiment of the present invention.
Fig. 10 is a partial step diagram showing a specific example of a first embodiment of an endoscope exposure control method in another embodiment of the present invention.
Fig. 11 is a schematic view showing a part of steps of an endoscope exposure control method according to still another embodiment of the present invention.
Fig. 12 is a schematic view showing another part of steps of an endoscope exposure control method according to still another embodiment of the present invention.
Fig. 13 is a schematic diagram showing steps of an endoscope exposure control method according to still another embodiment of the present invention.
Detailed Description
The present invention will be described in detail below with reference to specific embodiments shown in the drawings. These embodiments are not intended to limit the invention and structural, methodological, or functional modifications of these embodiments that may be made by one of ordinary skill in the art are included within the scope of the invention.
It should be noted that the term "comprises," "comprising," or any other variation thereof is intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Furthermore, the terms "first," "second," "third," "fourth," "fifth," "sixth," "seventh," "eighth," "ninth," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Devices for detecting a lesion condition in a human body and outputting detection data such as image data are generally configured as endoscopes, in particular capsule endoscope devices, which have a special shape and are easily swallowed by a patient. Because the digestive system shields the endoscope from ambient light, a light source must be arranged in the endoscope to obtain a clearly visible detection image; preferably 4 to 6 LED lamps provide sufficient illumination for the image sensor. However, if the plurality of light sources are driven at the same illumination level, phenomena such as the "yin-yang image" may occur, and the uneven exposure of the detection image becomes even more noticeable when the endoscope moves through the narrow intestinal tract or approaches any inner wall of the digestive system. Therefore, to solve the above problems, different light sources need to be driven to different degrees so as to form an anisotropic exposure effect, and this driving needs to be adapted dynamically to the current detection image or to the position state of the endoscope.
An embodiment of the present invention provides an endoscope 10 on which the endoscope exposure control method is deployed to perform exposure control. As shown in fig. 1, the endoscope includes a cover 11, a housing 12, an image pickup section 111, and at least two light sources 112. The cover 11 is preferably made of a light-transmitting material, so that the image pickup section 111 can emit and receive the detection signals needed to form a detection image, and so that the light emitted by the light sources 112 can pass through it and provide illumination for the detection signals of the image pickup section 111. Specifically, the cover 11 may be made of a smooth transparent material. The cover 11 is disposed at at least one end of the housing 12; the cover 11 and the housing 12 are assembled to form a cavity, and the image pickup section 111 and the light sources 112 are disposed on the side of the cavity close to the cover 11. Preferably, the covers 11 are configured as hemispherical shells provided at both ends of the housing 12, and the housing 12 is configured as a cylinder, so that the endoscope has an easy-to-swallow capsule shape. The detection signal and the emitted light are preferably configured to be at least substantially parallel to each other and directed outward away from the housing 12, so that each light beam has a definite correspondence with a portion of the detection image.
A main control part 13 may further be disposed in the cavity. The main control part 13 is connected to the image pickup section 111 and to the light sources 112 and is configured to execute the endoscope exposure control method, so that the exposure degrees of the different light sources 112 are adaptively adjusted when the endoscope approaches the inner wall 100 of the alimentary canal; for example, the exposure of the light sources 112 on the side close to the inner wall 100 is reduced and/or the exposure of the light sources 112 on the side away from the inner wall 100 is increased.
As shown in fig. 1 and 2, the endoscope may further include components such as a main control module 131 in addition to the image pickup section 111 and the light sources 112. The relevant components of the main control module 131 may be regarded as belonging to the main control part 13, or may be provided independently of the main control part 13 for executing the endoscope exposure control method. In the latter embodiment, the main control part 13 may be used to realize functions other than exposure control, such as image data analysis and quantization parameter calculation.
The main control module 131 includes an image triggering module 132 and an illumination triggering module 133, and after the endoscope exposure control method is executed, the main control module 131 can selectively control the image capturing section 111 and the light source 112 to trigger through the image triggering module 132 and the illumination triggering module 133, respectively. Preferably, as shown in fig. 1, 2 and 3, in one embodiment, the light source 112 specifically includes a first light source 1121, a second light source 1122, a third light source 1123 and a fourth light source 1124, the triggering period of the image capturing section 111 may be configured to be longer than the triggering period of any one of the light sources 112, and the triggering start time of the image capturing section 111 may be configured to be earlier than the triggering start time of any one of the light sources 112, so as to collect more complete detection images under multiple conditions.
In addition, the exposure degree of a light source 112 may be adjusted through parameters such as the exposure time and the exposure gain. In the present embodiment, the exposure adjustment is preferably achieved by lengthening or shortening the exposure time. For example, the second light source 1122 and the fourth light source 1124 near the inner wall 100 of the digestive tract may be configured with shorter exposure times, and the first light source 1121 and the third light source 1123 far from the inner wall 100 with longer exposure times. In some embodiments, the exposure gain parameter may also be adjusted, specifically by changing the magnitude of the current driving the light source 112 and thereby its emission intensity, to achieve the effect of adjusting the exposure degree.
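As a purely illustrative sketch of this adjustment, a per-light-source drive setting could be modeled as below; the class and field names are hypothetical, and the product form of the exposure value follows the statement later in this description that the preset and calibration exposure values are configured as the product of exposure time and exposure gain.

```python
from dataclasses import dataclass

@dataclass
class LightSourceDrive:
    exposure_time_us: float   # illumination duration within one trigger period
    exposure_gain: float      # gain level, e.g. tied to the LED drive current

    @property
    def exposure_value(self) -> float:
        # Exposure value modeled as exposure time multiplied by exposure gain.
        return self.exposure_time_us * self.exposure_gain

    def scaled(self, factor: float) -> "LightSourceDrive":
        # Apply the adjustment to the exposure time only, matching the preference above;
        # the gain could be scaled instead once the available time budget is exhausted.
        return LightSourceDrive(self.exposure_time_us * factor, self.exposure_gain)
```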
As further shown in FIG. 2, the endoscope may further include an image processing module 134, a data processing module 135, a storage module 136, and a communication module 137. The image processing module 134 is configured to receive the detection image, divide the detection image into regions, and calculate detection brightness values corresponding to the regions; the data processing module 135 is configured to invoke the illumination radiation degree and the preset exposure value to calculate a standard brightness value, calibrate the standard brightness value by using the detected brightness value to generate a calibration brightness equation, approximate the calibration brightness equation to the preset target brightness value, and update and store the calibration exposure value; the storage module 136 is configured to store a preset illumination radiation level for the data processing module 135 to call and buffer intermediate data in the calculation process; the communication module 137 is used for transmitting data generated in the exposure control process, and related detection images and calibration images to an external device for monitoring and receiving instructions.
An embodiment of the present invention provides an endoscope exposure control method, which may be mounted on the endoscope provided by any one of the above embodiments, or may be mounted on another device for controlling the exposure of the endoscope. As shown in fig. 4, the endoscope exposure control method may include the following steps.
Step 31, driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image.
And step 32, dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas.
Step 33, calculating m standard brightness values according to the preset illumination radiation degree of each light source to each region and the preset exposure value.
And step 34, calibrating the standard brightness value by using the detected brightness value, and correspondingly generating m calibration brightness equations.
And 35, approximating the calibration brightness equation to a preset target brightness value, and updating and storing the calibration exposure value.
Step 36, driving the n light sources to illuminate according to the calibration exposure values respectively, and obtaining a calibration image.
Wherein n is more than or equal to 2; m is more than or equal to 2; the calibration brightness equation includes a calibration exposure value to be determined.
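Before each step is discussed in detail, the overall flow of steps 31 to 36 can be sketched as follows. This is only an assumed outline: drive, capture_image, split_regions and region_brightness stand in for the camera and light-source drivers, solve_calibration stands for the calculation detailed in the summary above, k_preset is assumed to be a NumPy array of line correlation coefficients, and e0 is the exposure precision value.

```python
import numpy as np

def exposure_control_cycle(light_sources, k_preset, e0, w_mean, f_target):
    drive(light_sources, k_preset * e0)          # step 31: preset exposures e_j = k_j * e0
    regions = split_regions(capture_image())     # step 32: divide the detection image into m regions
    f_det = np.array([region_brightness(r) for r in regions])
    # steps 33 to 35: standard brightness values, calibration brightness equations,
    # approximation to the preset target brightness value
    k_cal = solve_calibration(w_mean, k_preset, f_det, f_target)
    drive(light_sources, k_cal * e0)             # step 36: calibration exposures e'_j = k'_j * e0
    return capture_image()                       # calibration image
```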
After the light sources are driven to illuminate, the imaging component of the endoscope obtains a detection image under the current illumination condition. Since the light beams of the light sources are emitted outward approximately along the signal transmission direction of the imaging device, the exposure degree of the light sources influences the brightness of different areas in the detection image, and thus also the imaging quality of the detection image. In the default state, the n light sources may illuminate with n identical preset exposure values, or with n preset exposure values that are at least partially different. After the detection image is divided into m areas, the brightness of the m areas is affected by the illumination radiation degree of each single light source, and correspondingly the brightness of each area is jointly affected by the illumination radiation degrees of all the light sources.
For example, fig. 5 shows a detection image 1110 generated at a certain moment while the endoscope exposure control method is executed. In one embodiment, the number m of regions may be configured as 9, correspondingly generating a first region Q1, a second region Q2, a third region Q3, a fourth region Q4, a fifth region Q5, a sixth region Q6, a seventh region Q7, an eighth region Q8 and a ninth region Q9. If the number n of light sources is even and the light sources are arranged symmetrically about the longitudinal symmetry axis of the detection image 1110, each light source affects the brightness of all of the above regions, but a region on the left side, for example the fourth region Q4, is affected more strongly by the light sources on the left side, while a region on the right side, for example the sixth region Q6, is affected more strongly by the light sources on the right side. If the number n of light sources is 4 or more and the light sources have a specific arrangement, for example disposed respectively at the boundary (intersection) between the first region Q1 and the centrally located fifth region Q5, the boundary between the third region Q3 and the fifth region Q5, the boundary between the seventh region Q7 and the fifth region Q5, and the boundary between the ninth region Q9 and the fifth region Q5, the influence of the illumination radiation becomes more complicated.
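For concreteness, one possible 3×3 split matching the layout of fig. 5 is sketched below, assuming the central fifth region is simply given wider row and column bands; the 25%/50%/25% proportions and the helper names are illustrative choices, not values taken from the patent.

```python
import numpy as np

def split_nine_regions(image: np.ndarray, edge_frac: float = 0.25):
    """Return regions Q1..Q9 in row-major order; Q5 (index 4) is the larger centre block."""
    h, w = image.shape[:2]
    rows = [0, int(h * edge_frac), int(h * (1 - edge_frac)), h]
    cols = [0, int(w * edge_frac), int(w * (1 - edge_frac)), w]
    return [image[rows[r]:rows[r + 1], cols[c]:cols[c + 1]]
            for r in range(3) for c in range(3)]

def region_brightness(region: np.ndarray) -> float:
    # Mean pixel value taken as the detected brightness of the region (one of the
    # averaging options discussed later in this description).
    return float(region.mean())
```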
On the one hand, in order to achieve a better imaging effect, reduce the sensitivity of the brightness adjustment and prevent unnecessary repeated adjustments, the area of the centrally located fifth region Q5 may be configured to be larger than the areas of the other regions in the detection image 1110. Of course, in embodiments that emphasize the brightness adjustment effect and accept the above issue within an allowable error range, the regions may also be divided equally.
On the other hand, the complexity of the above illumination radiation conditions leads to a poor imaging effect in the final detection image, mainly because the illumination radiation degrees of the light sources on the endoscope differ from position to position: for a single light source, its relationship to an obstacle such as the inner wall of the digestive system causes its illumination radiation degree to differ between regions, and the illumination radiation degree in turn affects the brightness and exposure degree of the corresponding region of the detection image. Therefore, the brightness condition of the detection image at the current position of the endoscope needs to be evaluated in advance, compared with the brightness condition in the ideal state, and the light sources then need to be driven and adjusted to appropriate illumination radiation degrees.
The illumination radiation degree reflects, for a given exposure value setting, how each light source influences the brightness of each region; a single region reflects the illumination radiation of all the light sources as one brightness, and if the theoretical brightness values corresponding to the different regions are defined as standard brightness values, there are m standard brightness values. On the premise that the illumination radiation degree of each light source on each region is known and the preset exposure values are also known, the standard brightness value of each region under the current exposure state can be obtained without considering the influence of the endoscope position. Further, since the detected brightness values actually measured in the detection image represent the actual quality of the image and are affected by the position state of the endoscope, they deviate from the standard brightness values of the ideal state. The standard brightness values may therefore be calibrated with the detected brightness values, or more specifically, an expression of the detected brightness value may be formed from the standard brightness value, so that the factor of the detected brightness value governed by the exposure value and the illumination radiation degree and the factor governed by the position state are separated and quantized; the exposure value therein is then set as the calibration exposure value to be determined, thereby forming different calibration brightness equations corresponding to the different regions.
In a calibration brightness equation, the dependent variable is the to-be-adjusted detection brightness of a single region, the known quantities are parameters or relational expressions characterizing the difference between the detected brightness and the standard brightness, and the unknowns to be determined are the several calibration exposure values corresponding to the different light sources. Therefore, by adjusting the unknowns of the calibration brightness equation so that it approaches the target brightness value infinitely closely, the brightness of the corresponding region can be kept in a suitable range; one or more calibration exposure values are obtained by solving, and the light sources are controlled to illuminate with the calibration exposure values in place of the preset exposure values, so that the brightness of each region of the calibration image reaches the expected brightness.
The preset target brightness value may be a definite value, or a target brightness range generated by floating a preset allowable error above and below that value, so that the brightness of each region of the final calibration image can be made uniform. The illumination radiation degree may be preset by running a limited number of experiments on the illumination radiation of each light source onto each region, or calculated from a correspondence between light source and region or from a function fitted to the relative positional relationship (such as the distance) between the light source position and the actual position represented by the region; it may take the form of discrete data or a fitted function, or a callable database may be built. It should be understood that the present invention does not limit the presetting manner or specific form of the illumination radiation degree; any manner or form sufficient to generate standard brightness values representing the ideal state, as distinct from the detected brightness values, falls within the scope of the present invention.
Further, in one specific embodiment, the preset exposure values are configured to vary linearly and satisfy $e_j = k_j \times e_0$, where $j = 1, 2, \ldots, n$, $e_j$ is the preset exposure value of the jth light source, $k_j$ is the line correlation coefficient of the jth light source, and $e_0$ is the exposure precision value. In this way, a plurality of gears of each light source can be formed by adjusting the line correlation coefficient $k_j$, and the global adjustment range and adjustment precision can be controlled by adjusting the magnitude of the exposure precision value $e_0$; at the same time, the exposure precision value $e_0$ can be used as the initial exposure value for the calibration and detection steps, so that the exposure level can be increased stepwise during the adjustment.
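A small sketch of this gear model is given below; the base value and the gear table are arbitrary example numbers, since the patent only fixes the linear relation $e_j = k_j \times e_0$.

```python
E0 = 50.0                      # exposure precision value (illustrative base step)
GEARS = (1, 2, 4, 8, 16)       # selectable line correlation coefficients k_j (example values)

def nearest_gear(k_cal: float, gears=GEARS) -> int:
    # Snap a computed calibration correlation coefficient k'_j to the closest available
    # gear, so that the light source is finally driven at e'_j = gear * E0.
    return min(gears, key=lambda g: abs(g - k_cal))

preset_exposures = [k * E0 for k in (1, 1, 1, 1)]   # e.g. all four light sources start at gear 1
```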
Another embodiment of the present invention provides an endoscope exposure control method, as shown in fig. 6 and 7, which may specifically include the following steps.
Step 21, one of the n light sources is driven to illuminate with the exposure precision value as a preset exposure value, and the other (n-1) light sources are kept in an off state.
And step 22, acquiring L area calibration images under preset L distance values.
And step 23, dividing the L area calibration images into m areas respectively, analyzing the brightness value of each area, and calculating the average value of the brightness contribution values of the light source to each area in the L distance value ranges to obtain m brightness contribution average values.
And step 24, iteratively driving other (n-1) light sources to independently illuminate by taking the exposure precision value as a preset exposure value, and calculating to obtain n multiplied by m brightness contribution means.
Step 31, driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image.
And step 32, dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas.
And step 33', generating m standard brightness values according to the brightness contribution mean and the preset exposure value.
And step 34, calibrating the standard brightness value by using the detected brightness value, and correspondingly generating m calibration brightness equations.
And 35, approximating the calibration brightness equation to a preset target brightness value, and updating and storing the calibration exposure value.
Step 36, driving the n light sources to illuminate according to the calibration exposure values respectively, and obtaining a calibration image.
Wherein L is more than or equal to 2.
Steps 21 to 24 disclose the calibration process carried out before the endoscope is put into use. The process may be performed inside the human body, in a situation where the detection image is judged to be uniformly exposed, or the endoscope may be calibrated in advance in vitro and then put into use. Of course, in some embodiments the former may be regarded as part of the operation of the endoscope; whichever definition is adopted, the implementation of the present invention is not affected.
For embodiments calibrated in vitro, as shown in fig. 8, a diffuse scattering plate 14 is preferably used as the reference, and the diffuse scattering plate 14 is further preferably configured as pink latex or another simulated material whose reflectance approximates that of the mucosal surface of the digestive tract. During the test, the endoscope 10 is preferably set up so that its longitudinal extension direction is kept perpendicular to the diffuse scattering plate 14, or so that the average detection signal of the image pickup section 111 and the average light beam of the light sources 112 are kept perpendicular to the diffuse scattering plate 14. Considering that the light sources are driven one by one and illuminate at the exposure precision value $e_0$, the average brightness contributed by light source $R_j$ ($j = 1, 2, \ldots, n$) to region $Q_i$ ($i = 1, 2, \ldots, m$) at a certain detection distance value $Z$ can be defined as $w_{ji} = W_{ji}(Z \mid e_0)$; this average brightness $w_{ji}$ characterizes the brightness contribution value of light source $R_j$ to region $Q_i$ at the detection distance value $Z$.
It should be emphasized that the detection distance value Z in the present invention is preferably defined as the distance between the housing of the endoscope 10 and the diffuse scattering plate 14 or other tissue of the digestive system. Naturally, since the detection image is formed on the image pickup section 111 side, the detection distance value Z generally differs from the actual imaging distance value Z' by the distance between the image pickup section 111 and the housing arranged outside it. Accordingly, in an embodiment in which the detection distance value Z is obtained by analyzing the detection image, the method may further include: obtaining the imaging distance value Z' by analysis of the detected brightness values, and calculating the detection distance value Z from the imaging distance value Z' and the distance between the image pickup section and the endoscope housing.
The division of the area calibration images in step 23 is preferably consistent with the division of the detection image in the detection phase, so that the brightness value of a given region tends to be adjusted optimally. The brightness contribution data set may be pre-stored in the form of a table or a data matrix, or take the form of a brightness variation curve fitted for each single light source. For the former embodiment, within one brightness contribution data set for a single light source the index j in the expression $w_{ji} = W_{ji}(Z \mid e_0)$ is constant, the detection distance value Z is adjusted according to a preset step size so that l takes L values, and i points to a certain region of the area calibration image. Under this definition, the brightness contribution value formed in the first region $Q_1$ when only the first light source illuminates at the first distance $Z(1)$ may be denoted $w_{11} = W_{11}(Z(1) \mid e_0)$. More generally, for any ith region $Q_i$, the brightness contribution value formed when only the jth light source illuminates at the lth distance $Z(l)$ may be denoted $w_{ji} = W_{ji}(Z(l) \mid e_0)$, $l = 1, 2, \ldots, L$. In a preferred embodiment, the step size of the varying detection distance value Z may be chosen as 2-10 mm, and the overall range may be controlled within 0-30 mm.
Because this process has three variable parameters, namely the selection of the light source, the selection of the region and the adjustment of the distance value, the distance value can be varied during the process to obtain the brightness contribution data of a specific light source to a specific region, and the mean of the several brightness contribution values thus generated can be used, directly or after processing, as the basis for calculating the standard brightness values.
On this basis, in one embodiment, the brightness contribution mean may be configured to satisfy $\bar{w}_{ji} = \frac{1}{L}\sum_{l=1}^{L} W_{ji}(Z(l) \mid e_0)$, where $i = 1, 2, \ldots, m$, $j = 1, 2, \ldots, n$, $l = 1, 2, \ldots, L$, $\bar{w}_{ji}$ is the brightness contribution mean of the jth light source to the ith region, $Z(l)$ is the lth distance value, and $W_{ji}(Z(l) \mid e_0)$ is the brightness contribution value of the jth light source to the ith region at the distance $Z(l)$. In this way, the mean of the L brightness contribution values $w_{ji} = W_{ji}(Z(l) \mid e_0)$, $l = 1, 2, \ldots, L$, of the jth light source to the ith region at the L different distances is used as a unified brightness contribution value of that light source to that region over the different detection distances, which greatly reduces the amount of calculation while preserving a certain calculation accuracy. For a single light source, a total of m × L brightness contribution values $w_{ji}$ are obtained and, after averaging, yield the corresponding m brightness contribution means $\bar{w}_{ji}$.
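Assuming the bench measurements are collected into a three-dimensional array indexed by light source, region and distance, the averaging above reduces to a single mean over the distance axis, as sketched below.

```python
import numpy as np

def brightness_contribution_means(W: np.ndarray) -> np.ndarray:
    """W[j, i, l] = W_ji(Z(l) | e0): measured contribution of the jth light source to the
    ith region at the lth calibration distance, driven at the exposure precision value e0.
    Returns the (n, m) array of brightness contribution means."""
    return W.mean(axis=2)
```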
Iterating further, after the operations of steps 21 to 24 have been performed for each of the other (n-1) light sources, n arrays of brightness contribution means corresponding to the different light sources are obtained, each array containing m brightness contribution means $\bar{w}_{ji}$, so that n × m brightness contribution means are formed in total. According to step 33', during detection each region corresponds to the brightness contribution means $\bar{w}_{ji}$ of all the light sources, which are known, and the preset exposure values $e_j$ with which the light sources illuminate are also known. Although the brightness contribution means $\bar{w}_{ji}$ were obtained at the exposure precision value $e_0$ during calibration, in the embodiment in which the preset exposure value $e_j$ and the exposure precision value $e_0$ have a linear relationship, the brightness contribution means $\bar{w}_{ji}$ remain usable even when the detection process does not use $e_0$ itself as the preset exposure value. Thus, with the brightness contribution means $\bar{w}_{ji}$ and the preset exposure values $e_j$ known, the m standard brightness values corresponding to the m regions can likewise be calculated on the basis of this linear relationship.
Another embodiment of the present invention provides a first example of an endoscope exposure control method, as shown in fig. 6 and 9, which may specifically include the following steps.
Step 21, one of the n light sources is driven to illuminate with the exposure precision value as a preset exposure value, and the other (n-1) light sources are kept in an off state.
And step 22, acquiring L area calibration images under preset L distance values.
And step 23, dividing the L area calibration images into m areas respectively, analyzing the brightness value of each area, and calculating the average value of the brightness contribution values of the light source to each area in the L distance value ranges to obtain m brightness contribution average values.
And step 24, iteratively driving other (n-1) light sources to independently illuminate by taking the exposure precision value as a preset exposure value, and calculating to obtain n multiplied by m brightness contribution means.
Step 31, driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image.
And step 32, dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas.
Step 331', calculating to obtain m standard brightness values according to the brightness contribution mean and the line correlation coefficient corresponding to the preset exposure value.
Step 34', calculating calibration parameters according to the quotient of the detected brightness value and the standard brightness value, and calibrating the standard brightness value by using the calibration parameters, so as to correspondingly generate m calibration brightness equations.
And 35, approximating the calibration brightness equation to a preset target brightness value, and updating and storing the calibration exposure value.
Step 36, driving the n light sources to illuminate according to the calibration exposure values respectively, and obtaining a calibration image.
Wherein the standard brightness value satisfies $f_{iT} = \sum_{j=1}^{n} k_j \bar{w}_{ji}$, where $f_{iT}$ is the standard brightness value; the calibration parameter satisfies $h_i = f_i / f_{iT}$, where $h_i$ is the calibration parameter and $f_i$ is the detected brightness value; and the calibration brightness equation is $h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji}$, where $k'_j$ is the calibration correlation coefficient corresponding to the calibration exposure value.
This embodiment provides a new step 331', which makes full use of the above linear relationship through the line correlation coefficient $k_j$. Unlike other embodiments in which the calibration exposure value $e'_j$ to be determined is calculated as a whole, in the present embodiment the light sources of the endoscope are configured with the exposure precision value $e_0$ as the basic exposure value and several gears as multiplying factors, the line correlation coefficient $k_j$ corresponding to the preset exposure value $e_j$ serves as the basis for calculating the standard brightness values, the calibration correlation coefficient $k'_j$ corresponding to the calibration exposure value $e'_j$ is calculated directly, and the gear of the corresponding light source is adjusted through the calibration correlation coefficient $k'_j$, which improves the adjustment efficiency and simplifies the algorithm steps.
In addition, considering that the actual brightness contribution of a single light source to a single region gradually decreases as the distance value Z increases, averaging the brightness contribution values amounts to evaluating the brightness contribution of the light sources on the endoscope to each region at an equivalent distance $Z_{eff}$. When the actual distance between the endoscope and the inner wall of the digestive tract is larger than the equivalent distance $Z_{eff}$, the actual brightness contribution value is smaller than the brightness contribution mean, and a calibration correlation coefficient $k'_j$ calculated from the mean alone would come out smaller than what is actually required; correspondingly, when the actual distance is smaller than the equivalent distance $Z_{eff}$, the actual brightness contribution value is larger than the brightness contribution mean, and the calculated value would come out larger than what is actually required.
To make full use of the brightness contribution means while still expressing the actual detected brightness value $f_i$ (that is, reflecting the actual brightness), the quotient of the detected brightness value $f_i$ and the standard brightness value $f_{iT}$ can be calculated directly to form a calibration parameter $h_i$ that varies with the detection distance, and the calibration parameter $h_i$ and the standard brightness value $f_{iT}$ together form the calibration brightness equation. This equivalently splits the detected brightness value into two parts, a standard brightness value $f_{iT}$ that follows the exposure and the brightness contribution, and a calibration parameter $h_i$ that follows the change in the detection distance condition, thereby achieving the technical effect described above.
Specifically, when the actual distance is larger than the equivalent distance $Z_{eff}$, the calculated calibration parameter satisfies $h_i < 1$ and serves to reduce the weight; correspondingly, when the actual distance is smaller than the equivalent distance $Z_{eff}$, the calculated calibration parameter satisfies $h_i > 1$ and serves to increase the weight.
Of course, the above definition of the detected brightness value implicitly assumes that the ith region has a single detected brightness value $f_i$. In a refined implementation, this value may be obtained by collecting the detected brightness values of a plurality of points in the ith region and taking their average, or by collecting the detected brightness value of the pixel at the center of the ith region and using it as the detected brightness value $f_i$ of the whole region.
It should be understood that the provision of step 331' does not necessarily affect the implementation of step 34': step 34' obtains the calibration parameter directly by dividing the detected brightness value $f_i$ by the standard brightness value $f_{iT}$, and does not necessarily require the exposure values of the light sources to vary linearly in steps of the exposure precision value $e_0$. In embodiments that are not combined with step 331', the calibration parameter may still be configured to satisfy $h_i = f_i / f_{iT}$, and the standard brightness value $f_{iT}$, in the case where no line correlation coefficient $k_j$ is factored out, may be configured to satisfy $f_{iT} = \sum_{j=1}^{n} \bar{w}_{ji}(e_j)$, where $\bar{w}_{ji}(e_j)$ is the brightness contribution mean calculated for the preset exposure value $e_j$. The calculation of this mean is not limited to the manner described above: a calibration operation similar to steps 21 to 24 may be performed directly with the preset exposure value $e_j$, and other operational relationships between the preset exposure value $e_j$ and the exposure precision value $e_0$ may equally be incorporated into the calculation of the brightness contribution mean.
Another embodiment of the present invention provides a specific example of the first embodiment of the endoscope exposure control method, and as shown in fig. 6 and 10, may specifically include the following steps.
Step 21, one of the n light sources is driven to illuminate with the exposure precision value as a preset exposure value, and the other (n-1) light sources are kept in an off state.
And step 22, acquiring L area calibration images under preset L distance values.
And step 23, dividing the L area calibration images into m areas respectively, analyzing the brightness value of each area, and calculating the average value of the brightness contribution values of the light source to each area in the L distance value ranges to obtain m brightness contribution average values.
And step 24, iteratively driving other (n-1) light sources to independently illuminate by taking the exposure precision value as a preset exposure value, and calculating to obtain n multiplied by m brightness contribution means.
Step 31, driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image.
And step 32, dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas.
Step 331', calculating to obtain m standard brightness values according to the brightness contribution mean and the line correlation coefficient corresponding to the preset exposure value.
Step 34', calculating calibration parameters according to the quotient of the detected brightness value and the standard brightness value, and calibrating the standard brightness value by using the calibration parameters, so as to correspondingly generate m calibration brightness equations.
Step 351', approximating the calibrated luminance equation to a preset target luminance value using an optimization method.
And step 352', updating the calibration correlation coefficient, and calculating a calibration exposure value according to the updated calibration correlation coefficient and the exposure precision value.
Step 36, driving the n light sources to illuminate according to the calibration exposure values respectively, and obtaining a calibration image.
Wherein the optimization method is configured to perform calibration according to $\min \sum_{i=1}^{m}\left(h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji} - f_{i0}\right)^2$, where $f_{i0}$ is the target brightness value.
Steps 351' to 352' provide a specific implementation of step 35: the process of approximating the calibration brightness equations to the preset target brightness value is formulated as an optimization problem, the target brightness value is taken as the optimum, and the calibration exposure values $e'_j$ or the calibration correlation coefficients $k'_j$ are adjusted so that the to-be-adjusted detection brightness characterized by the calibration brightness equation approaches the target brightness value as closely as possible. The optimization method may include gradient descent (batch gradient descent or stochastic gradient descent), Newton and quasi-Newton methods, the conjugate gradient method, heuristic optimization methods, the Lagrangian multiplier method, and the like; the heuristic optimization method may specifically be simulated annealing, a genetic algorithm, an ant colony algorithm or a particle swarm algorithm, and may also be a multi-objective optimization algorithm such as NSGA-II (Non-Dominated Sorting Genetic Algorithm II, a genetic algorithm with non-dominated sorting and an elitist strategy), MOEA/D (Multi-Objective Evolutionary Algorithm Based on Decomposition), an artificial immune algorithm, and so on. The approximation may be carried out by obtaining the difference between the to-be-adjusted detection brightness value characterized by the calibration brightness equation and the target brightness value and minimizing that difference; of course, other approaches may also be used.
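Of the optimizers listed, plain batch gradient descent is the simplest to sketch. The following assumes the reconstructed squared-error objective; the learning rate, iteration count and non-negativity clip are illustrative choices that would need tuning to the actual data scale.

```python
import numpy as np

def gradient_descent_k(w_mean, h, f_target, lr=1e-4, iters=5000):
    """Minimize sum_i (h_i * sum_j k'_j * w_ji - f_i0)^2 over k' by gradient descent.
    w_mean: (n, m) brightness contribution means; h: (m,) calibration parameters;
    f_target: scalar or (m,) target brightness. Returns k' as an (n,) array."""
    A = h[:, None] * w_mean.T                       # (m, n): row i holds h_i * w_ji
    b = np.broadcast_to(f_target, (A.shape[0],)).astype(float)
    k = np.ones(A.shape[1])                         # start every light source at gear 1
    for _ in range(iters):
        r = A @ k - b                               # per-region residuals
        k -= lr * 2.0 * (A.T @ r)                   # gradient of the squared error
        np.clip(k, 0.0, None, out=k)                # keep exposure multiples non-negative
    return k
```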
It should be understood that steps 351' and 352' are refinements based on step 35 and its derived steps, whose purpose is to provide a specific approximation method and a specific way of calculating the calibration exposure values $e'_j$. They therefore have no necessary dependence on the generation of the calibration brightness equations in steps 331' and 34': steps 331' and 34' can be combined with any of the variants of step 35 described above, and steps 351' to 352' can likewise be combined with any of the variants of steps 33 and 34 described above. In summary, steps 331' to 34' and steps 351' to 352' may be split to form two independent embodiments, combined with other steps of other embodiments to form derived embodiments, or configured together in the same embodiment as described above, so that the embodiment simultaneously obtains the two specific technical effects brought by the two groups of specific steps.
Further, the target brightness value $f_{i0}$ may specifically be set to a brightness value in the range of 90 to 120 (candela per square meter) so as to give the regions a good visual effect. It should be understood that the target brightness value $f_{i0}$ provided in this embodiment is not limited to a fixed value; it may be a brightness interval with a good visual effect as described above, and may be combined with the technical solution of setting an allowable error range for the solving process of the optimization problem, so that the adjustment process need not converge exactly to a specific target brightness value $f_{i0}$ and, in general, a better display effect tends to be achieved.
In the optimization calibration formula above, the purpose of squaring is to eliminate the negative values that may be produced by the difference operation and to amplify the difference; this part may be replaced by taking an absolute value or a similar operation. The term $\left(h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji} - f_{i0}\right)^2$, formed from the calibration brightness equation and the target brightness value $f_{i0}$, synthesizes the optimization state of the brightness of the ith region relative to the target brightness value under the influence of the several light sources, and the calibration exposure values $e'_j$ or the calibration correlation coefficients $k'_j$ obtained from it can be taken as the final output; that is, the optimization method may be adjusted to calibrate according to the formula $\min \left(h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji} - f_{i0}\right)^2$, achieving the effect of optimizing the brightness of an individual region. Of course, considering that multiple regions need to approach the target brightness value or lie in the target brightness range, all regions can be accumulated and considered together so as to optimize all regions, namely, as described above, completing the calculation of the calibration exposure values $e'_j$ or the calibration correlation coefficients $k'_j$ according to the formula $\min \sum_{i=1}^{m}\left(h_i \sum_{j=1}^{n} k'_j \bar{w}_{ji} - f_{i0}\right)^2$.
Still another embodiment of the present invention provides an endoscope exposure control method, as shown in fig. 11 and 12, which may specifically include the following steps.
Step 21', one of the n light sources is driven to illuminate with the exposure accuracy value as its preset exposure value, while the other (n-1) light sources are kept in an off state.
And step 22', obtaining L area calibration images under preset L distance values.
And step 23', dividing the L area calibration images into m areas, analyzing the brightness values of the areas, and calculating the weighted average of the brightness contribution values of the light source to the areas in the L distance value ranges according to the preset weights corresponding to the L distance values, so as to obtain m brightness contribution weighted values.
And step 24', iteratively driving the other (n-1) light sources to illuminate independently with the exposure accuracy value as the preset exposure value, and calculating to obtain n multiplied by m brightness contribution weighted values.
Step 31, driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image.
And step 32, dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas.
And step 33", generating m standard brightness values according to the brightness contribution weighted value and the preset exposure value.
And step 34, calibrating the standard brightness value by using the detected brightness value, and correspondingly generating m calibration brightness equations.
And 35, approximating the calibration brightness equation to a preset target brightness value, and updating and storing the calibration exposure value.
Step 36, driving the n light sources to illuminate according to the calibration exposure values respectively, and obtaining a calibration image.
Wherein L is more than or equal to 2.
Steps 21' to 24' disclose another calibration means. Compared with the technical scheme described above, this scheme gives different preset weights to different distances and selectively increases the proportion of the commonly used distances in the brightness contribution weighted value, further improving the accuracy of subsequent calculation and the calibration effect.
Further, in one embodiment, the brightness contribution weighted value satisfies: $\overline{W_{ji}}=\sum_{l=1}^{L}\alpha_{l}\,W_{ji}\left(Z(l)\mid e_{0}\right)$, where i = 1, 2, ..., m, j = 1, 2, ..., n and l = 1, 2, ..., L; $\overline{W_{ji}}$ is the brightness contribution weighted value of the j-th light source to the i-th region, Z(l) is the l-th distance value, $W_{ji}(Z(l)\mid e_{0})$ is the brightness contribution value of the j-th light source to the i-th region at the distance Z(l), and $\alpha_{l}$ is the preset weight corresponding to the l-th distance value, satisfying $\sum_{l=1}^{L}\alpha_{l}=1$.
In one embodiment, the preset weights satisfy α_l > 0, and the preset weights α_l corresponding to distances within the 0-8 mm range are configured to be greater than the preset weights α_l in other ranges.
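For illustration only, the sketch below computes brightness contribution weighted values from per-distance calibration measurements, assuming the weighted-sum form reconstructed above with weights that sum to 1 and are larger for distances up to 8 mm. The helper name weighted_brightness_contribution, the five distance values and the specific weights are hypothetical.

```python
import numpy as np

def weighted_brightness_contribution(W, alpha):
    """Weighted brightness contribution values.

    W     : (n, m, L) array, W[j, i, l] = brightness contribution of light source j
            to region i at the l-th calibration distance Z(l), measured at exposure e_0
    alpha : (L,) array of preset distance weights, required to sum to 1
    """
    alpha = np.asarray(alpha, dtype=float)
    assert np.isclose(alpha.sum(), 1.0), "preset weights must sum to 1"
    # sum over the distance axis: W_bar[j, i] = sum_l alpha_l * W[j, i, l]
    return np.tensordot(W, alpha, axes=([2], [0]))

# Illustrative usage: L = 5 calibration distances, heavier weights for the 0-8 mm range.
Z = np.array([2.0, 5.0, 8.0, 15.0, 25.0])             # distance values in mm (assumed)
alpha = np.where(Z <= 8.0, 0.26, 0.11)                # 3*0.26 + 2*0.11 = 1.0
W = np.random.default_rng(1).uniform(5.0, 20.0, size=(4, 9, 5))
W_bar = weighted_brightness_contribution(W, alpha)    # shape (4, 9)
```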
Still another embodiment of the present invention provides an endoscope exposure control method, as shown in fig. 13, which may specifically include the following steps.
Step 31, driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image.
And step 32, dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas.
Step 33, calculating m standard brightness values according to the preset illumination radiation degree of each light source to each region and the preset exposure value.
And step 34, calibrating the standard brightness value by using the detected brightness value, and correspondingly generating m calibration brightness equations.
And 35, approximating the calibration brightness equation to a preset target brightness value, and updating and storing the calibration exposure value.
Step 36', at least one of exposure time and exposure gain is adjusted, n light sources are driven to respectively illuminate according to the calibration exposure value, and a calibration image is obtained.
Wherein the preset exposure value and the calibration exposure value are configured as a product of an exposure time and an exposure gain of the endoscope.
The present embodiment specifically defines a means of adjusting the exposure value. Since the exposure degree of a light source can be adjusted through parameters such as the exposure time and the exposure gain, the preset exposure value and the calibration exposure value can be adjusted by means of the exposure accuracy value and the calibration correlation coefficient as described above; at the actual light-source control layer, the exposure accuracy value can at least be expressed as the product of the exposure time and the exposure gain (or at least be configured to be influenced by the exposure gain and the exposure time, i.e. the exposure accuracy value can be expressed as e_0(Time, Gain)), and one of the two can be selected for adjustment.
Preferably, when the exposure time can be configured to be sufficiently long, the exposure gain can be configured to be small so as to reduce the noise of the detection image and the calibration image. For example, the exposure gain may be fixed at 1 and the exposure time corresponding to the exposure accuracy value configured to be 1 ms; the exposure accuracy value can then be expressed at least as:
e_0 = 1 ms × 1, or e_0(1 ms, 1).
After steps 31 to 35 and their derivative steps, the calibration exposure values corresponding to the different light sources can be calculated. With the light sources configured to include a first light source R_1, a second light source R_2, a third light source R_3 and a fourth light source R_4, a first calibration exposure value e'_1 = 2e_0, a second calibration exposure value e'_2 = e_0, a third calibration exposure value e'_3 = 3e_0 and a fourth calibration exposure value e'_4 = 0.5e_0 are calculated for the four light sources respectively. The four calibration exposure values can thus be expressed in turn as e'_1(2 ms, 1), e'_2(1 ms, 1), e'_3(3 ms, 1) and e'_4(0.5 ms, 1).
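The split of a calibration exposure value into an exposure time and an exposure gain can be sketched as follows, assuming e = time × gain with the gain held at 1 wherever possible, as in the example above. The function name exposure_to_time_gain and the max_time_ms cap (with its fallback to a higher gain) are assumptions added purely for illustration.

```python
def exposure_to_time_gain(e_cal, gain=1.0, max_time_ms=10.0):
    """Split a calibration exposure value into (exposure_time_ms, exposure_gain).

    Assumes e = time * gain with the gain kept as small as possible (here 1.0)
    to reduce noise, raising it only when the required time exceeds max_time_ms.
    The max_time_ms cap is an assumption, not taken from the patent text.
    """
    time_ms = e_cal / gain
    if time_ms > max_time_ms:          # fall back to a higher gain for very large exposure values
        gain = e_cal / max_time_ms
        time_ms = max_time_ms
    return time_ms, gain

# The four calibration exposure values from the example above (e_0 = 1 ms x 1).
for e in (2.0, 1.0, 3.0, 0.5):
    print(exposure_to_time_gain(e))    # -> (2.0, 1.0), (1.0, 1.0), (3.0, 1.0), (0.5, 1.0)
```

Running it on the four calibration exposure values from the example reproduces the pairs (2 ms, 1), (1 ms, 1), (3 ms, 1) and (0.5 ms, 1).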
Of course, the exposure adjustment mode is not limited to the above technical solution; the exposure degree can also be adjusted by changing the current supplied to the light source and/or the voltage applied across the light source. The various modes described above may also be combined to achieve high-precision adjustment, and the invention is not limited in this respect. Meanwhile, this embodiment only provides a specific implementation of step 36, and the step 36' provided here can naturally be applied to any of the above embodiments, so that those embodiments also obtain the technical effects of this embodiment. Similar combinations and substitutions may likewise be made between the embodiments of other implementations.
In summary, according to the endoscope exposure control method provided by the invention, after a detection image is acquired in a single shooting step, the detection image is divided into a plurality of regions, standard brightness values are calculated from the illumination radiation degrees of the plurality of light sources on the plurality of regions, and the calibrated standard brightness values are then approximated to the target brightness value to obtain the calibration exposure values applied to the endoscope. In this way, the brightness values in the detection picture are converted into equations whose variables are the light-source exposure values, exposure values conforming to the target brightness value are calculated, and the exposure values calculated for the different light sources can differ, so that anisotropic exposure control is achieved. On the premise of taking into account the illumination radiation intensities of different light sources on different regions, problems such as the "yin-yang" (half-bright, half-dark) image phenomenon and unclear imaging of the lesion can be prevented.
It should be understood that although this specification is described in terms of embodiments, not every embodiment contains only a single independent technical solution; this manner of description is used for clarity only, and those skilled in the art should recognize that the embodiments may be combined as appropriate to form other embodiments that they will readily understand.
The detailed descriptions set out above are merely specific illustrations of feasible embodiments of the present invention and are not intended to limit its scope of protection; all equivalent embodiments or modifications that do not depart from the spirit of the present invention shall fall within the scope of protection of the present invention.

Claims (10)

1. An endoscope exposure control method, comprising:
driving n light sources to illuminate with n preset exposure values respectively, and obtaining a detection image; n is more than or equal to 2;
dividing the detection image into m areas, and calculating to obtain m detection brightness values corresponding to the areas; m is more than or equal to 2;
according to the preset illumination radiation degree of each light source to each region and the preset exposure value, calculating to obtain m standard brightness values; wherein, the illumination radiation degree represents the influence of the light source on the brightness of each area under the setting of the exposure value;
calibrating the standard brightness value by using the detected brightness value, and correspondingly generating m calibration brightness equations; wherein the calibration brightness equation comprises a calibration exposure value to be determined;
approximating the calibration brightness equation to a preset target brightness value, and updating and storing the calibration exposure value;
And driving the n light sources to illuminate correspondingly according to the calibration exposure values respectively, and acquiring a calibration image.
2. The endoscope exposure control method according to claim 1, wherein the preset exposure value is configured to have a linear variation and satisfies: $e_{j}=k_{j}\times e_{0}$; where j = 1, 2, ..., n, $e_{j}$ is the preset exposure value of the j-th light source, $k_{j}$ is the line correlation coefficient of the j-th light source, and $e_{0}$ is the exposure accuracy value.
3. The endoscope exposure control method according to claim 2, wherein before the driving of the n light sources to illuminate at n preset exposure values, respectively, and the acquisition of the detection image, the method further comprises:
driving one of the n light sources to illuminate by taking the exposure precision value as a preset exposure value, and keeping the other (n-1) light sources in an off state;
acquiring L area calibration images under preset L distance values; l is more than or equal to 2;
dividing the L area calibration images into m areas respectively, analyzing the brightness value of each area, and calculating the average value of the brightness contribution value of the light source to each area in L distance value ranges to obtain m brightness contribution average values;
iteratively driving the other (n-1) light sources to illuminate independently with the exposure accuracy value as the preset exposure value, and calculating to obtain n multiplied by m brightness contribution average values;
The calculating the m standard brightness values according to the preset illumination radiation degrees of the light sources to the areas and the preset exposure values specifically includes:
and generating m standard brightness values according to the brightness contribution mean value and the preset exposure value.
4. The endoscope exposure control method according to claim 3, wherein the brightness contribution average value satisfies: $\overline{W_{ji}}=\frac{1}{L}\sum_{l=1}^{L}W_{ji}\left(Z(l)\mid e_{0}\right)$, where i = 1, 2, ..., m, j = 1, 2, ..., n and l = 1, 2, ..., L; $\overline{W_{ji}}$ is the average brightness contribution value of the j-th light source to the i-th region, Z(l) is the l-th distance value, and $W_{ji}(Z(l)\mid e_{0})$ is the brightness contribution value of the j-th light source to the i-th region at the distance Z(l).
5. The method according to claim 4, wherein generating m standard luminance values according to the luminance contribution average value and the preset exposure value comprises:
calculating the m standard brightness values according to the brightness contribution mean value and the line correlation coefficient corresponding to the preset exposure value; the standard brightness value satisfies: $f_{iT}=\sum_{j=1}^{n}k_{j}\,\overline{W_{ji}}$, wherein $f_{iT}$ is the standard brightness value;
the "calibrating the standard luminance value with the detected luminance value" specifically includes:
calculating a calibration parameter according to the quotient of the detected brightness value and the standard brightness value, and calibrating the standard brightness value with the calibration parameter to correspondingly generate the m calibration brightness equations; wherein the calibration parameter satisfies: $h_{i}=f_{i}/f_{iT}$, where $h_{i}$ is the calibration parameter and $f_{i}$ is the detected brightness value; the calibration brightness equation is: $f'_{i}=h_{i}\sum_{j=1}^{n}k'_{j}\,\overline{W_{ji}}$, where $f'_{i}$ denotes the to-be-adjusted detection brightness value characterized by the equation and $k'_{j}$ is the calibration correlation coefficient corresponding to the calibration exposure value.
6. The endoscope exposure control method according to claim 5, wherein the "approximating the calibration luminance equation to a preset target luminance value, updating and storing the calibration exposure value" specifically includes:
approximating the calibration brightness equation to a preset target brightness value by using an optimization method; wherein the optimization method is configured to perform the calibration according to the formula $\min_{k'_{j}}\sum_{i=1}^{m}\left(h_{i}\sum_{j=1}^{n}k'_{j}\,\overline{W_{ji}}-f_{i0}\right)^{2}$, where $f_{i0}$ is the target brightness value;
updating the calibration correlation coefficient, and calculating the calibration exposure value according to the updated calibration correlation coefficient and the exposure precision value.
7. The endoscope exposure control method according to claim 2, wherein before the driving of the n light sources to illuminate at n preset exposure values, respectively, and the acquisition of the detection image, the method further comprises:
Driving one of the n light sources to illuminate by taking the exposure precision value as a preset exposure value, and keeping the other (n-1) light sources in an off state;
acquiring L area calibration images under preset L distance values; l is more than or equal to 2;
dividing the L area calibration images into m areas respectively, analyzing the brightness value of each area, and calculating the weighted average of the brightness contribution values of the light source to each area in the L distance value ranges according to the preset weights corresponding to the L distances respectively to obtain m brightness contribution weighted values;
iteratively driving the other (n-1) light sources to illuminate independently with the exposure accuracy value as the preset exposure value, and calculating to obtain n multiplied by m brightness contribution weighted values;
the calculating the m standard brightness values according to the preset illumination radiation degrees of the light sources to the areas and the preset exposure values specifically includes:
and generating m standard brightness values according to the brightness contribution weighted value and the preset exposure value.
8. The endoscope exposure control method according to claim 7, wherein the brightness contribution weighted value satisfies: $\overline{W_{ji}}=\sum_{l=1}^{L}\alpha_{l}\,W_{ji}\left(Z(l)\mid e_{0}\right)$, where i = 1, 2, ..., m, j = 1, 2, ..., n and l = 1, 2, ..., L; $\overline{W_{ji}}$ is the brightness contribution weighted value of the j-th light source to the i-th region, Z(l) is the l-th distance value, $W_{ji}(Z(l)\mid e_{0})$ is the brightness contribution value of the j-th light source to the i-th region at the distance Z(l), and $\alpha_{l}$ is the preset weight corresponding to the l-th distance value, satisfying $\sum_{l=1}^{L}\alpha_{l}=1$.
9. The endoscope exposure control method according to claim 1, wherein the number m of the areas is configured to be 9, and the surface area of the area located at the center of the detection image is larger than that of the other areas in the detection image; the number n of the light sources is configured to be 4;
the preset exposure value and the calibration exposure value are configured as a product of the exposure time and the exposure gain of the endoscope; the driving the n light sources to illuminate according to the calibration exposure values and obtain the calibration image specifically includes:
and adjusting at least one of the exposure time and the exposure gain, driving the n light sources to respectively and correspondingly illuminate according to the calibration exposure value, and acquiring a calibration image.
10. An endoscope, wherein the endoscope performs exposure control using the endoscope exposure control method according to any one of claims 1 to 9.
CN202210422525.9A 2022-04-21 2022-04-21 Endoscope exposure control method and endoscope Active CN114795080B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210422525.9A CN114795080B (en) 2022-04-21 2022-04-21 Endoscope exposure control method and endoscope
PCT/CN2023/089558 WO2023202675A1 (en) 2022-04-21 2023-04-20 Method for controlling endoscope exposure and endoscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210422525.9A CN114795080B (en) 2022-04-21 2022-04-21 Endoscope exposure control method and endoscope

Publications (2)

Publication Number Publication Date
CN114795080A CN114795080A (en) 2022-07-29
CN114795080B true CN114795080B (en) 2024-04-09

Family

ID=82505812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210422525.9A Active CN114795080B (en) 2022-04-21 2022-04-21 Endoscope exposure control method and endoscope

Country Status (2)

Country Link
CN (1) CN114795080B (en)
WO (1) WO2023202675A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114795080B (en) * 2022-04-21 2024-04-09 安翰科技(武汉)股份有限公司 Endoscope exposure control method and endoscope
CN114504293B (en) * 2022-04-21 2022-07-29 安翰科技(武汉)股份有限公司 Endoscope exposure control method and endoscope

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014176449A (en) * 2013-03-14 2014-09-25 Panasonic Corp Endoscope
CN107872625A (en) * 2017-12-25 2018-04-03 信利光电股份有限公司 A kind of dual camera exposure sync control method and system
CN109714543A (en) * 2019-01-10 2019-05-03 成都品果科技有限公司 A method of it obtaining skin brightness in camera data stream and adjusts exposure
CN111182232A (en) * 2019-12-31 2020-05-19 浙江华诺康科技有限公司 Exposure parameter adjusting method, device, equipment and computer readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018157918A (en) * 2017-03-22 2018-10-11 ソニー株式会社 Control device for surgery, control method, surgical system, and program
CN114795080B (en) * 2022-04-21 2024-04-09 安翰科技(武汉)股份有限公司 Endoscope exposure control method and endoscope

Also Published As

Publication number Publication date
WO2023202675A1 (en) 2023-10-26
CN114795080A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
CN114795080B (en) Endoscope exposure control method and endoscope
CN114504293B (en) Endoscope exposure control method and endoscope
US9588046B2 (en) Fluorescence observation apparatus
CN101902961B (en) Device, system and method for estimating the size of an object in a body lumen
JP6727845B2 (en) Otoscope with controlled illumination
US20200400795A1 (en) Noise aware edge enhancement in a pulsed laser mapping imaging system
BRPI0717532A2 (en) method for determining a camera configuration, computer program product in a camera, and camera
CN108594451B (en) Control method, control device, depth camera and electronic device
JP5485835B2 (en) Endoscope light source device and light amount control method thereof, endoscope system and control method thereof
JP2009104547A (en) Image processing apparatus, image processing system and image processing program
TWI684026B (en) Control method, control device, depth camera and electronic device
JP2023099679A (en) Endoscope system and data generation method
JP2023093574A (en) Information processing device, control device, method of processing information and program
CN113411508B (en) Non-vision field imaging method based on camera brightness measurement
WO2020064737A1 (en) A handheld imaging element with a movement sensor
US20210038054A1 (en) Tunable color-temperature white light source
JP2019200140A (en) Imaging apparatus, accessory, processing device, processing method, and program
US8830310B2 (en) Capsule endoscope
JP6227706B2 (en) Endoscope light source device and endoscope system
JP7399151B2 (en) Light source device, medical observation system, adjustment device, lighting method, adjustment method and program
CN115546039A (en) Image dimming method, device, equipment and storage medium of endoscope system
CN112823509A (en) Method and system for estimating exposure time of multispectral light source
JP5934267B2 (en) Endoscope light source device and operating method thereof, and endoscope system and operating method thereof
KR101401395B1 (en) Photosensor for measuring surface and measuring method of 3D depth and surface using thereof
JP6601632B2 (en) Color measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant