CN111383206B - Image processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111383206B
CN111383206B
Authority
CN
China
Prior art keywords
image
brightness
area
gain
frame image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010482188.3A
Other languages
Chinese (zh)
Other versions
CN111383206A (en)
Inventor
柳文娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202010482188.3A
Publication of CN111383206A
Application granted
Publication of CN111383206B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 — Image enhancement or restoration
    • G06T 5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70 — Denoising; Smoothing
    • G06T 5/90 — Dynamic range modification of images or parts thereof
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10048 — Infrared image
    • G06T 2207/20 — Special algorithmic details
    • G06T 2207/20172 — Image enhancement details
    • G06T 2207/20208 — High dynamic range [HDR] image processing
    • G06T 2207/20212 — Image combination
    • G06T 2207/20221 — Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application provides an image processing method, an image processing device, electronic equipment and a storage medium, which are used for improving the signal-to-noise ratio in a wide dynamic scene. The method comprises the following steps: acquiring a first image and a second image, wherein the first image is a visible light image, and the second image is an infrared light image; respectively acquiring a dark frame image and a bright frame image in the first image; fusing the second image and the bright frame image to obtain a third image; and fusing the third image and the dark frame image to obtain a fourth image.

Description

Image processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
In practical security-monitoring applications, the photosensitive characteristics of the sensor make it difficult for a camera to capture both the brightest and the darkest areas of the same scene, so some areas of the picture may be too bright or too dark to see clearly, and important information in the image may be lost. To address this, wide dynamic range (WDR) technology exposes the image sensor multiple times and passes the resulting exposures to an image processor for wide dynamic computation, which outputs an image of bright and uniform appearance.
At present, in a wide dynamic range scene (one in which the monitored scene has high contrast), the darker areas require more exposure, that is, a higher gain than the brighter areas of the same picture, so it is difficult to guarantee a high signal-to-noise ratio on the basis of wide dynamic technology alone.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, electronic equipment and a storage medium, which are used for improving the signal-to-noise ratio in a wide dynamic scene.
In a first aspect, an image processing method is provided, the method comprising:
acquiring a first image and a second image, wherein the first image is a visible light image, and the second image is an infrared light image;
respectively acquiring a dark frame image and a bright frame image in the first image;
fusing the second image and the bright frame image to obtain a third image;
and fusing the third image and the dark frame image to obtain a fourth image.
Optionally, fusing the second image and the bright frame image to obtain a third image, including:
determining a non-overexposed region on the bright frame image;
determining a first area corresponding to the non-overexposed area in the second image;
and fusing the first area and the non-overexposure area to obtain a third image.
Optionally, before the fusing the first region and the non-overexposed region to obtain a third image, the method further includes:
determining first brightness of the non-overexposed area, wherein the first brightness is average brightness of the non-overexposed area;
determining target brightness of the second image according to the first brightness, wherein the target brightness is in a proportional relation with the first brightness;
and adjusting the second brightness of the first area on the second image to be the target brightness.
Optionally, adjusting the second brightness of the first area on the second image to the target brightness includes:
acquiring the gain of the light supplement lamp;
when the gain is smaller than a preset gain threshold value, judging whether the second brightness can reach the target brightness;
if the second brightness reaches the target brightness, determining that the current exposure state of the light supplement lamp is a target exposure state, so that the second brightness reaches the target brightness;
and if the second brightness does not reach the target brightness, increasing the gain and/or increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
Optionally, when the second brightness does not reach the target brightness, increasing the gain and/or increasing the exposure intensity of the fill-in light to make the second brightness reach the target brightness, including:
adjusting the gain to the preset gain threshold;
and when the gain is the preset gain threshold value, if the second brightness does not reach the target brightness, increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
Optionally, when the second brightness does not reach the target brightness, increasing the gain and/or increasing the exposure intensity of the fill-in light to make the second brightness reach the target brightness, including:
and adjusting the gain to be the preset gain threshold, and when adjusting the exposure intensity of the light supplement lamp to be the maximum exposure intensity, if the second brightness still cannot reach the target brightness, continuing to increase the gain so as to enable the second brightness to reach the target brightness.
Optionally, fusing the first region and the non-overexposed region to obtain a third image, including:
acquiring the brightness value of each area in the non-overexposure area;
determining a second area of the non-overexposure area, the brightness value of which is greater than a preset brightness threshold value, and a third area of which the brightness value is less than or equal to the brightness threshold value;
fusing the third region with the first region;
and the third image is formed by the second area and the image obtained by fusing the third area and the first area.
In a second aspect, there is provided an image processing apparatus, the apparatus comprising:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring a first image and a second image, the first image is a visible light image, and the second image is an infrared light image;
the acquisition module is further used for acquiring a dark frame image and a bright frame image in the first image;
the processing module is used for fusing the second image with the bright frame image to obtain a third image;
the processing module is further configured to fuse the third image with the dark frame image to obtain a fourth image.
Optionally, the processing module is specifically configured to:
determining a non-overexposed region on the bright frame image;
determining a first area corresponding to the non-overexposed area in the second image;
and fusing the first area and the non-overexposure area to obtain a third image.
Optionally, before the processing module fuses the first region and the non-overexposed region to obtain a third image, the processing module is further configured to:
determining first brightness of the non-overexposed area, wherein the first brightness is average brightness of the non-overexposed area;
determining target brightness of the second image according to the first brightness, wherein the target brightness is in a proportional relation with the first brightness;
and adjusting the second brightness of the first area on the second image to be the target brightness.
Optionally, the processing module is specifically configured to:
acquiring the gain of the light supplement lamp;
when the gain is smaller than a preset gain threshold value, judging whether the second brightness can reach the target brightness;
if the second brightness reaches the target brightness, determining that the current exposure state of the light supplement lamp is a target exposure state, so that the second brightness reaches the target brightness;
and if the second brightness does not reach the target brightness, increasing the gain and/or increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
Optionally, the processing module is specifically configured to:
adjusting the gain to the preset gain threshold;
and when the gain is the preset gain threshold value, if the second brightness does not reach the target brightness, increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
Optionally, the processing module is specifically configured to:
and adjusting the gain to be the preset gain threshold, and when adjusting the exposure intensity of the light supplement lamp to be the maximum exposure intensity, if the second brightness still cannot reach the target brightness, continuing to increase the gain so as to enable the second brightness to reach the target brightness.
Optionally, the processing module is specifically configured to:
acquiring the brightness value of each area in the non-overexposure area;
determining a second area of the non-overexposure area, the brightness value of which is greater than a preset brightness threshold value, and a third area of which the brightness value is less than or equal to the brightness threshold value;
fusing the third region with the first region;
and the third image is formed by the second area and the image obtained by fusing the third area and the first area.
In a third aspect, an electronic device is provided, which includes:
a memory for storing program instructions;
a processor for calling the program instructions stored in the memory and executing the steps comprised in any of the methods of the first aspect according to the obtained program instructions.
In a fourth aspect, there is provided a computer-readable storage medium having stored thereon computer-executable instructions for causing a computer to perform the steps included in the method of any one of the first aspects.
In a fifth aspect, a computer program product containing instructions is provided, which when run on a computer causes the computer to perform the image processing method described in the various possible implementations described above.
In the embodiments of the application, a visible light image and an infrared light image captured by the camera are obtained; the dark frame image (smaller exposure) and the bright frame image (larger exposure) of the visible light image are obtained; the bright frame image and the infrared light image are fused to obtain a third image (i.e., a new bright frame image); and the new bright frame image and the dark frame image are then fused to obtain the final fused image. In other words, the bright frame image of the visible light image is fused with the infrared light image before the wide dynamic fusion with the dark frame image. Because the infrared light image is illuminated by infrared light, its lighting conditions are good and its gain is smaller than that of the bright frame image of the visible light image, so its signal-to-noise ratio is higher than that of the bright frame image. Consequently, after the new bright frame image and the dark frame image are fused, the fused image has a higher signal-to-noise ratio, effectively improving the signal-to-noise ratio in a wide dynamic scene.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application.
FIG. 1 is a diagram illustrating a fusion mechanism for fusing images by a wide dynamic technique in the related art;
FIG. 2 is a schematic diagram of a fusion mechanism for fusing images by a wide dynamic technique according to an embodiment of the present application;
fig. 3 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 4 is a timing diagram of a visible light image and an infrared light image provided by an embodiment of the present application;
fig. 5 is a block diagram of an image processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a computer device in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the embodiments of the present application will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. In the present application, the embodiments and features of the embodiments may be arbitrarily combined with each other without conflict. Also, while a logical order is shown in the flow diagrams, in some cases, the steps shown or described may be performed in an order different than here.
The terms "first" and "second" in the description, claims, and drawings of the present application are used to distinguish between different objects, not to describe a particular order. Furthermore, the term "comprises" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those steps or elements listed, but may include other steps or elements not listed or inherent to such process, method, article, or apparatus. "Plurality" in the present application means at least two, for example, two, three, or more; the embodiments of the present application are not limited in this respect.
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" in this document generally indicates that the preceding and following related objects are in an "or" relationship unless otherwise specified.
In order to facilitate understanding of the technical solutions provided by the embodiments of the present invention, some key terms used in the embodiments of the present invention are explained first.
1) Gain: the amplification factor applied to the exposure intensity of the fill light (light supplement lamp).
2) Signal-to-noise ratio (SNR): the ratio of signal to noise in an image; the higher the SNR, the better the image quality.
For example, a larger gain means a larger amplification of the fill light's exposure intensity and therefore more noise in the image, so the SNR of the image is lower. Conversely, the smaller the gain, the higher the SNR of the image.
For ease of understanding, the technical background of the embodiments of the present invention will be described below.
At present, regarding the difficulty of ensuring a high signal-to-noise ratio on the basis of wide dynamic technology, referring to fig. 1, one scheme is as follows: after the camera acquires a visible light image A and an infrared light image B, wide dynamic fusion is first performed on the bright frame image A2 and the dark frame image A1 of the visible light image A, and the resulting color image is then fused with the infrared light image B. Because wide dynamic processing has already merged the bright frame image A2 and the dark frame image A1, the bright frame can no longer be distinguished from the dark frame in the fused data, so the appropriate brightness of the infrared light image B cannot be determined from the available information. If the brightness of the infrared light image B then turns out to be inappropriate, problems follow: when its brightness is too high, the area of the infrared light image B finally usable for fusion is small; when its brightness is too low, the overall information of the infrared light image B is insufficient (for example, some detailed information cannot be acquired). Either way, the finally obtained fused image is of poor quality.
In another scheme, a visible light image and an infrared light image captured by the camera are acquired; before the two are fused, exposure processing is applied to the visible light image to obtain more image information, and wide dynamic processing is applied to the infrared light image to obtain more detail information, after which the processed visible light image and the processed infrared light image are fused. However, the wide dynamic processing of the infrared light image amplifies detail that is not obvious in the visible light image (such as scratches on a license plate), so the finally obtained fused image appears unnatural.
In view of this, an embodiment of the present application provides an image processing method. Referring to fig. 2, the bright frame image A2 of the visible light image A is first fused with the infrared light image B to obtain a new bright frame image, and wide dynamic fusion is then performed between the new bright frame image and the dark frame image A1 of the visible light image A to obtain the final wide dynamic image D. Because the bright frame image A2 is fused directly with the infrared light image B, the brightness of the infrared light image B can be adjusted according to the brightness of the bright frame image A2, which effectively avoids the poor image quality caused by inappropriate brightness of the infrared light image B. When the new bright frame image and the dark frame image A1 are fused, the dark frame image A1 carries no information from the infrared light image B, so detail that is not obvious in the visible light image A is not over-amplified. Moreover, since the infrared light image B has a higher signal-to-noise ratio, the new bright frame image has a higher signal-to-noise ratio than the original bright frame image A2 had before fusion. This reduces the signal-to-noise gap between the dark frame image A1 and the new bright frame image, makes the final denoising easier, and effectively improves the signal-to-noise ratio in a wide dynamic scene.
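The fusion order just described can be sketched per pixel as follows. This is only an illustration of the ordering, not the patented implementation: the simple averaging and luminance weighting stand in for the actual fusion operators, and the function name and overexposure threshold are assumptions.

```python
def fuse_pixels(bright, dark, ir, overexposure_threshold=235):
    """Per-pixel sketch of the fusion order above (illustrative only).

    bright, dark, ir: lists of 0-255 luminance values for the same pixels,
    taken from the bright frame, the dark frame, and the infrared image.
    """
    fused = []
    for b, d, i in zip(bright, dark, ir):
        # Step 1: new bright frame ("third image"): blend in the IR pixel
        # only where the bright frame is not overexposed.
        new_bright = (b + i) / 2 if b < overexposure_threshold else b
        # Step 2: wide-dynamic fusion with the dark frame ("fourth image");
        # a simple luminance weight stands in for the real WDR operator,
        # leaning on the dark frame where the bright frame is saturated.
        w = b / 255.0
        fused.append(round((1 - w) * new_bright + w * d))
    return fused
```

Running this on a pair of pixels, one normally exposed and one overexposed, shows the overexposed pixel being represented mostly by the dark frame, as the scheme above requires.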
After introducing the design concept of the embodiment of the present application, some simple descriptions are provided below for application scenarios to which the technical solution of the embodiment of the present application can be applied, and it should be noted that the application scenarios described below are only used for describing the embodiment of the present application and are not limited. In specific implementation, the technical scheme provided by the embodiment of the application can be flexibly applied according to actual needs.
In the embodiment of the present application, the provided image processing method and image processing apparatus may be applied to an image capturing system, in which a shooting device is provided, for example, a camera, an electronic device such as a mobile phone or a tablet with a camera function, a monitoring device, or the like, which can shoot an image. The shooting equipment comprises an infrared lamp and two image sensors, wherein the infrared lamp is used for supplementing light for an infrared light image so as to adjust the brightness of the infrared light image, and the two image sensors are respectively used for acquiring a visible light image and an infrared light image.
The following describes an image processing method provided by an embodiment of the present application with reference to the drawings of the specification. Referring to fig. 3, a flow of the image processing method in the embodiment of the present application is described as follows:
step 301: a visible light image and an infrared light image are acquired.
As described above, in the embodiments of the application the photographing device includes an infrared lamp and two image sensors. When the photographing device (e.g., a camera) is in operation, light reflected from the scene is split by a beam-splitting prism into two portions, one visible light and one infrared light. Image sensor a acquires the visible light image from the reflected visible light, and image sensor b acquires the infrared light image from the reflected infrared light. The infrared lamp supplements the reflected infrared light. In the embodiments of the application, the acquired infrared light image and visible light image depict the same picture.
Step 302: Acquire a dark frame image and a bright frame image from the visible light image.
At present, the most common wide dynamic technique is multiple-exposure wide dynamic (also called multi-frame-imaging wide dynamic); this application mainly concerns two-frame-imaging wide dynamic. In the embodiments of the application, the dark frame image is the frame captured with the smaller exposure, and the bright frame image is the frame captured with the larger exposure. The dark frame image may correspond to the high-brightness (bright) areas of the actual scene, and the bright frame image to the low-brightness (dark) areas. Bright areas of the scene require little exposure and so yield a darker image, i.e., the dark frame image has lower brightness; dark areas of the scene require more exposure and so yield a brighter image, i.e., the bright frame image has higher brightness.
In a specific implementation process, because the two-frame imaging wide dynamic technology is adopted, only two-frame fusion is involved in an actual imaging process, and compared with the traditional multi-frame imaging wide dynamic technology, the calculation amount is relatively small, and real-time operation can be realized.
Step 303: Fuse the infrared light image with the bright frame image of the visible light image to obtain a third image.
In the embodiments of the application, after the dark frame image and the bright frame image of the visible light image are obtained, the infrared light image is supplemented with light by the infrared lamp, so its illumination conditions are good and its brightness is high. The infrared light image and the high-brightness bright frame image of the visible light image are then fused to obtain a new bright frame image, namely the third image.
In a possible implementation, when fusing the infrared light image with the bright frame image of the visible light image, a non-overexposed region of the bright frame image may first be determined according to the brightness values in the bright frame image. The position corresponding to that non-overexposed region is then located in the infrared light image; this corresponding position is the first region of the infrared light image. Finally, the image of the first region of the infrared light image and the image of the non-overexposed region of the bright frame image are fused to obtain the third image. The non-overexposed region is the part of the picture whose brightness does not exceed the overexposure brightness threshold. At this point the third image still includes the overexposed region, which did not take part in the fusion; areas whose brightness exceeds the overexposure brightness threshold are mainly represented by the dark frame image.
In a specific implementation, because the brightness of an overexposed area is too high, the image there is washed out and displays poorly. By acquiring the non-overexposed area of the bright frame image and fusing it with the corresponding first region of the infrared image, the obtained third image avoids the poor display caused by overexposure, and the noise of the third image is effectively reduced.
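The region selection and fusion described above can be sketched as follows. This is a minimal illustration: the helper names and the threshold value of 235 are assumptions, and simple averaging stands in for the real fusion operator.

```python
def non_overexposed_region(bright_frame, threshold=235):
    """Positions whose brightness does not exceed the overexposure threshold."""
    return [(r, c)
            for r, row in enumerate(bright_frame)
            for c, v in enumerate(row)
            if v <= threshold]

def fuse_non_overexposed(bright_frame, ir_image, threshold=235):
    """Fuse the first region of the IR image into the non-overexposed region
    of the bright frame; overexposed pixels are left untouched, to be
    represented mainly by the dark frame later."""
    third = [row[:] for row in bright_frame]   # start from the bright frame
    for r, c in non_overexposed_region(bright_frame, threshold):
        # the same positions in the IR image form the "first region"
        third[r][c] = (bright_frame[r][c] + ir_image[r][c]) // 2
    return third
```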
In a possible implementation, before the first region of the infrared light image and the non-overexposed region of the bright frame image are fused, the first brightness of the non-overexposed region is determined, where the first brightness is the average brightness of the non-overexposed region. The target brightness of the infrared light image (the second image), i.e. the brightness to which the first region should be adjusted, is then determined from the first brightness; the target brightness is proportional to the first brightness. After the target brightness is determined, the second brightness of the first region of the infrared light image is adjusted to the target brightness.
For example, after the non-overexposed area of the bright frame image is acquired, the brightness value of each sub-area of the non-overexposed area is recorded, the average brightness ev_long is calculated from those values, and the target brightness ev_inf of the infrared light image is then determined from the average brightness, so that the target brightness of the infrared light image is proportional to the average brightness of the non-overexposed area of the bright frame image. After the target brightness is determined, the second brightness of the first region is adjusted to the target brightness. The proportional relation between the target brightness of the infrared light image and the average brightness of the non-overexposed area is expressed as:
ev_inf = k × ev_long
To make the final third image most effective, k is generally taken as 1; that is, when the first region of the infrared light image is fused with the non-overexposed region of the bright frame image, the brightness of the first region is substantially the same as that of the non-overexposed region.
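The proportional relation ev_inf = k × ev_long amounts to a one-line helper (the function name is an assumption):

```python
def target_brightness(non_overexposed_values, k=1.0):
    """ev_inf = k * ev_long: the target brightness of the infrared image is
    proportional to the average brightness ev_long of the non-overexposed
    area of the bright frame (k defaults to 1, as the description suggests)."""
    ev_long = sum(non_overexposed_values) / len(non_overexposed_values)
    return k * ev_long
```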
In one possible embodiment, the gain of the fill light (for example, the infrared lamp) and a preset gain threshold are obtained before the second brightness of the first region is adjusted. The gain is the amplification factor of the fill light's exposure intensity; the larger the gain, the larger the amplification. There may be a single preset gain threshold or several set simultaneously. With a single threshold, the initial gain of the fill light is generally smaller than the threshold. When several thresholds are set, for example four (a first, second, third, and fourth preset gain threshold, arranged from small to large), the initial gain of the fill light is generally smaller than the first preset gain threshold.
When the gain of the fill-in lamp is smaller than the preset gain threshold (or the first preset gain threshold), it is judged whether the current brightness of the first region can reach the target brightness. If the current brightness reaches the target brightness, the current exposure state of the fill-in lamp is kept unchanged, that is, it is determined as the final target exposure state. If the current brightness does not reach the target brightness, the second brightness of the first area in the infrared light image is brought to the target brightness by adjusting the gain of the fill-in lamp and/or the exposure intensity of the fill-in lamp. Since a certain error may exist in determining the average brightness of the non-overexposed area and in adjusting the second brightness, a certain tolerance is allowed between the second brightness and the target brightness; that is, a brightness difference within the error range is acceptable.
In this embodiment of the application, when the gain of the fill-in lamp is smaller than the preset gain threshold (or the first preset gain threshold) and the second brightness does not reach the target brightness, the gain of the fill-in lamp and/or the exposure intensity of the fill-in lamp may be adjusted in multiple manners, which are exemplified below for ease of understanding.
First adjustment mode
When the second brightness of the first area is adjusted to the target brightness, first the exposure intensity of the fill-in lamp is kept unchanged and the gain of the fill-in lamp is adjusted to the preset gain threshold (or the first preset gain threshold); it is then judged whether the second brightness of the first area can reach the target brightness. When the second brightness reaches the target brightness, the current exposure state of the fill-in lamp is determined as the target exposure state. If the second brightness exceeds the target brightness when the gain is at the preset gain threshold (or the first preset gain threshold), the exposure intensity of the fill-in lamp is reduced until the second brightness equals the target brightness, and the exposure intensity at that moment is determined as the target exposure state. If the second brightness does not reach the target brightness at that gain, the exposure intensity of the fill-in lamp is increased until the second brightness reaches the target brightness, and the exposure intensity at that moment is determined as the target exposure state.
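A minimal sketch of the first adjustment mode, assuming a hypothetical `measure(gain, intensity)` callback that returns the measured second brightness of the first region; the step size, tolerance, and intensity range are illustrative, not from the patent:

```python
def adjust_first_mode(measure, intensity, gain_thresh, target, tol=2.0,
                      step=1.0, max_intensity=100.0):
    """First adjustment mode (sketch): fix the gain at the preset
    threshold, then trim the exposure intensity up or down until the
    measured second brightness matches the target within a tolerance.
    `measure(gain, intensity)` is a hypothetical callback."""
    gain = gain_thresh                          # gain -> preset threshold first
    while True:
        brightness = measure(gain, intensity)
        if abs(brightness - target) <= tol:     # within tolerance: target state
            return gain, intensity
        if brightness > target and intensity > 0.0:
            intensity -= step                   # too bright: dim the lamp
        elif brightness < target and intensity < max_intensity:
            intensity += step                   # too dark: brighten the lamp
        else:
            return gain, intensity              # intensity range exhausted
```

The tolerance `tol` implements the error allowance described above: the loop stops as soon as the measured brightness is within the allowed difference of the target.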
Second adjustment mode
If, after the gain of the fill-in lamp has been adjusted to the preset gain threshold (or the first preset gain threshold) and the exposure intensity of the fill-in lamp has been adjusted to the maximum intensity, the second brightness still cannot reach the target brightness, the gain of the fill-in lamp is increased further so that the second brightness reaches the target brightness.
In one possible embodiment, only one preset gain threshold is used, and the gain of the fill-in lamp is continuously increased until the second brightness can reach the target brightness.
In another possible embodiment, there are multiple preset thresholds (e.g., four). In this case, the gain of the fill-in lamp is adjusted to the second preset gain threshold, and it is determined whether the second brightness can reach the target brightness. If the second brightness reaches and equals the target brightness, the current gain and exposure intensity of the fill-in lamp are determined as the target gain and target exposure intensity. If the second brightness exceeds the target brightness, the exposure intensity of the fill-in lamp is reduced until the second brightness equals the target brightness, and the current gain and exposure intensity are determined as the target gain and target exposure intensity. If the second brightness still does not reach the target brightness, the gain of the fill-in lamp is adjusted to the third preset gain threshold and the same determination and operations are performed as at the second preset gain threshold; details are not repeated here. If the second brightness does not reach the target brightness even when the gain of the fill-in lamp reaches the fourth preset gain threshold, the gain is increased further until the second brightness reaches the target brightness, and the exposure state of the fill-in lamp at that moment is determined as the target exposure state.
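The multi-threshold variant can be sketched as follows; `measure` is again a hypothetical brightness callback, and the dimming and gain step sizes are illustrative assumptions:

```python
def adjust_with_thresholds(measure, thresholds, intensity, target, tol=2.0):
    """Second adjustment mode with several preset gain thresholds
    (sketch): try each threshold from smallest to largest; on overshoot
    reduce the exposure intensity, and past the last threshold keep
    raising the gain. `measure(gain, intensity)` is a hypothetical
    callback returning the first region's brightness."""
    for gain in thresholds:                     # thresholds sorted ascending
        brightness = measure(gain, intensity)
        if abs(brightness - target) <= tol:
            return gain, intensity              # target gain and intensity
        if brightness > target:                 # overshoot: dim the lamp
            while brightness > target + tol and intensity > 0.0:
                intensity -= 1.0
                brightness = measure(gain, intensity)
            return gain, intensity
    gain = thresholds[-1]                       # beyond the last threshold:
    while measure(gain, intensity) < target - tol:
        gain += 1.0                             # keep increasing the gain
    return gain, intensity
```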
In a specific implementation process, adjusting the brightness of the infrared light image according to the brightness of the bright frame image effectively avoids the problem of a poor infrared image caused by inappropriate brightness. When the contrast of the monitored scene is relatively large, the brightness difference between bright and dark areas is also large; in this case the gain difference between the dark frame image and the bright frame image is large as well. Since a larger gain yields more image noise and a lower signal-to-noise ratio, a larger gain difference also yields a larger signal-to-noise-ratio difference between the dark frame image and the bright frame image, which makes Image Signal Processing (ISP) noise reduction more difficult. However, because the infrared light image is illuminated by infrared light, its noise is smaller and its signal-to-noise ratio is higher. When the bright frame image and the infrared light image are fused, the signal-to-noise ratio of the fused image is therefore effectively improved: the new bright frame image has a higher signal-to-noise ratio, closer to that of the dark frame image, so ISP noise reduction becomes easier and the noise of the fusion area is smaller, achieving the purpose of improving the signal-to-noise ratio of the wide dynamic fusion image. Noise reduction may also be performed by an IC; in the embodiment of the present application, ISP noise reduction is taken as an example.
In the embodiment of the application, the brightness of the fill-in lamp is adjusted only through its gain and exposure intensity; the exposure time is not adjusted. Consequently, the timings at which the dark frame image, the bright frame image and the infrared light image are generated are exactly the same: their exposure start and end points are identical, and their frame start and end points are identical, so the problem of smear is effectively avoided. Specifically, referring to fig. 4, fig. 4 is a timing chart of the bright frame image, the dark frame image and the infrared light image provided by the embodiment of the present application when the gains are different.
In a possible embodiment, when the first region and the non-overexposed region are fused, a region of the non-overexposed area with a larger brightness value is more likely to be used directly in the fused image. That is, when the brightness value of a region in the non-overexposed area exceeds a preset brightness threshold, the corresponding region in the first region is discarded, and the region whose brightness exceeds the threshold is not fused with the corresponding region of the first region. Therefore, before the first region and the non-overexposed region are fused, the brightness value of each region in the non-overexposed area can be obtained; regions with brightness values greater than the preset brightness threshold are determined as the second area, and regions with brightness values less than or equal to the threshold are determined as the third area. When the first area and the non-overexposed area are fused, the third area is fused with the first area, and the image obtained by this fusion, together with the image of the second area, is determined as the third image.
In a specific implementation process, a region of the bright frame image whose brightness value exceeds the brightness threshold already presents well: its image noise is low, that is, its signal-to-noise ratio is high, so there is no need to improve its signal-to-noise ratio by fusing it with the infrared light image.
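A sketch of this region split and fusion, assuming grayscale uint8 frames. The overexposure threshold, brightness threshold, and the 50/50 averaging weight are illustrative assumptions; the patent does not fix a particular blending function:

```python
import numpy as np

def fuse_bright_and_infrared(bright, infrared,
                             overexp_thresh=250, keep_thresh=200, w=0.5):
    """Split the bright frame's non-overexposed area into a second region
    (brightness above keep_thresh, kept as-is) and a third region (fused
    with the corresponding infrared pixels by a weighted average)."""
    out = bright.astype(np.float32).copy()
    non_overexposed = bright < overexp_thresh
    third = non_overexposed & (bright <= keep_thresh)  # regions to fuse
    out[third] = w * bright[third] + (1.0 - w) * infrared[third]
    return out.astype(np.uint8)                        # the third image
```

Pixels above `keep_thresh` (the second area) and overexposed pixels pass through unchanged; only the third area receives infrared information.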
Step 304: carrying out wide dynamic fusion on the third image and the dark frame image in the visible light image to obtain a fourth image.
In the embodiment of the present application, the wide dynamic fusion of the dark frame image in the visible light image and the third image is a multi-exposure fusion. The exposure ratio of the dark frame image to the third image is the same as the exposure ratio of the dark frame image to the bright frame image in the visible light image; that is, the dark frame image and the third image (the new bright frame image) are fused according to the exposure ratio of the dark frame image and the bright frame image in the visible light image, to obtain the fourth image (the final wide dynamic image).
In a specific implementation process, when the dark frame image is fused with the new bright frame image, the dark frame image no longer acquires information from the infrared light image, so a bright area (such as a license plate) does not carry excess information and the picture is more natural.
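Step 304 can be sketched as follows under stated assumptions: the patent fixes only that the exposure ratio of the dark frame to the third image equals that of the original dark/bright pair; the per-pixel blending weight below is an illustrative choice that favours the dark frame where the bright frame saturates:

```python
import numpy as np

def wide_dynamic_fuse(third_image, dark_frame, exposure_ratio=4.0):
    """Fuse the new bright frame (third image) with the dark frame using
    the exposure ratio of the original bright/dark pair. The per-pixel
    weight is 0 where the bright frame is saturated, so those pixels
    come from the exposure-compensated dark frame."""
    t = third_image.astype(np.float32)
    d = dark_frame.astype(np.float32)
    weight = np.clip((255.0 - t) / 255.0, 0.0, 1.0)    # 0 where saturated
    fused = weight * t + (1.0 - weight) * d * exposure_ratio
    return np.clip(fused, 0.0, 255.0).astype(np.uint8) # the fourth image
```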
Based on the same inventive concept, the embodiment of the present application provides an image processing apparatus, which can realize the functions of the aforementioned image processing method. The image processing apparatus may be a hardware structure, a software module, or a combination of the two. The image processing apparatus can be realized by a chip system, and the chip system may consist of a chip or may comprise a chip and other discrete devices. Referring to fig. 5, the image processing apparatus includes an obtaining module 501 and a processing module 502. Wherein:
an obtaining module 501, configured to obtain a first image and a second image, where the first image is a visible light image and the second image is an infrared light image;
the obtaining module 501 is further configured to obtain a dark frame image and a bright frame image in the first image;
a processing module 502, configured to fuse the second image and the bright frame image to obtain a third image;
the processing module 502 is further configured to fuse the third image with the dark frame image to obtain a fourth image.
In a possible implementation, the processing module 502 is specifically configured to:
determining a non-overexposed region on the bright frame image;
determining a first area corresponding to the non-overexposed area in the second image;
and fusing the first area and the non-overexposure area to obtain a third image.
In a possible implementation manner, before the processing module 502 fuses the first region and the non-overexposed region to obtain a third image, the processing module is further configured to:
determining first brightness of the non-overexposed area, wherein the first brightness is average brightness of the non-overexposed area;
determining target brightness of the second image according to the first brightness, wherein the target brightness is in a proportional relation with the first brightness;
and adjusting the second brightness of the first area on the second image to be the target brightness.
In a possible implementation, the processing module 502 is specifically configured to:
acquiring the gain of the light supplement lamp;
when the gain is smaller than a preset gain threshold value, judging whether the second brightness can reach the target brightness;
if the second brightness reaches the target brightness, determining that the current exposure state of the light supplement lamp is a target exposure state, so that the second brightness reaches the target brightness;
and if the second brightness does not reach the target brightness, increasing the gain and/or increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
In a possible implementation, the processing module 502 is specifically configured to:
adjusting the gain to the preset gain threshold;
and when the gain is the preset gain threshold value, if the second brightness does not reach the target brightness, increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
In a possible implementation, the processing module 502 is specifically configured to:
and adjusting the gain to be the preset gain threshold, and when adjusting the exposure intensity of the light supplement lamp to be the maximum exposure intensity, if the second brightness still cannot reach the target brightness, continuing to increase the gain so as to enable the second brightness to reach the target brightness.
In a possible implementation, the processing module 502 is specifically configured to:
acquiring the brightness value of each area in the non-overexposure area;
determining a second area of the non-overexposure area, the brightness value of which is greater than a preset brightness threshold value, and a third area of which the brightness value is less than or equal to the brightness threshold value;
fusing the third region with the first region;
and the third image is formed by the second area and the image obtained by fusing the third area and the first area.
For all relevant contents of the steps in the foregoing embodiment of the image processing method, reference may be made to the functional descriptions of the corresponding functional modules of the image processing apparatus in the embodiment of the present application, and details are not described herein again.
The division of the modules in the embodiments of the present application is schematic and is only one logical function division; in actual implementation there may be other division manners. In addition, each functional module in each embodiment of the present application may be integrated in one processor, may exist alone physically, or two or more modules may be integrated in one module. The integrated module can be realized in hardware or as a software functional module.
Based on the same inventive concept, the embodiment of the application provides an electronic device. Referring to fig. 6, the electronic device includes at least one processor 601 and a memory 602 connected to the at least one processor. The specific connection medium between the processor 601 and the memory 602 is not limited in this application; in fig. 6 the processor 601 and the memory 602 are connected by a bus 600 as an example, the bus 600 is represented by a thick line, and the connection manner between other components is only schematically illustrated and is not limited. The bus 600 may be divided into an address bus, a data bus, a control bus, etc., and is shown with only one thick line in fig. 6 for ease of illustration, but this does not mean there is only one bus or one type of bus.
In the embodiment of the present application, the memory 602 stores instructions executable by the at least one processor 601, and the at least one processor 601 may execute the steps included in the foregoing image processing method by executing the instructions stored in the memory 602.
The processor 601 is a control center of the electronic device and may connect various parts of the whole electronic device by using various interfaces and lines. By operating or executing instructions stored in the memory 602 and calling data stored in the memory 602, it performs various functions of the electronic device and processes data, thereby performing overall monitoring of the electronic device. Optionally, the processor 601 may include one or more processing units, and may integrate an application processor, which mainly handles the operating system and application programs, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may also not be integrated into the processor 601. In some embodiments, the processor 601 and the memory 602 may be implemented on the same chip, or in some embodiments, they may be implemented separately on separate chips.
The processor 601 may be a general-purpose processor, such as a Central Processing Unit (CPU), digital signal processor, application specific integrated circuit, field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like, that may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the present application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the image processing method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
The memory 602, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory 602 may include at least one type of storage medium, for example a flash memory, a hard disk, a multimedia card, a card-type memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic memory, a magnetic disk, an optical disk, and so on. The memory 602 may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 602 in the embodiments of the present application may also be circuitry or any other device capable of performing a storage function, for storing program instructions and/or data.
By programming the processor 601, the code corresponding to the image processing method described in the foregoing embodiment may be solidified in the chip, so that the chip can execute the steps of the image processing method when running, and how to program the processor 601 is a technique known by those skilled in the art, and is not described herein again.
Based on the same inventive concept, embodiments of the present application further provide a computer-readable storage medium storing computer instructions, which, when executed on a computer, cause the computer to perform the steps of the image processing method as described above.
In some possible embodiments, the aspects of the image processing method provided by the present application may also be implemented in the form of a program product, which includes program code for causing a detection apparatus to perform the steps in the image processing method according to various exemplary embodiments of the present application described above in this specification, when the program product is run on an electronic device.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring a first image and a second image, wherein the first image is a visible light image, and the second image is an infrared light image;
respectively acquiring a dark frame image and a bright frame image in the first image; wherein the dark frame image exposure is less than the bright frame image exposure;
fusing the second image and the bright frame image to obtain a third image;
and fusing the third image and the dark frame image to obtain a fourth image.
2. The method of claim 1, wherein fusing the second image with the bright frame image to obtain a third image comprises:
determining a non-overexposed region on the bright frame image;
determining a first area corresponding to the non-overexposed area in the second image;
and fusing the first area and the non-overexposure area to obtain a third image.
3. The method of claim 2, wherein prior to fusing the first region with the non-overexposed region to obtain a third image, further comprising:
determining first brightness of the non-overexposed area, wherein the first brightness is average brightness of the non-overexposed area;
determining target brightness of the second image according to the first brightness, wherein the target brightness is in a proportional relation with the first brightness;
and adjusting the second brightness of the first area on the second image to be the target brightness.
4. The method of claim 3, wherein adjusting the second brightness of the first region on the second image to the target brightness comprises:
acquiring the gain of the light supplement lamp;
when the gain is smaller than a preset gain threshold value, judging whether the second brightness can reach the target brightness;
if the second brightness reaches the target brightness, determining that the current exposure state of the light supplement lamp is a target exposure state, so that the second brightness reaches the target brightness;
and if the second brightness does not reach the target brightness, increasing the gain and/or increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
5. The method of claim 4, wherein increasing the gain and/or increasing the exposure intensity of the fill light to make the second brightness reach the target brightness when the second brightness does not reach the target brightness comprises:
adjusting the gain to the preset gain threshold;
and when the gain is the preset gain threshold value, if the second brightness does not reach the target brightness, increasing the exposure intensity of the light supplement lamp so as to enable the second brightness to reach the target brightness.
6. The method of claim 4, wherein increasing the gain and/or increasing the exposure intensity of the fill light to make the second brightness reach the target brightness when the second brightness does not reach the target brightness comprises:
and adjusting the gain to be the preset gain threshold, and when adjusting the exposure intensity of the light supplement lamp to be the maximum exposure intensity, if the second brightness still cannot reach the target brightness, continuing to increase the gain so as to enable the second brightness to reach the target brightness.
7. The method of claim 2, wherein fusing the first region with the non-overexposed region to obtain a third image comprises:
acquiring the brightness value of each area in the non-overexposure area;
determining a second area of the non-overexposure area, the brightness value of which is greater than a preset brightness threshold value, and a third area of which the brightness value is less than or equal to the brightness threshold value;
fusing the third region with the first region;
and the third image is formed by the second area and the image obtained by fusing the third area and the first area.
8. An image processing apparatus, characterized in that the apparatus comprises:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring a first image and a second image, the first image is a visible light image, and the second image is an infrared light image;
the acquisition module is further used for acquiring a dark frame image and a bright frame image in the first image; wherein the dark frame image exposure is less than the bright frame image exposure;
the processing module is used for fusing the second image with the bright frame image to obtain a third image;
the processing module is further configured to fuse the third image with the dark frame image to obtain a fourth image.
9. An electronic device, comprising:
a memory for storing program instructions;
a processor for calling program instructions stored in said memory and for executing the steps comprised by the method of any one of claims 1 to 7 in accordance with the obtained program instructions.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a computer, cause the computer to perform the method according to any one of claims 1-7.
CN202010482188.3A 2020-06-01 2020-06-01 Image processing method and device, electronic equipment and storage medium Active CN111383206B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010482188.3A CN111383206B (en) 2020-06-01 2020-06-01 Image processing method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111383206A CN111383206A (en) 2020-07-07
CN111383206B true CN111383206B (en) 2020-09-29

Family

ID=71217815

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010482188.3A Active CN111383206B (en) 2020-06-01 2020-06-01 Image processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111383206B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114143418B (en) * 2020-09-04 2023-12-01 聚晶半导体股份有限公司 Dual-sensor imaging system and imaging method thereof
CN112887639A (en) * 2021-01-18 2021-06-01 Oppo广东移动通信有限公司 Image processing method, device, system, electronic device and storage medium
CN113191965B (en) * 2021-04-14 2022-08-09 浙江大华技术股份有限公司 Image noise reduction method, device and computer storage medium
CN113289323B (en) * 2021-06-08 2022-06-14 深圳泰山体育科技有限公司 Optical measurement method and system for long distance running
CN113888455A (en) * 2021-11-05 2022-01-04 Oppo广东移动通信有限公司 Image generation method and device, electronic equipment and computer-readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102572245A (en) * 2011-12-22 2012-07-11 深圳市赛格导航科技股份有限公司 Method and device for extending image dynamic ranges
CN104103053A (en) * 2013-11-25 2014-10-15 北京华科创智健康科技股份有限公司 Electronic endoscope image enhancement method and device
CN104683767A (en) * 2015-02-10 2015-06-03 浙江宇视科技有限公司 Fog penetrating image generation method and device
CN109951646A (en) * 2017-12-20 2019-06-28 杭州海康威视数字技术股份有限公司 Image interfusion method, device, electronic equipment and computer readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109429001B (en) * 2017-08-25 2021-06-29 杭州海康威视数字技术股份有限公司 Image acquisition method and device, electronic equipment and computer readable storage medium
CN109194873B (en) * 2018-10-29 2021-02-02 浙江大华技术股份有限公司 Image processing method and device
CN110248105B (en) * 2018-12-10 2020-12-08 浙江大华技术股份有限公司 Image processing method, camera and computer storage medium


Also Published As

Publication number Publication date
CN111383206A (en) 2020-07-07

Similar Documents

Publication Publication Date Title
CN111383206B (en) Image processing method and device, electronic equipment and storage medium
US9934438B2 (en) Scene recognition method and apparatus
US11228720B2 (en) Method for imaging controlling, electronic device, and non-transitory computer-readable storage medium
EP3609177B1 (en) Control method, control apparatus, imaging device, and electronic device
CN108683862B (en) Imaging control method, imaging control device, electronic equipment and computer-readable storage medium
CN108322646B (en) Image processing method, image processing device, storage medium and electronic equipment
CN111586314B (en) Image fusion method and device and computer storage medium
CN110443766B (en) Image processing method and device, electronic equipment and readable storage medium
CN113747008B (en) Camera and light supplementing method
CN111489320A (en) Image processing method and device
CN113163127A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111953893B (en) High dynamic range image generation method, terminal device and storage medium
CN115278104B (en) Image brightness adjustment method and device, electronic equipment and storage medium
US11956552B2 (en) Method and electronic device for increased dynamic range of an image
CN112822413B (en) Shooting preview method, shooting preview device, terminal and computer readable storage medium
CN111160340B (en) Moving object detection method and device, storage medium and terminal equipment
CN110705336B (en) Image processing method, system, electronic device and readable storage medium
CN105141857A (en) Image processing method and device
CN112543286A (en) Image generation method and device for terminal, storage medium and terminal
CN114615439B (en) Exposure statistical method, device, electronic equipment and medium
CN111131716B (en) Image processing method and electronic device
CN115225824B (en) Automatic exposure method and shooting device
CN115278046B (en) Shooting method, shooting device, electronic equipment and storage medium
WO2024185450A1 (en) Image processing device, image processing method, and program
JP2009044451A (en) Image processor, on-vehicle image processor, and on-vehicle image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant