WO2021082580A1 - Method, device and electronic apparatus for generating a night scene high dynamic range image - Google Patents

Method, device and electronic apparatus for generating a night scene high dynamic range image

Info

Publication number
WO2021082580A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
brightness
denoising
normal exposure
dynamic range
Prior art date
Application number
PCT/CN2020/106786
Other languages
English (en)
Chinese (zh)
Inventor
王涛
陈雪琴
Original Assignee
北京迈格威科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京迈格威科技有限公司
Publication of WO2021082580A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing

Definitions

  • the present invention relates to the technical field of image fusion, and in particular to a method, device and electronic equipment for generating a night scene high dynamic range image.
  • An HDR image is a High Dynamic Range image, referred to as HDR.
  • The HDR image synthesis method can greatly improve the quality of captured images without adding extra hardware, which greatly helps improve the user's shooting experience.
  • However, when this solution is used for night scene shooting, the overall effect is not very good, because the same scene shows a much larger difference between bright and dark regions in a night environment and the image noise differs significantly between different exposures.
  • The general way to handle this is to denoise the images with a better denoising algorithm such as deep network denoising, or to increase the number of image frames used for synthesis.
  • However, algorithms with good denoising effects, such as deep network denoising, require too much hardware and are currently difficult to apply to mobile phones and other devices.
  • If the number of image frames used for synthesis is increased instead, the running time of the denoising algorithm and the image acquisition time become very long, which greatly affects the user experience.
  • In view of this, the problem to be solved by the present invention is how to quickly generate a high-quality night scene high dynamic range image on devices with limited hardware.
  • the present invention first provides a method for generating a night scene high dynamic range image, which includes:
  • the normal exposure result image and the underexposure result image are fused to obtain an HDR image.
  • the multiple frames of normally exposed images are denoised using a multiple frame denoising algorithm.
  • the algorithm has low complexity and can run smoothly in electronic devices with poor hardware, increasing the scope of application; and through multi-frame denoising algorithms, good denoising effects can be achieved.
  • the under-exposed image is denoised by a single frame denoising algorithm.
  • the denoising of underexposed images can be quickly realized, and the speed of HDR image generation can be improved.
  • the fusing the normal exposure result image and the underexposure result image to obtain an HDR image includes:
  • the brightness of the result image is increased to obtain the HDR image.
  • the adjusting the brightness of the normal exposure result image according to the under-exposed image includes:
  • The brightness multiple between the overall brightness of the normally exposed image and that of the underexposed image is determined from the easily obtained sensitivity and exposure time, which is simple and fast, so the brightness of the normal exposure result image can be adjusted quickly, thereby speeding up the generation of the entire night scene HDR image.
  • the formula for determining the brightness multiple is:
  • L is the brightness multiple of the normally exposed image relative to the underexposed image
  • i0 is the sensitivity of the normally exposed image
  • t0 is the exposure time of the normally exposed image
  • i- is the sensitivity of the underexposed image
  • t- is the exposure time of the underexposed image
  • the increasing the brightness of the result image to obtain the HDR image includes:
  • the brightness value of each pixel in the HDR image is determined according to the brightness increase ratio to obtain the HDR image.
  • Pixels with lower brightness values can be kept close to their brightness values in the normal exposure result image, in order to avoid the degradation of HDR image quality that would be caused by increasing their brightness.
  • the formula for determining the brightness increase ratio is:
  • s is the brightness increase ratio of the pixel in the result image
  • p is the brightness of the pixel
  • L is the brightness multiple of the normally exposed image to the underexposed image.
  • the formula for determining the overexposure mask is:
  • E is the overexposure mask of the pixel in the normal exposure result image
  • p is the brightness of the pixel
  • pt is the brightness threshold
  • the fusing the underexposed image and the adjusted normal exposure result image into a result image according to the overexposure mask includes:
  • the underexposed image and the adjusted normal exposure result image are fused into a result image.
  • Image fusion is performed by assigning weights, and the weights are distributed through the overexposure mask, so that the underexposed image has a higher weight in the overexposed areas and the normally exposed image has a higher weight in the remaining areas. This reduces the interference of high noise intensity on the fused image, so the image quality of the fused image is higher.
  • the fusion weight of each pixel in the underexposed image is an overexposure mask of the corresponding pixel in the normal exposure result image.
  • the formula for determining the fusion weight of each pixel in the normal exposure result image after adjustment is:
  • w0 is the fusion weight of the pixel in the adjusted normal exposure result image
  • E is the overexposure mask of the pixel.
  • a device for generating a night scene high dynamic range image which includes:
  • the acquisition unit is used to acquire multiple frames of normally exposed images and one frame of underexposed images
  • a denoising unit to denoise the multiple frames of normally exposed images to obtain a normal exposure result image
  • the fusion unit is used for fusing the normal exposure result image and the underexposure result image to obtain an HDR image.
  • an electronic device which includes a processor and a memory, and the memory stores a control program, and when the control program is executed by the processor, the above-mentioned night scene high dynamic range image generation method is realized.
  • A computer-readable storage medium is further provided, in which instructions are stored; when the instructions are loaded and executed by a processor, the above-mentioned night scene high dynamic range image generation method is implemented.
  • a computer program including computer-readable code, which when the computer-readable code runs on a computing processing device, causes the computing processing device to execute the above-mentioned night scene high dynamic range image generation method.
  • Fig. 1 is a flowchart of a method for generating a night scene high dynamic range image according to an embodiment of the present invention
  • Fig. 2A is a normal exposure image of the same night scene according to an embodiment of the present invention.
  • Fig. 2B is an underexposed image of the same night scene according to an embodiment of the present invention.
  • Fig. 3 is a flowchart of step S40 of a method for generating a night scene high dynamic range image according to an embodiment of the present invention.
  • Fig. 4 is a flowchart of step S42 of a method for generating a night scene high dynamic range image according to an embodiment of the present invention.
  • Fig. 5 is a flowchart of step S44 of a method for generating a night scene high dynamic range image according to an embodiment of the present invention.
  • Fig. 6 is a flowchart of step S43 of a method for generating a night scene high dynamic range image according to an embodiment of the present invention.
  • Fig. 7 is a structural block diagram of a night scene high dynamic range image generating device according to an embodiment of the present invention.
  • Fig. 8 is a structural block diagram of an electronic device according to an embodiment of the present invention.
  • Fig. 9 is a block diagram of another electronic device according to an embodiment of the present invention.
  • 1-Acquisition unit, 2-Denoising unit, 3-Fusion unit, 12-Electronic equipment, 14-External equipment, 16-Processing unit, 18-Bus, 20-Network adapter, 22-Input/output (I/O) interface, 24-Display, 28-System memory, 30-Random access memory, 32-Cache memory, 34-Storage system, 40-Utility tool, 42-Program module.
  • An HDR image is a High Dynamic Range image, referred to as HDR.
  • An LDR image is a Low Dynamic Range image, referred to as LDR.
  • Depending on the exposure, the LDR image is named differently: if the exposure time is insufficient, the obtained LDR image is an underexposed image; if the exposure time is in the normal range, the obtained LDR image is a normally exposed image; and if the exposure time is too long, the obtained LDR image is an overexposed image.
  • the normal exposure image is recorded as ev0, the underexposed image is recorded as ev-, and the overexposed image is recorded as ev+.
  • For shooting devices whose lens aperture can be adjusted, the amount of incoming light is controlled by adjusting the aperture size of the lens, so that underexposed, normally exposed, and overexposed images can be obtained through the exposure time.
  • The cameras of other shooting devices such as mobile phones cannot adjust the aperture size, so underexposed, normally exposed, and overexposed images are obtained by adjusting the ISO (sensitivity) value and/or the exposure time.
  • The main development trend of existing shooting equipment is simplification and integration, that is, integrating shooting functions into other handheld devices such as mobile phones and pads.
  • However, this simplification and integration also brings poorer hardware performance.
  • The HDR image synthesis method can greatly improve the quality of captured images without adding extra hardware, which greatly helps improve the user's shooting experience.
  • However, when this solution is used for night scene shooting, the overall effect is not very good, because the same scene shows a much larger difference between bright and dark regions in a night environment and the image noise differs significantly between different exposures.
  • The general way to handle this is to denoise the images with a better denoising algorithm such as deep network denoising, or to increase the number of image frames used for synthesis.
  • A deep network denoising algorithm or a similar algorithm can be used to denoise the images; its denoising effect is good, so the synthesized HDR image has higher image quality. However, the algorithmic complexity of deep network denoising and similar algorithms is very high, while the hardware performance of mobile phones and similar devices is poor, so it is difficult to run this kind of denoising on them. Therefore, multi-frame denoising is mainly used; but for this method to achieve a better denoising effect, the number of image frames to be synthesized must be increased in order to improve the image quality of the synthesized HDR image.
  • the embodiments of the present disclosure provide a method for generating a night scene high dynamic range image.
  • the method can be executed by a night scene high dynamic range image generating device.
  • The night scene high dynamic range image generating device can be integrated in a mobile phone, a notebook computer, a server, a video camera, a camera, a PAD, and other electronic equipment.
  • As shown in FIG. 1, which is a flowchart of a method for generating a night scene high dynamic range image according to an embodiment of the present invention, the night scene high dynamic range image generation method includes:
  • Step S10 acquiring multiple frames of normally exposed images and one frame of underexposed images
  • The multiple frames of normally exposed images and the underexposed image are all images of the same scene shot at the same angle, and they can be acquired by shooting with the electronic device, by transmission from the electronic device or other equipment, or by other acquisition methods.
  • The number of frames of normally exposed images can be determined according to the actual situation, so that a better denoising effect can be achieved through multi-frame denoising.
  • For example, if multi-frame denoising with 4 frames of normally exposed images cannot achieve a good denoising effect, 8 frames of normally exposed images can be acquired to achieve a good denoising effect after multi-frame denoising. If a good denoising effect can be obtained with another number of normally exposed frames, that number of frames can also be selected.
  • Step S20 Denoising the multiple frames of normally exposed images to obtain a normally exposed result image
  • Denoising can be carried out using a multi-frame denoising method.
  • Through multi-frame superposition, the noise of the normally exposed images is eliminated, and a normal exposure result image with little noise after superposition is obtained.
  • Step S30 denoising the under-exposed image to obtain an under-exposed result image
  • Figures 2A and 2B are, respectively, the normal exposure image and the underexposed image of the same night scene. Taking the normal exposure image as the benchmark, its area can be divided into a normal brightness area, an over-dark area, and an over-bright area. The over-dark area is very dark because the amount of incoming light is insufficient, so its noise is large. In the over-bright area, the brightness value of a pixel can be recorded as at most 255; once the actual brightness exceeds this value, it can only be recorded as 255. That is to say, the over-bright area contains one or more regions whose brightness value is 255, called overexposed areas, in which the brightness values carry no valuable information.
  • The over-dark area of the normally exposed image is even darker and noisier at the corresponding position in the underexposed image, and the normal brightness area of the normally exposed image is also darker and noisier at the corresponding position in the underexposed image. Only the over-bright area of the normally exposed image, because of the overall reduction in brightness, shows more valuable information at the corresponding position in the underexposed image; in particular, the overexposed area has a corresponding clear image in the underexposed image and, because of the higher brightness there, its noise is small.
  • Therefore, during fusion the weight of the normally exposed image is reduced at these positions, namely the over-bright area and especially the overexposed area. Since the noise intensity of the underexposed image is lower in this area, the weight of the underexposed image is increased at these positions, and in the overexposed area all of the weight may even be assigned to the underexposed image. In other words, the higher the brightness in the underexposed image, the lower its noise intensity and the greater its contribution to the image fusion.
  • Step S40 fusing the normal exposure result image and the underexposure result image to obtain an HDR image.
  • The "overexposed area" of the underexposed image (that is, the area corresponding to the overexposed area of the normal exposure result image; calling this part of the underexposed image the overexposed area is only an approximation and does not mean that this area of the underexposed image is itself overexposed) contributes the most to the fused image and is given the largest weight, while its weight in the remaining areas is small. Moreover, this area of the underexposed image has very high brightness and low noise, so the noise in this area can easily be filtered out with a simple denoising method.
  • In this way, by studying the contribution of the underexposed image to the night scene HDR image, only one frame of underexposed image and multiple frames of normally exposed images are used to generate the night scene HDR image. On the one hand, the number of images required for synthesis is reduced, which shortens the image acquisition time; on the other hand, the underexposed image only needs simple denoising rather than multi-frame denoising, which saves a lot of denoising time. As a result, the night scene high dynamic range image can be generated using only low-complexity multi-frame denoising, the entire generation process is simpler and faster, and the user experience is improved. A minimal sketch of this overall flow is given below.
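  • As an illustration of steps S10 to S40, the following is a minimal Python sketch of the overall flow. The three operations are passed in as callables because they are detailed in the later steps; the function and parameter names are illustrative, not the patent's actual implementation.

```python
def generate_night_hdr(normal_frames, under_frame,
                       multi_frame_denoise, single_frame_denoise, fuse_hdr):
    """Sketch of steps S10-S40: the frames are already acquired (step S10)."""
    normal_result = multi_frame_denoise(normal_frames)   # step S20: multi-frame denoising
    under_result = single_frame_denoise(under_frame)     # step S30: simple single-frame denoising
    return fuse_hdr(normal_result, under_result)         # step S40: fusion into the HDR image
```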
  • In step S20, denoising the multiple frames of normally exposed images to obtain the normal exposure result image, the multiple frames of normally exposed images are denoised using a multi-frame denoising algorithm.
  • The multi-frame denoising algorithm can be a multi-frame superposition denoising algorithm, a single-reference-block multi-image denoising algorithm, a hybrid multi-frame image denoising algorithm based on superposition averaging and BM3D, or another form of multi-frame denoising algorithm.
  • the algorithm has low complexity and can run smoothly in electronic devices with poor hardware, increasing the scope of application; and through the multi-frame denoising algorithm, a good denoising effect can be achieved.
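  • As a minimal sketch of the simplest of the options above, the following Python function performs multi-frame superposition denoising by averaging the normally exposed frames (assumed to be pre-aligned); any of the other listed algorithms could replace this averaging step.

```python
import numpy as np

def multi_frame_denoise(frames):
    """Multi-frame superposition denoising: average the pre-aligned, normally
    exposed frames so that zero-mean noise largely cancels out."""
    stack = np.stack([np.asarray(f, dtype=np.float32) for f in frames], axis=0)
    return stack.mean(axis=0)
```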
  • In step S30, denoising the underexposed image to obtain the underexposure result image, the underexposed image is denoised using a single-frame denoising algorithm.
  • the single-frame denoising algorithm can be a median filter algorithm or other denoising algorithms with similar complexity, so that the denoising of underexposed images can be quickly realized and the speed of HDR image generation can be improved.
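  • A minimal sketch of single-frame denoising with a median filter, assuming SciPy is available and a 3x3 window; any other denoising algorithm of similar complexity could be substituted.

```python
import numpy as np
from scipy.ndimage import median_filter

def single_frame_denoise(image, size=3):
    """Median-filter denoising of the underexposed image.
    Color images are filtered per channel to avoid mixing channels."""
    img = np.asarray(image, dtype=np.float32)
    if img.ndim == 2:
        return median_filter(img, size=size)
    return np.stack([median_filter(img[..., c], size=size)
                     for c in range(img.shape[-1])], axis=-1)
```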
  • the step S40, fusing the normal exposure result image and the underexposure result image to obtain an HDR image includes:
  • Step S41 determining an overexposure mask of each pixel in the normal exposure result image
  • The normal exposure result image contains multiple pixels (it can also be said that the image is composed of pixels; for example, an image of 4000 × 3000 pixels is composed of 4000 × 3000 pixels). In this step, the overexposure mask of each pixel is calculated separately.
  • The overexposure mask reflects whether the corresponding pixel is located in the overexposed area: a mask value of 0, or any other value between 0 and 1, means that the corresponding pixel is not overexposed, i.e. it is not in the overexposed area; a mask value of 1 means that the corresponding pixel is overexposed and is located in the overexposed area.
  • the formula for determining the overexposure mask is:
  • E is the overexposure mask of the pixel in the normal exposure result image
  • p is the brightness of the pixel
  • pt is the brightness threshold
  • If the brightness of a pixel is less than or equal to the brightness threshold, its overexposure mask is 0; if the brightness is greater than the brightness threshold, the overexposure mask is a positive value greater than 0 and less than or equal to 1; and if the brightness is the maximum brightness of 255, the overexposure mask is 1.
  • In this way, the detection result of the overexposed area (the area formed by pixels whose overexposure mask is 1) is more natural and smooth, without hard cut-offs.
  • The overexposure mask reflects the degree of overexposure of a pixel. The above calculation formula is only one specific method; any other formula or method that reflects the degree of overexposure of a pixel can be used to calculate or determine the pixel's overexposure mask. A sketch of one such mask is given below.
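  • A minimal sketch of one overexposure mask consistent with the behaviour described above (0 at or below the threshold, rising to 1 at the maximum brightness of 255); the linear ramp and the example threshold value are assumptions, since other formulas reflecting the degree of overexposure may equally be used.

```python
import numpy as np

def overexposure_mask(brightness, p_t=200.0):
    """Soft overexposure mask E: 0 for brightness <= p_t, rising to 1 at 255.
    The linear ramp and the example threshold p_t = 200 are assumptions."""
    p = np.asarray(brightness, dtype=np.float32)
    return np.clip((p - p_t) / (255.0 - p_t), 0.0, 1.0)
```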
  • Step S42 adjusting the brightness of the normal exposure result image according to the underexposed image
  • The overall brightness of the underexposed image and of the normal exposure result image is inconsistent. If they were fused directly, the inconsistent brightness would cause image fusion errors; adjusting the normal exposure result image so that its brightness is similar to that of the underexposed image reduces the fusion errors caused by inconsistent brightness during the fusion process and achieves a better fusion result.
  • If instead the brightness of the underexposed image were increased, its brighter parts would become overexposed and valuable information would be lost, so the fusion of that part of the final image would be wrong. Therefore, by reducing the brightness of the normal exposure result image and then fusing it with the underexposed image, the overexposure that would be caused by brightening the underexposed image is avoided, and a better fusion effect is achieved.
  • Step S43 fusing the underexposed image and the adjusted normal exposure result image into a result image according to the overexposure mask
  • the fusion weight of the pixel points corresponding to the overexposed area of the normal exposure result image in the underexposed image is increased, and the fusion weight of the remaining part is decreased, so that the image quality of the final result image can be improved.
  • Step S44 Increase the brightness of the result image to obtain the HDR image.
  • Because the brightness of the result image is close to that of the underexposed image, its brightness is low.
  • By increasing the brightness of the result image, its image quality and display effect can be improved, and the user experience can be improved.
  • Through steps S41-S44, the brightness of the normal exposure result image is reduced before it is merged with the underexposed image, which avoids the overexposure that would occur if the brightness of the underexposed image were increased and achieves a better fusion effect; then, by increasing the brightness of the result image, a better display effect is obtained.
  • the step S42 adjusting the brightness of the normal exposure result image according to the underexposed image, includes:
  • Step S421 acquiring the sensitivity and exposure time of the underexposed image and the normally exposed image
  • Shooting devices such as mobile phones obtain underexposed, normally exposed, and overexposed images by adjusting the ISO (sensitivity) value and/or the exposure time; these values are preset before the actual shooting, so the sensitivity and exposure time can be obtained by reading them directly from the device, by querying the customary settings of the device (generally, the device manual indicates the default settings), or by other means.
  • The sensitivity and exposure time of the multiple frames of normally exposed images are generally the same. If they are not the same, the sensitivity and exposure time of the multiple frames of normally exposed images can be determined by taking their average or median value.
  • Step S422 determining the brightness multiple of the normally exposed image to the underexposed image according to the sensitivity and the exposure time;
  • the brightness multiple reflects the proportional relationship between the overall brightness of the normally exposed image and the underexposed image.
  • the formula for determining the brightness multiple is:
  • L is the brightness multiple of the normally exposed image relative to the underexposed image
  • i0 is the sensitivity of the normally exposed image
  • t0 is the exposure time of the normally exposed image
  • i- is the sensitivity of the underexposed image
  • t- is the exposure time of the underexposed image
  • Determining the brightness multiple between the overall brightness of the normally exposed image and that of the underexposed image from the easily obtained sensitivity and exposure time is simpler and faster than counting the brightness of all pixels and then computing the ratio, and the result is closer to the true ratio.
  • Step S423 Adjust the brightness of the normal exposure result image according to the brightness multiple.
  • The adjustment method may be to divide the brightness value of each pixel of the normal exposure result image by the brightness multiple, which is simple, convenient and fast.
  • The brightness multiple between the overall brightness of the normally exposed image and that of the underexposed image is determined from the easily obtained sensitivity and exposure time, which is simple and fast, so the brightness of the normal exposure result image can be adjusted quickly, thereby speeding up the generation of the entire night scene HDR image. A sketch of this calculation is given below.
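  • A minimal sketch of this step, assuming that overall brightness is proportional to sensitivity multiplied by exposure time and that the adjustment divides the normal exposure result image by the brightness multiple; the exact formula used by the patent may differ.

```python
import numpy as np

def brightness_multiple(i0, t0, i_under, t_under):
    """Brightness multiple L of the normal exposure relative to the underexposure,
    assuming overall brightness is proportional to sensitivity x exposure time."""
    return (i0 * t0) / (i_under * t_under)

def adjust_normal_result(normal_result, L):
    """Step S423 as understood here: scale the normal exposure result image down
    by L so that its overall brightness matches the underexposed image."""
    return np.asarray(normal_result, dtype=np.float32) / L
```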
  • the step S44 increasing the brightness of the result image to obtain the HDR image, includes:
  • Step S441 Determine the brightness increase ratio of each pixel in the result image according to the brightness multiple of the normally exposed image to the underexposed image;
  • The brightness of the result image is similar to that of the normal exposure result image after its brightness has been reduced. If the brightness of the result image were simply increased by the full brightness multiple, the overexposed area would become overexposed again, so increasing the brightness in this way would not improve the image quality of the result image.
  • the formula for determining the brightness increase ratio is:
  • s is the brightness increase ratio of the pixel in the result image
  • p is the brightness of the pixel
  • L is the brightness multiple of the normally exposed image to the underexposed image.
  • The ratio is determined by the difference between the brightness of the pixel and 255: the closer the brightness of the pixel is to 255, the closer the ratio is to 1; the further the brightness of the pixel is from 255, the closer the ratio is to the brightness multiple.
  • Step S442 Determine the brightness value of each pixel in the HDR image according to the brightness increase ratio to obtain the HDR image.
  • the method for determining the brightness value is to multiply the brightness value of each pixel in the result image by the corresponding brightness increase ratio, thereby obtaining the brightness value of each pixel in the HDR image.
  • In this way, pixels with lower brightness values can be kept close to their brightness values in the normal exposure result image, in order to avoid the degradation of HDR image quality that would be caused by increasing their brightness. A sketch of this per-pixel brightening is given below.
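  • A minimal sketch of steps S441-S442, assuming a linear interpolation between the brightness multiple (for dark pixels) and 1 (for pixels at brightness 255), which is consistent with the description above but is not necessarily the patent's exact formula.

```python
import numpy as np

def brightness_increase_ratio(p, L):
    """Per-pixel increase ratio s: close to L for dark pixels and close to 1
    as the brightness p approaches 255. The linear form is an assumption."""
    p = np.asarray(p, dtype=np.float32)
    return 1.0 + (L - 1.0) * (255.0 - p) / 255.0

def brighten_result(result, L):
    """Steps S441-S442 sketch: multiply each pixel of the result image by its
    own increase ratio, then clip to the valid 8-bit range."""
    result = np.asarray(result, dtype=np.float32)
    return np.clip(result * brightness_increase_ratio(result, L), 0.0, 255.0)
```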
  • the step S43, fusing the underexposed image and the adjusted normal exposure result image into a result image according to the overexposure mask includes:
  • Step S431 Determine the fusion weight of each pixel in the underexposed image and the adjusted normal exposure result image according to the overexposure mask;
  • the fusion weight of each pixel in the underexposed image is an overexposure mask of the corresponding pixel in the normal exposure result image.
  • For the overexposure mask: if the brightness of a pixel in the normal exposure result image is less than or equal to the brightness threshold, the overexposure mask is 0; if the brightness is greater than the brightness threshold, the overexposure mask is a positive value greater than 0 and less than or equal to 1; and if the brightness is the maximum value of 255, the overexposure mask is 1.
  • Using the overexposure mask as the fusion weight of the underexposed image therefore gives a higher weight to the underexposed image at each pixel of the normal exposure result image whose brightness value is greater than the threshold.
  • Since the fusion weights of corresponding pixels sum to 1, the fusion weight of each pixel in the adjusted normal exposure result image is determined as w0 = 1 - E, where:
  • w0 is the fusion weight of the pixel in the adjusted normal exposure result image
  • E is the overexposure mask of the corresponding pixel in the normal exposure result image.
  • Step S432 According to the fusion weight, the underexposed image and the adjusted normal exposure result image are fused into a result image.
  • In the fusion, the brightness value of each pixel in the result image is the sum, over the underexposed image and the adjusted normal exposure result image, of the brightness value of the corresponding pixel multiplied by its fusion weight. It should be noted that the fusion weights of the corresponding pixels in the underexposed image and in the adjusted normal exposure result image sum to 1.
  • Image fusion is performed by assigning weights, and the weights are distributed through the overexposure mask, so that the underexposed image has a higher weight in the overexposed areas and the normally exposed image has a higher weight in the remaining areas. This reduces the interference of high noise intensity on the fused image, so the image quality of the fused image is higher. A sketch of this weighted fusion is given below.
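  • A minimal sketch of step S43, using the overexposure mask E as the weight of the underexposed result image and 1 - E as the weight of the adjusted normal exposure result image, so that the weights sum to 1 as stated above.

```python
import numpy as np

def fuse_images(under_result, normal_adjusted, E):
    """Per-pixel weighted fusion: weight E for the underexposed result and
    w0 = 1 - E for the adjusted normal exposure result, summing to 1."""
    under_result = np.asarray(under_result, dtype=np.float32)
    normal_adjusted = np.asarray(normal_adjusted, dtype=np.float32)
    E = np.asarray(E, dtype=np.float32)
    if under_result.ndim == 3 and E.ndim == 2:
        E = E[..., None]   # broadcast a single-channel mask over color channels
    return E * under_result + (1.0 - E) * normal_adjusted
```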
  • the embodiment of the present disclosure provides a night scene high dynamic range image generation device for executing the night scene high dynamic range image generation method described in the above content of the present invention.
  • the night scene high dynamic range image generation device will be described in detail below.
  • a night scene high dynamic range image generation device includes:
  • the acquiring unit 1 is used to acquire multiple frames of normally exposed images and one frame of underexposed images;
  • the denoising unit 2 denoises the multiple frames of normally exposed images to obtain a normal exposure result image
  • the fusion unit 3 is used for fusing the normal exposure result image and the underexposure result image to obtain an HDR image.
  • the multi-frame normal exposure image is denoised by a multi-frame denoising algorithm.
  • the underexposed image is denoised by a single frame denoising algorithm.
  • the fusion unit 3 is also used for:
  • the formula for determining the brightness multiple is:
  • L is the brightness multiple of the normally exposed image relative to the underexposed image
  • i0 is the sensitivity of the normally exposed image
  • t0 is the exposure time of the normally exposed image
  • i- is the sensitivity of the underexposed image
  • t- is the exposure time of the underexposed image
  • the fusion unit 3 is also used for:
  • the formula for determining the brightness increase ratio is:
  • s is the brightness increase ratio of the pixel in the result image
  • p is the brightness of the pixel
  • L is the brightness multiple of the normally exposed image to the underexposed image.
  • the formula for determining the overexposure mask is:
  • E is the overexposure mask of the pixel in the normal exposure result image
  • p is the brightness of the pixel
  • pt is the brightness threshold
  • the fusion unit 3 is also used for:
  • the fusion weight of each pixel in the underexposed image is an overexposure mask of the corresponding pixel in the normal exposure result image.
  • the formula for determining the fusion weight of each pixel in the normal exposure result image after adjustment is:
  • w0 is the fusion weight of the pixel in the adjusted normal exposure result image
  • E is the overexposure mask of the pixel.
  • the above-described device embodiments are only illustrative.
  • The division of the units is only a logical functional division, and there may be other ways of dividing them in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be through some communication interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • The night scene high dynamic range image generation device can be implemented as an electronic device, including a processor and a memory; the memory stores a control program, and when the control program is executed by the processor, the above-mentioned night scene high dynamic range image generation method is realized.
  • Fig. 9 is a block diagram showing another electronic device according to an embodiment of the present invention.
  • the electronic device 12 shown in FIG. 9 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present application.
  • the electronic device 12 may be implemented in the form of a general-purpose electronic device.
  • the components of the electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 connecting different system components (including the system memory 28 and the processing unit 16).
  • the bus 18 represents one or more of several types of bus structures, including a memory bus or a memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any bus structure among multiple bus structures.
  • These architectures include, but are not limited to, the Industry Standard Architecture (hereinafter referred to as ISA) bus, the Micro Channel Architecture (hereinafter referred to as MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (hereinafter referred to as VESA) local bus, and the Peripheral Component Interconnect (hereinafter referred to as PCI) bus.
  • the electronic device 12 typically includes a variety of computer system readable media. These media may be any available media that can be accessed by the electronic device 12, including volatile and non-volatile media, removable and non-removable media.
  • The memory 28 may include a computer-system-readable medium in the form of volatile memory, such as a Random Access Memory (hereinafter referred to as RAM) 30 and/or a cache memory 32.
  • the electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer-readable storage media.
  • the storage system 34 may be used to read and write non-removable, non-volatile magnetic media (not shown in the figure, but generally referred to as a "hard drive").
  • A disk drive for reading from and writing to a removable non-volatile magnetic disk (such as a "floppy disk"), and an optical disk drive for reading from and writing to a removable non-volatile optical disk (such as a Compact Disc Read-Only Memory, hereinafter referred to as CD-ROM, or a Digital Video Disc Read-Only Memory, hereinafter referred to as DVD-ROM), can also be provided.
  • each drive can be connected to the bus 18 through one or more data media interfaces.
  • the memory 28 may include at least one program product.
  • the program product has a set of (for example, at least one) program modules, and these program modules are configured to perform the functions of the embodiments of the present application.
  • a program/utility tool 40 having a set of (at least one) program module 42 may be stored in, for example, the memory 28.
  • Such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination of them, may include an implementation of a network environment.
  • the program module 42 generally executes the functions and/or methods in the embodiments described in this application.
  • The electronic device 12 can also communicate with one or more external devices 14 (such as a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable users to interact with the computer system/server 12, and/or with any device (such as a network card, a modem, etc.) that enables the computer system/server 12 to communicate with one or more other electronic devices. This communication can be performed through an input/output (I/O) interface 22.
  • The electronic device 12 can also communicate with one or more networks (such as a Local Area Network, hereinafter referred to as LAN; a Wide Area Network, hereinafter referred to as WAN; and/or a public network, such as the Internet) through the network adapter 20.
  • the network adapter 20 communicates with other modules of the electronic device 12 through the bus 18.
  • Other hardware and/or software modules can be used in conjunction with the electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and so on.
  • the processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, such as implementing the methods mentioned in the foregoing embodiments.
  • the electronic device of the present invention can be a server or a terminal device with limited computing power.
  • The low-complexity processing of the present invention is particularly suitable for the latter.
  • Exemplary implementations of the terminal device include, but are not limited to: smart mobile communication terminals, drones, robots, portable image processing equipment, security equipment, and so on.
  • The embodiments of the present disclosure provide a computer-readable storage medium, including a memory in which a computer program is stored; when the computer program is executed by a processor, it implements the steps of any of the night scene high dynamic range image generation methods described in any of the embodiments of the present application.
  • The computer-readable storage medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, magnetic cards, or optical cards. That is, a readable storage medium includes any medium that stores or transmits information in a form readable by a device (for example, a computer).
  • The technical solution of the embodiments of the present invention, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product.
  • The computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the method described in the embodiments of the present invention.
  • The aforementioned storage media include: USB flash drives, mobile hard disks, ROM, RAM, magnetic disks, optical disks, and other media that can store program code.
  • An embodiment of the present application further provides a computer program, including computer-readable code; when the computer-readable code runs on a computing processing device, it causes the computing processing device to execute any one of the night scene high dynamic range image generation methods described in any one of the embodiments of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a method, device and electronic apparatus for generating a high dynamic range image of a night scene. The method comprises the steps of: acquiring multiple frames of normally exposed images and one frame of an underexposed image; denoising the multiple frames of normally exposed images to obtain a normal exposure result image; denoising the underexposed image to obtain an underexposure result image; and fusing the normal exposure result image and the underexposure result image to obtain a high dynamic range image. The night scene high dynamic range image is thus generated using only one frame of underexposed image and multiple frames of normally exposed images. On the one hand, the number of frames required for synthesis is reduced, so the image acquisition time is shortened. On the other hand, the underexposed image only needs simple denoising; multi-frame denoising is not necessary for it, which saves a great deal of denoising time. The generation of the night scene high dynamic range image can therefore be achieved using only low-complexity multi-frame denoising. The entire generation process is simplified and the user experience is rapidly improved.
PCT/CN2020/106786 2019-10-31 2020-08-04 Procédé, dispositif et appareil électronique de génération d'une image à plage dynamique élevée d'une scène nocturne WO2021082580A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911055568.2A CN110611750B (zh) 2019-10-31 2019-10-31 一种夜景高动态范围图像生成方法、装置和电子设备
CN201911055568.2 2019-10-31

Publications (1)

Publication Number Publication Date
WO2021082580A1 true WO2021082580A1 (fr) 2021-05-06

Family

ID=68895847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/106786 WO2021082580A1 (fr) 2019-10-31 2020-08-04 Procédé, dispositif et appareil électronique de génération d'une image à plage dynamique élevée d'une scène nocturne

Country Status (2)

Country Link
CN (1) CN110611750B (fr)
WO (1) WO2021082580A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095666A (zh) * 2021-08-12 2022-02-25 荣耀终端有限公司 拍照方法、电子设备和计算机可读存储介质
CN114554106A (zh) * 2022-02-18 2022-05-27 瑞芯微电子股份有限公司 自动曝光方法、装置、图像获取方法、介质及设备
CN115760663A (zh) * 2022-11-14 2023-03-07 辉羲智能科技(上海)有限公司 基于多帧多曝光的低动态范围图像合成高动态范围图像的方法
WO2024093545A1 (fr) * 2022-10-31 2024-05-10 华为技术有限公司 Procédé photographique et dispositif électronique

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110611750B (zh) * 2019-10-31 2022-03-22 北京迈格威科技有限公司 一种夜景高动态范围图像生成方法、装置和电子设备
CN111242860B (zh) * 2020-01-07 2024-02-27 影石创新科技股份有限公司 超级夜景图像的生成方法、装置、电子设备及存储介质
CN111416936B (zh) * 2020-03-24 2021-09-17 Oppo广东移动通信有限公司 图像处理方法、装置、电子设备及存储介质
CN111462031A (zh) * 2020-03-27 2020-07-28 Oppo广东移动通信有限公司 一种多帧hdr图像处理方法、装置、存储介质及电子设备
CN112651899A (zh) * 2021-01-15 2021-04-13 北京小米松果电子有限公司 图像处理方法及装置、电子设备、存储介质
CN112887639A (zh) * 2021-01-18 2021-06-01 Oppo广东移动通信有限公司 图像处理方法、装置、系统、电子设备以及存储介质
CN115314627B (zh) * 2021-05-08 2024-03-01 杭州海康威视数字技术股份有限公司 一种图像处理方法、系统及摄像机
CN115514876B (zh) * 2021-06-23 2023-09-01 荣耀终端有限公司 图像融合方法、电子设备、存储介质及计算机程序产品
CN115706766B (zh) * 2021-08-12 2023-12-15 荣耀终端有限公司 视频处理方法、装置、电子设备和存储介质
CN113781370A (zh) * 2021-08-19 2021-12-10 北京旷视科技有限公司 图像的增强方法、装置和电子设备
CN117135468A (zh) * 2023-02-21 2023-11-28 荣耀终端有限公司 图像处理方法及电子设备

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012110894A1 (fr) * 2011-02-18 2012-08-23 DigitalOptics Corporation Europe Limited Extension de plage dynamique par combinaison d'images acquises par des dispositifs tenus à la main exposés différemment
CN110062159A (zh) * 2019-04-09 2019-07-26 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备
CN110072051A (zh) * 2019-04-09 2019-07-30 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法和装置
CN110072052A (zh) * 2019-04-09 2019-07-30 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备
CN110166709A (zh) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 夜景图像处理方法、装置、电子设备以及存储介质
CN110166711A (zh) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 图像处理方法、装置、电子设备以及存储介质
CN110264420A (zh) * 2019-06-13 2019-09-20 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法和装置
CN110611750A (zh) * 2019-10-31 2019-12-24 北京迈格威科技有限公司 一种夜景高动态范围图像生成方法、装置和电子设备
CN110751608A (zh) * 2019-10-23 2020-02-04 北京迈格威科技有限公司 一种夜景高动态范围图像融合方法、装置和电子设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102497490B (zh) * 2011-12-16 2014-08-13 上海富瀚微电子有限公司 实现图像高动态范围压缩的系统及其方法
CN104349066B (zh) * 2013-07-31 2018-03-06 华为终端(东莞)有限公司 一种生成高动态范围图像的方法、装置
CN105809641B (zh) * 2016-03-09 2018-02-16 北京理工大学 一种去雾图像的曝光补偿和边缘增强方法
CN106534677B (zh) * 2016-10-27 2019-12-17 成都西纬科技有限公司 一种图像过曝优化方法及装置
CN108364275B (zh) * 2018-03-02 2022-04-12 成都西纬科技有限公司 一种图像融合方法、装置、电子设备及介质
CN108833775B (zh) * 2018-05-22 2020-04-03 深圳岚锋创视网络科技有限公司 一种抗运动鬼影的hdr方法、装置及便携式终端
CN108717691B (zh) * 2018-06-06 2022-04-15 成都西纬科技有限公司 一种图像融合方法、装置、电子设备及介质
CN109767413B (zh) * 2019-01-11 2022-11-29 影石创新科技股份有限公司 一种抗运动伪影的hdr方法、装置及便携式终端

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012110894A1 (fr) * 2011-02-18 2012-08-23 DigitalOptics Corporation Europe Limited Extension de plage dynamique par combinaison d'images acquises par des dispositifs tenus à la main exposés différemment
CN110062159A (zh) * 2019-04-09 2019-07-26 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备
CN110072051A (zh) * 2019-04-09 2019-07-30 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法和装置
CN110072052A (zh) * 2019-04-09 2019-07-30 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备
CN110166709A (zh) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 夜景图像处理方法、装置、电子设备以及存储介质
CN110166711A (zh) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 图像处理方法、装置、电子设备以及存储介质
CN110264420A (zh) * 2019-06-13 2019-09-20 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法和装置
CN110751608A (zh) * 2019-10-23 2020-02-04 北京迈格威科技有限公司 一种夜景高动态范围图像融合方法、装置和电子设备
CN110611750A (zh) * 2019-10-31 2019-12-24 北京迈格威科技有限公司 一种夜景高动态范围图像生成方法、装置和电子设备

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095666A (zh) * 2021-08-12 2022-02-25 荣耀终端有限公司 拍照方法、电子设备和计算机可读存储介质
CN114095666B (zh) * 2021-08-12 2023-09-22 荣耀终端有限公司 拍照方法、电子设备和计算机可读存储介质
CN114554106A (zh) * 2022-02-18 2022-05-27 瑞芯微电子股份有限公司 自动曝光方法、装置、图像获取方法、介质及设备
CN114554106B (zh) * 2022-02-18 2024-01-09 瑞芯微电子股份有限公司 自动曝光方法、装置、图像获取方法、介质及设备
WO2024093545A1 (fr) * 2022-10-31 2024-05-10 华为技术有限公司 Procédé photographique et dispositif électronique
CN115760663A (zh) * 2022-11-14 2023-03-07 辉羲智能科技(上海)有限公司 基于多帧多曝光的低动态范围图像合成高动态范围图像的方法
CN115760663B (zh) * 2022-11-14 2023-09-22 辉羲智能科技(上海)有限公司 基于多帧多曝光的低动态范围图像合成高动态范围图像的方法

Also Published As

Publication number Publication date
CN110611750A (zh) 2019-12-24
CN110611750B (zh) 2022-03-22

Similar Documents

Publication Publication Date Title
WO2021082580A1 (fr) Procédé, dispositif et appareil électronique de génération d'une image à plage dynamique élevée d'une scène nocturne
CN109218628B (zh) 图像处理方法、装置、电子设备及存储介质
WO2020103503A1 (fr) Procédé et appareil de traitement d'images de scène nocturne, dispositif électronique, et support de stockage
CN109005366B (zh) 摄像模组夜景摄像处理方法、装置、电子设备及存储介质
WO2020034737A1 (fr) Procédé de commande d'imagerie, appareil, dispositif électronique et support d'informations lisible par ordinateur
WO2020207239A1 (fr) Procédé et appareil de traitement d'images
US11558558B1 (en) Frame-selective camera
CN109218627B (zh) 图像处理方法、装置、电子设备及存储介质
WO2020034735A1 (fr) Procédé de commande d'imagerie et dispositif électronique
US8077218B2 (en) Methods and apparatuses for image processing
WO2020207262A1 (fr) Procédé et appareil de traitement d'images basés sur de multiples trames d'images, et dispositif électronique
CN109729274B (zh) 图像处理方法、装置、电子设备及存储介质
US11532076B2 (en) Image processing method, electronic device and storage medium
US9275445B2 (en) High dynamic range and tone mapping imaging techniques
WO2020038074A1 (fr) Procédé et appareil de commande d'exposition, et dispositif électronique
JP4846259B2 (ja) 輝度補正
CN110751608B (zh) 一种夜景高动态范围图像融合方法、装置和电子设备
CN109919116B (zh) 场景识别方法、装置、电子设备及存储介质
CN109361853B (zh) 图像处理方法、装置、电子设备及存储介质
US20200396367A1 (en) Systems and methods for controlling exposure settings based on motion characteristics associated with an image sensor
CN117710264A (zh) 图像的动态范围校准方法和电子设备
WO2022227200A1 (fr) Procédé et appareil de réglage de résolution pendant une capture d'image, et dispositif intelligent

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20882768

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20882768

Country of ref document: EP

Kind code of ref document: A1