WO2023020201A1 - Image Enhancement Method and Electronic Device
- Publication number: WO2023020201A1 (PCT/CN2022/107425)
- Authority: WIPO (PCT)
- Prior art keywords: image, processed, fusion, pixel, images
Classifications
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T2207/10024—Color image
- G06T2207/20221—Image fusion; Image merging
Definitions
- the present disclosure relates to the technical field of image processing, and in particular, to an image enhancement method and electronic equipment.
- Image information plays a significant role in modern production and daily life. As an important information carrier, images facilitate the exchange of information and help people understand the world more intuitively. However, under low-light conditions, ambient light is dim and illumination is insufficient, so the light reflected by objects in the environment is weak and the imaging device captures too few photons. The resulting image suffers from low contrast, insufficient saturation, loss of detail, and other problems.
- GHE Global Histogram Equalization
- image enhancement methods in the related art tend to produce enhanced images of poor quality.
- the purpose of the present disclosure is to provide an image enhancement method, device, and electronic equipment, so as to alleviate the technical problem that the enhanced images obtained by image enhancement methods in the related art are of poor quality.
- an embodiment of the present disclosure provides an image enhancement method, including:
- the exposure parameters may include at least one of exposure time and light flux.
- the at least two images to be processed are low dynamic range images with different exposure times
- the enhanced fusion image is an enhanced high dynamic range image capable of providing greater dynamic range and more image detail.
- performing a first fusion process on the first image to be processed and the second image to be processed to obtain a fusion image to be processed includes:
- processing the brightness of the first image to be processed in the at least two images to be processed to obtain the second image to be processed includes:
- performing adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map including:
- determining a weight value corresponding to each pixel in the first image to be processed based on the pixel value of each pixel in the first image to be processed includes:
- W represents the weight value corresponding to each pixel in the first image to be processed
- c represents the number of channels
- u_c(x) represents the pixel value of the pixel at coordinate x after the pixel values of each pixel in the first image to be processed are normalized
- x represents the coordinates of a pixel in the first image to be processed
- σ² represents the variance parameter in the Gaussian distribution.
- the fusion coefficient feature map is the fusion coefficient feature map corresponding to the first image to be processed, and performing the first fusion processing on the first image to be processed and the second image to be processed based on the fusion coefficient feature map includes:
- performing weighted-average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed includes:
- performing a second fusion process on the fusion image to be processed and the remaining images to be processed in the at least two images to be processed except the first image to be processed including:
- the determining the weight corresponding to each pixel in the fusion image to be processed according to the contrast, saturation and exposure of each pixel in the fusion image to be processed includes:
- the weight corresponding to the pixel is calculated by the following formula:
- W_ij,k = (C_ij,k)^wC × (S_ij,k)^wS × (E_ij,k)^wE,
- C, S, and E represent contrast, saturation, and good exposure, respectively;
- the exponents wC, wS, and wE represent the respective weights of contrast, saturation, and good exposure;
- the subscripts ij,k denote the (i,j)-th pixel of the k-th image.
- the first image to be processed is the image with the highest brightness among the at least two images to be processed.
- an embodiment of the present disclosure provides an electronic device, including a memory, a processor, and a computer program stored on the memory and operable on the processor.
- when the processor executes the computer program, the steps of the method described in any one of the above-mentioned first aspects are implemented.
- an embodiment of the present disclosure provides a computer-readable medium having non-volatile program code executable by a processor, the program code causing the processor to execute the steps of the method described in any one of the above-mentioned first aspects.
- an embodiment of the present disclosure further provides a computer program product, where the computer program product includes a computer program, and when the computer program is executed by a processor, the steps of the method described in any one of the above-mentioned first aspects are implemented.
- an image enhancement method is provided, including: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among the at least two images to obtain a second image to be processed; then performing a first fusion process on the first image to be processed and the second image to be processed to obtain a fusion image to be processed; and finally performing a second fusion process on the fusion image to be processed and the remaining images to be processed (those other than the first image to be processed) to obtain an enhanced fusion image.
- because the present disclosure performs the first fusion processing on the first image to be processed and the second image to be processed, the resulting fusion image to be processed has a good image effect, and the enhanced fusion image obtained from it is therefore also of good quality, which alleviates the technical problem that enhanced images obtained by image enhancement methods in the related art are of poor quality.
- FIG. 1 is a schematic diagram of an electronic device provided by an embodiment of the present disclosure
- FIG. 2 is a flow chart of an image enhancement method provided by an embodiment of the present disclosure
- FIG. 3 is a flow chart of processing the brightness of the first image to be processed among at least two images to be processed provided by an embodiment of the present disclosure
- FIG. 4 is a schematic diagram of a first image to be processed provided by an embodiment of the present disclosure
- FIG. 5 is a schematic diagram of a fusion coefficient feature map corresponding to a first image to be processed provided by an embodiment of the present disclosure
- FIG. 6 is a schematic diagram of another image to be processed among the two images to be processed provided by an embodiment of the present disclosure
- FIG. 7 is a schematic diagram of a comparison between an unenhanced fused image and an enhanced fused image provided by an embodiment of the present disclosure
- FIG. 8 is a schematic diagram of an image enhancement device provided by an embodiment of the present disclosure.
- an electronic device 100 for implementing an embodiment of the present disclosure will be described with reference to FIG. 1 , and the electronic device can be used to run the image enhancement method of each embodiment of the present disclosure.
- an electronic device 100 includes one or more processors 102, one or more memories 104, an input device 106, an output device 108, and a camera 110. These components are interconnected via a bus system 112 and/or other forms of connection mechanism (not shown). It should be noted that the components and structure of the electronic device 100 shown in FIG. 1 are only exemplary rather than limiting, and the electronic device may also have other components and structures as required.
- the processor 102 may be a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA), or an application-specific integrated circuit (ASIC); it may also be a central processing unit (CPU) or another processing unit with data processing and/or instruction execution capabilities, and it can control other components in the electronic device 100 to perform desired functions.
- DSP digital signal processor
- FPGA field programmable gate array
- PLA Programmable Logic Array
- ASIC Application Specific Integrated Circuit
- the memory 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory.
- the volatile memory may include, for example, random access memory (RAM) and/or cache memory (cache).
- the non-volatile memory may include, for example, a read-only memory (ROM), a hard disk, a flash memory, and the like.
- One or more computer program instructions can be stored on the computer-readable storage medium, and the processor 102 can execute the program instructions to realize the client functions (implemented by the processor) in the embodiments of the present disclosure described below and/or other desired functionality.
- Various application programs and various data such as various data used and/or generated by the application programs, may also be stored in the computer-readable storage medium.
- the input device 106 may be a device used by a user to input instructions, and may include one or more of a keyboard, a mouse, a microphone, and a touch screen.
- the output device 108 may output various information (eg, images or sounds) to the outside (eg, a user), and may include one or more of a display, a speaker, and the like.
- the camera 110 is used to capture at least two images to be processed, which are then processed by the image enhancement method to obtain an enhanced fusion image. For example, the camera may capture images desired by the user (such as photos or videos), which are then enhanced by the method; the camera may also store the captured images in the memory 104 for use by other components.
- the electronic device for implementing the image enhancement method according to the embodiment of the present disclosure may be implemented as an intelligent mobile terminal such as a smart phone, a tablet computer, and the like.
- Artificial Intelligence is an emerging science and technology that studies and develops theories, methods, technologies and application systems for simulating and extending human intelligence.
- artificial intelligence is a comprehensive discipline involving many technologies, such as chips, big data, cloud computing, the Internet of Things, distributed storage, deep learning, machine learning, and neural networks.
- computer vision is specifically to allow machines to recognize the world.
- Computer vision technology usually includes face recognition, liveness detection, fingerprint recognition and anti-counterfeiting verification, biometric recognition, face detection, pedestrian detection, target detection, etc.
- an image enhancement method is provided. It should be noted that the steps shown in the flowcharts of the accompanying drawings may be executed in a computer system, for example by a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one described here.
- Fig. 2 is a flowchart of an image enhancement method according to an embodiment of the present disclosure. As shown in Fig. 2, the method includes the following steps:
- Step S202: acquiring at least two images to be processed of the target scene, and processing the brightness of the first image to be processed among the at least two images to be processed to obtain a second image to be processed, wherein the at least two images to be processed have different exposure parameters;
- the above-mentioned target scene may be any scene
- the above-mentioned at least two images to be processed may be images obtained by shooting the same target scene, or may be images of the same target scene stored in advance. There is no specific limitation on the manner of acquiring the above at least two images to be processed.
- the scales of the at least two images to be processed are the same, and the exposure parameters of the at least two images to be processed are different.
- the above exposure parameters may be, for example, exposure time, light flux, etc., which are not limited in the embodiments of the present disclosure.
- the brightness of the first image to be processed in the at least two images to be processed is processed to obtain a second image to be processed.
- the first image to be processed may be determined from at least two images to be processed according to brightness values of the at least two images to be processed.
- Step S204 performing a first fusion process on the first image to be processed and the second image to be processed to obtain a fusion image to be processed;
- the above-mentioned first fusion processing may be performing weighting processing on the first image to be processed and the second image to be processed to obtain a fusion image to be processed.
- Step S206 performing a second fusion process on the fusion image to be processed and other images to be processed except the first image to be processed in at least two images to be processed to obtain an enhanced fusion image.
- the above enhanced fusion image is actually an enhanced high dynamic range image.
- Compared with ordinary images, a high dynamic range (HDR) image can provide greater dynamic range and more image detail. It is synthesized from low dynamic range (LDR) images taken at different exposure times, using the LDR image with the best detail at each exposure time, and therefore better reflects the visual effect of the real environment.
- the final enhanced HDR image is synthesized from the fusion image to be processed and the images to be processed other than the first image to be processed among the at least two images to be processed, and the resulting enhanced high dynamic range image has a good image effect.
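The three steps S202 to S206 can be sketched as a short pipeline. This is a minimal illustration only: the `brighten`, `first_fusion`, and `exposure_fusion` callables are hypothetical placeholders standing in for the operations the disclosure describes, and picking the brightest frame by mean intensity is an assumed implementation of selecting the first image to be processed.

```python
import numpy as np

def enhance(images, brighten, first_fusion, exposure_fusion):
    """Sketch of the two-stage pipeline (steps S202-S206). The three
    helper callables are placeholders for the disclosure's operations."""
    # S202: take the brightest frame as the first image to be processed,
    # and brighten it to obtain the second image to be processed.
    idx = int(np.argmax([img.mean() for img in images]))
    first = images[idx]
    second = brighten(first)
    # S204: first fusion of the first and the brightened second image.
    fused = first_fusion(first, second)
    # S206: second fusion with the remaining differently exposed frames.
    rest = [img for i, img in enumerate(images) if i != idx]
    return exposure_fusion([fused, *rest])
```

The second fusion receives the intermediate fusion result in place of the original brightest frame, which is what allows the low-light regions to be enhanced before the HDR merge.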
- an image enhancement method is provided, including: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among the at least two images to obtain a second image to be processed; then performing a first fusion process on the first image to be processed and the second image to be processed to obtain a fusion image to be processed; and finally performing a second fusion process on the fusion image to be processed and the remaining images to be processed (those other than the first image to be processed) to obtain an enhanced fusion image.
- because the present disclosure performs the first fusion processing on the first image to be processed and the second image to be processed, the resulting fusion image to be processed has a good image effect, and the enhanced fusion image obtained from it is therefore also of good quality, which alleviates the technical problem that enhanced images obtained by image enhancement methods in the related art are of poor quality.
- performing the first fusion processing on the first image to be processed and the second image to be processed to obtain the fusion image to be processed specifically includes: performing adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map; and performing a first fusion process on the first image to be processed and the second image to be processed based on the fusion coefficient feature map to obtain a fusion image to be processed.
- either a relevant formula or an adaptive weight detection model may be used to perform adaptive weight detection on the first image to be processed, thereby obtaining the fusion coefficient feature map.
- processing the brightness of the first image to be processed in at least two images to be processed includes the following steps:
- Step S301 determining brightness values of at least two images to be processed
- Step S302 determining a first image to be processed from at least two images to be processed according to the brightness value
- Step S303 performing brightening processing on the first image to be processed to obtain a second image to be processed.
- the above-mentioned first image to be processed may be an image with the highest brightness among at least two images to be processed, and this embodiment of the present disclosure does not specifically limit the above-mentioned first image to be processed.
- a weighted distribution adaptive gamma correction (AGCWD) algorithm may be used to perform brightening processing on the first image to be processed to obtain the second image to be processed.
- AGCWD weighted distribution adaptive gamma correction
- an adaptive gamma curve (a special tone curve) is designed according to the distribution function to brighten the image.
- the Gamma value is equal to 1
- the curve is a straight line at 45° to the coordinate axis, which means that the input and output densities are the same.
- Gamma values above 1 will darken the output, and Gamma values below 1 will lighten the output.
- the AGCWD algorithm is chosen mainly because it achieves the desired effect while remaining fast.
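The AGCWD brightening described above can be sketched as follows. This follows the commonly published form of the algorithm (a weighted PDF, its CDF, and a per-level adaptive gamma of 1 − CDF), which may differ in detail from the disclosure's implementation; the weighting exponent `alpha = 0.5` and the single-channel assumption are assumptions for illustration.

```python
import numpy as np

def agcwd(image, alpha=0.5):
    """Sketch of adaptive gamma correction with weighting distribution.
    `image` is a single-channel image with values in [0, 1]."""
    gray = (image * 255).astype(np.uint8)
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    pdf = hist / hist.sum()
    # Weighting distribution: smooth the PDF before accumulating it.
    pdf_w = pdf.max() * ((pdf - pdf.min()) / (pdf.max() - pdf.min() + 1e-12)) ** alpha
    cdf_w = np.cumsum(pdf_w) / pdf_w.sum()
    # Per-level adaptive gamma: gamma = 1 - cdf_w. Dark levels get a
    # small exponent (< 1) and are therefore brightened, consistent
    # with "Gamma values below 1 will lighten the output" above.
    levels = np.arange(256) / 255.0
    lut = levels ** (1.0 - cdf_w)
    return lut[gray]
```

Because the gamma curve is a simple 256-entry lookup table, the per-pixel cost is one table access, which is why the method is fast enough for the real-time requirement mentioned in the disclosure.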
- in the above step S204, performing adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map specifically includes: determining, based on the pixel value of each pixel in the first image to be processed, the weight value corresponding to each pixel in the first image to be processed, and then obtaining the fusion coefficient feature map corresponding to the first image to be processed.
- adaptive weight detection is performed on each pixel in the first image to be processed by the following formula to obtain a weight value corresponding to each pixel in the first image to be processed;
- W represents the weight value corresponding to each pixel in the first image to be processed
- c represents the number of channels
- u_c(x) represents the pixel value of the pixel at coordinate x after normalization of the pixel values of each pixel in the first image to be processed
- x represents the coordinates of a pixel in the first image to be processed
- σ² represents the variance parameter in the Gaussian distribution
- 0.5 represents the ideal pixel value.
- the ideal pixel value generally takes a value between 0 and 1; in some embodiments, it may also be set to a value greater than or less than 0.5 as required.
- σ is equal to 0.2.
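The weight formula itself is not reproduced in this text (it appears only as variable definitions), so the sketch below assumes the standard Gaussian well-exposedness form implied by those definitions: for each channel c, exp(−(u_c(x) − 0.5)² / (2σ²)), combined over channels by multiplication. Treat the exact form as an assumption.

```python
import numpy as np

def adaptive_weight(image, sigma=0.2, ideal=0.5):
    """Sketch of adaptive weight detection (assumed Gaussian form).
    `image` has normalized pixel values u_c(x) in [0, 1] with the
    channel axis last; returns one weight W per pixel coordinate x."""
    u = image.astype(np.float64)
    w = np.exp(-((u - ideal) ** 2) / (2.0 * sigma ** 2))
    # Combine the c channels into a single weight per pixel.
    return np.prod(w, axis=-1)
```

The resulting map is near 1 where pixels sit close to the ideal value 0.5 and near 0 in very dark or saturated regions, which matches the fusion coefficient feature map behavior described around FIG. 5.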
- Fig. 4 is a schematic diagram of the first image to be processed
- Fig. 5 is a schematic diagram of a fusion coefficient feature map corresponding to the first image to be processed obtained after adaptive weight detection is performed on the first image to be processed
- Fig. 6 is two A schematic diagram of another image to be processed in the image to be processed.
- the fusion coefficient feature map is a fusion coefficient feature map corresponding to the first image to be processed.
- the first image to be processed and the second image to be processed are Performing the first fusion process specifically includes: determining the fusion coefficient feature map corresponding to the second image to be processed according to the fusion coefficient feature map corresponding to the first image to be processed;
- the fusion coefficient feature map, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed are subjected to weighted average processing to obtain a fusion image to be processed.
- the two images to be processed and their corresponding fusion coefficient feature maps are weighted and averaged to obtain the fusion image to be processed: I = I1 × mask + I2 × (1 − mask), where I represents the fusion image to be processed, I1 represents the first image to be processed, I2 represents the second image to be processed, mask represents the fusion coefficient feature map corresponding to the first image to be processed, and (1 − mask) represents the fusion coefficient feature map corresponding to the second image to be processed.
- that is, the first image to be processed is multiplied by the fusion coefficient feature map, the second image to be processed is multiplied by (1 − fusion coefficient feature map), and the two products are then added.
- the pixel value of each pixel in A' is multiplied by the corresponding weight value (that is, the fusion coefficient) in the mask.
- the low-light area comes from the second image to be processed, and the high-light area comes from the first image to be processed, thus achieving the effect of low-light enhancement.
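The weighted average described above is a one-liner; the sketch below adds only a broadcast over the channel axis for color inputs (an implementation detail assumed here, not stated in the text).

```python
import numpy as np

def first_fusion(i1, i2, mask):
    """I = I1 * mask + I2 * (1 - mask), as described in the text.
    `mask` is the fusion coefficient feature map of the first image
    (one value per pixel); broadcast over channels for color images."""
    m = mask[..., None] if i1.ndim == 3 else mask
    return i1 * m + i2 * (1.0 - m)
```

Where `mask` is small (dark pixels in the first image), the output is dominated by the brightened second image, which is exactly the low-light enhancement effect described above.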
- step S206 is to perform a second fusion process on the fusion image to be processed and the rest of the at least two images to be processed except the first image to be processed, which specifically includes:
- the exposure fusion algorithm is used to perform a second fusion process on the fusion image to be processed and the rest of the at least two images to be processed except the first image to be processed to obtain an enhanced fusion image.
- the above-mentioned exposure fusion algorithm uses three quality metrics of image contrast, saturation, and good exposure to fuse multiple frames of images.
- the exposure fusion algorithm can directly extract information from LDR image sequences with different exposures and fuse them into an HDR image with locally adaptive exposure (that is, the enhanced fusion image).
- W_ij,k = (C_ij,k)^wC × (S_ij,k)^wS × (E_ij,k)^wE
- C, S, and E represent contrast, saturation, and good exposure, respectively
- the exponents wC, wS, and wE represent the respective weights of the three metrics; the subscripts ij,k denote the (i,j)-th pixel of the k-th image. If an exponent equals 0, the corresponding metric is not taken into account.
- the final pixel weights are used to guide the fusion process.
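The per-pixel weight product can be sketched as below. The disclosure does not define the three metrics here, so standard Mertens-style choices are assumed: contrast from a Laplacian response, saturation as the per-pixel channel standard deviation, and good exposure as a Gaussian around 0.5; the normalization across the k frames is also an assumption.

```python
import numpy as np

def exposure_fusion_weights(images, wc=1.0, ws=1.0, we=1.0, sigma=0.2):
    """Sketch of the quality weights W_ij,k = C^wc * S^ws * E^we,
    with assumed metric definitions. `images` are color frames in [0, 1]."""
    weights = []
    for img in images:
        gray = img.mean(axis=-1)
        # Contrast: absolute Laplacian response (4-neighbour stencil).
        lap = np.abs(-4 * gray
                     + np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
                     + np.roll(gray, 1, 1) + np.roll(gray, -1, 1))
        # Saturation: standard deviation across the color channels.
        sat = img.std(axis=-1)
        # Good exposure: Gaussian around 0.5, multiplied over channels.
        expo = np.prod(np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)), axis=-1)
        w = (lap ** wc) * (sat ** ws) * (expo ** we)
        weights.append(w + 1e-12)  # avoid an all-zero weight column
    weights = np.stack(weights)
    # Normalize so the k weights at each pixel sum to 1 before fusing.
    return weights / weights.sum(axis=0)
```

Setting an exponent to 0 turns its factor into 1, i.e. the corresponding metric is ignored, matching the note above.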
- the left figure in FIG. 7 shows a schematic diagram of a fused image without enhancement
- the right figure shows a schematic diagram of an enhanced fused image. The comparison in FIG. 7 shows that, compared with the unenhanced fused image, the high-light areas of the enhanced fused image (the upper boxes in the left and right images, respectively) show no diffusion and no overexposure, while the low-light regions (the lower boxes in the left and right images, respectively) are well enhanced, and the image looks good.
- the image enhancement method of the present disclosure targets high dynamic range scenes: using images to be processed with different exposure times, adaptive weight detection quickly computes the fusion coefficient feature map for the high-light and low-light areas, so the method obtains a well-enhanced high dynamic range image, handles low-light scenes with large overexposed areas, and offers good real-time performance.
- the embodiment of the present disclosure also provides an image enhancement device, which is mainly used to implement the image enhancement method provided above; the image enhancement device provided by the embodiment of the present disclosure is described in detail below.
- Fig. 8 is a schematic diagram of an image enhancement device according to an embodiment of the present disclosure.
- the image enhancement device mainly includes a processing unit 10, a first fusion processing unit 20, and a second fusion processing unit 30, wherein:
- the processing unit may be configured to acquire at least two images to be processed of the target scene, and process the brightness of a first image to be processed among the at least two images to be processed to obtain a second image to be processed, wherein the exposure parameters of the at least two images to be processed are different;
- the first fusion processing unit may be configured to perform first fusion processing on the first image to be processed and the second image to be processed to obtain a fusion image to be processed;
- the second fusion processing unit may be configured to perform a second fusion process on the fusion image to be processed and the remaining images to be processed except the first image to be processed among the at least two images to be processed, to obtain an enhanced fusion image.
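The three units in FIG. 8 chain together in a straightforward way; the sketch below is illustrative only, with the unit callables as hypothetical placeholders for the operations described above.

```python
class ImageEnhancementDevice:
    """Sketch of the device of FIG. 8: units 10, 20, and 30 chained.
    Each unit is a callable placeholder for the disclosure's operation."""
    def __init__(self, processing_unit, first_fusion_unit, second_fusion_unit):
        self.processing_unit = processing_unit        # unit 10
        self.first_fusion_unit = first_fusion_unit    # unit 20
        self.second_fusion_unit = second_fusion_unit  # unit 30

    def enhance(self, images):
        # Unit 10 yields the first image, the brightened second image,
        # and the remaining differently exposed frames.
        first, second, rest = self.processing_unit(images)
        fused = self.first_fusion_unit(first, second)
        return self.second_fusion_unit(fused, rest)
```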
- an image enhancement device is provided, which: acquires at least two images to be processed of a target scene, and processes the brightness of a first image to be processed among the at least two images to obtain a second image to be processed; then performs a first fusion process on the first image to be processed and the second image to be processed to obtain a fusion image to be processed; and finally performs a second fusion process on the fusion image to be processed and the remaining images to be processed (those other than the first image to be processed) to obtain an enhanced fusion image.
- because the present disclosure performs the first fusion processing on the first image to be processed and the second image to be processed, the resulting fusion image to be processed has a good image effect, and the enhanced fusion image obtained from it is therefore also of good quality, which alleviates the technical problem that enhanced images obtained by image enhancement methods in the related art are of poor quality.
- the first fusion processing unit may also be configured to: perform adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map; and perform a first fusion process on the first image to be processed and the second image to be processed based on the fusion coefficient feature map to obtain a fusion image to be processed.
- the processing unit may also be configured to: determine brightness values of the at least two images to be processed; determine a first image to be processed from the at least two images to be processed according to the brightness values; and perform brightening processing on the first image to be processed to obtain a second image to be processed.
- the first fusion processing unit may also be configured to: determine the weight value corresponding to each pixel point in the first image to be processed based on the pixel value of each pixel point in the first image to be processed, and then obtain the weight value corresponding to The fusion coefficient feature map corresponding to the first image to be processed.
- the first fusion processing unit may also be configured to: perform adaptive weight detection on each pixel in the first image to be processed by the following formula to obtain the weight value corresponding to each pixel in the first image to be processed:
- W = ∏_c exp(-(u_c(x) - 0.5)² / (2σ²))
- where W represents the weight value corresponding to each pixel in the first image to be processed, exp(-(u_c(x) - 0.5)² / (2σ²)) represents the Gaussian curve, c represents the number of channels, u_c(x) represents the pixel value of the pixel at coordinate x after normalization of the pixel values of the pixels in the first image to be processed, x represents the coordinate of a pixel in the first image to be processed, and σ² represents the variance parameter of the Gaussian distribution.
- the fusion coefficient feature map is the fusion coefficient feature map corresponding to the first image to be processed, and the first fusion processing unit may also be configured to: determine, from the fusion coefficient feature map corresponding to the first image to be processed, the fusion coefficient feature map corresponding to the second image to be processed; and perform weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed, to obtain the fused image to be processed.
- the weighted average processing is performed according to the formula I = I1 × mask + I2 × (1 - mask) to obtain the fused image to be processed, where I represents the fused image to be processed, I1 represents the first image to be processed, I2 represents the second image to be processed, mask represents the fusion coefficient feature map corresponding to the first image to be processed, and (1 - mask) represents the fusion coefficient feature map corresponding to the second image to be processed.
- the second fusion processing unit may also be configured to: determine the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed, to obtain the weight map corresponding to the fused image to be processed; determine the weight corresponding to each pixel in the remaining images to be processed according to the contrast, saturation and exposure of each pixel in the remaining images to be processed, to obtain the weight maps corresponding to the remaining images to be processed; and perform weighted average processing on the fused image to be processed, the weight map corresponding to the fused image to be processed, the remaining images to be processed, and the weight maps corresponding to the remaining images to be processed, to obtain the enhanced fused image.
- the first image to be processed is the image with the highest brightness among the at least two images to be processed.
- the image enhancement device provided by the embodiments of the present disclosure has the same implementation principle and technical effect as the foregoing method embodiments; for parts not mentioned in the device embodiment, reference may be made to the corresponding content in the foregoing method embodiments.
- a computer-readable medium having non-volatile program code executable by a processor is also provided, the program code causing the processor to execute the steps of the method described in any one of the above method embodiments.
- a computer program product includes a computer program which, when executed by a processor, implements the steps of the method according to any one of the above method embodiments.
- the terms "installation", "connected" and "connection" should be interpreted in a broad sense; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical or an electrical connection; and it may be a direct connection, an indirect connection through an intermediary, or internal communication between two components.
- the disclosed systems, devices and methods may be implemented in other ways.
- the device embodiments described above are only illustrative.
- the division of the units is only a logical function division.
- multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the mutual coupling or direct coupling or communication connection shown or discussed may be through some communication interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
- the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
- each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
- if the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a non-volatile computer-readable storage medium executable by a processor.
- the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present disclosure.
- the aforementioned storage media include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
- the present disclosure provides an image enhancement method and electronic equipment, including: acquiring at least two images to be processed of a target scene, and processing the brightness of the first image to be processed among them to obtain a second image to be processed;
- the first image to be processed and the second image to be processed undergo first fusion processing to obtain a fused image to be processed;
- the fused image to be processed undergoes second fusion processing with the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image.
- the first fusion processing is performed on the first image to be processed and the second image to be processed, so that the resulting fused image to be processed has a good image effect, and the enhanced fused image obtained on the basis of this well-fused image in turn has a good image effect.
- the image enhancement method and electronic device of the present disclosure are reproducible and can be used in a variety of industrial applications.
- the image enhancement method and electronic device disclosed in the present disclosure can be used in the technical field of image processing.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
An image enhancement method and an electronic device. The method includes: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among them to obtain a second image to be processed; performing first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed; and performing second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image. By performing the first fusion processing on the first image to be processed and the second image to be processed, the resulting fused image to be processed has a good image effect, and the enhanced fused image obtained on the basis of this well-fused image in turn has a good image effect.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
This disclosure claims priority to Chinese Patent Application No. 202110955265.7, entitled "Image Enhancement Method, Apparatus and Electronic Device", filed with the China National Intellectual Property Administration on August 19, 2021, the entire contents of which are incorporated herein by reference.
The present disclosure relates to the technical field of image processing, and in particular to an image enhancement method and an electronic device.
Images play a major role in modern production and daily life. As an important information carrier, they facilitate the exchange of information and help people perceive the world more intuitively. In real life, however, under low-light conditions the ambient light is dim and illumination is insufficient, the light reflected by objects in the scene is weak, and the imaging device captures too few photons, so the final image suffers from low contrast, insufficient saturation, loss of image detail, and similar problems.
In recent years, low-light image enhancement has been one of the research hotspots in the field of image processing. Many researchers have focused on such algorithms and proposed a variety of effective methods, but owing to the variability of low-light environments and the diversity of capture devices, no existing low-light image enhancement algorithm in the related art is comprehensively effective for all types of low-light images. In particular, in the field of High Dynamic Range (HDR) image synthesis, a commonly used method is Global Histogram Equalization (GHE); this method is suitable for global image enhancement, but it does not adapt to local image features, and the enhanced images are of poor quality. Other researchers have proposed deep-learning-based single-image contrast enhancement algorithms, but when an image contains large overexposed regions, such algorithms cannot correct those regions well.
In summary, the enhanced images obtained by image enhancement methods in the related art are of poor quality.
SUMMARY
In view of this, an object of the present disclosure is to provide an image enhancement method, apparatus and electronic device, so as to alleviate the technical problem that the enhanced images obtained by image enhancement methods in the related art are of poor quality.
In a first aspect, an embodiment of the present disclosure provides an image enhancement method, including:
acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among the at least two images to be processed to obtain a second image to be processed, wherein the at least two images to be processed have different exposure parameters;
performing first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed;
performing second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image.
Optionally, the exposure parameters may include at least one of exposure time and light flux.
Optionally, the at least two images to be processed are low dynamic range images with different exposure times, and the enhanced fused image is an enhanced high dynamic range image capable of providing a wider dynamic range and more image detail.
Optionally, performing the first fusion processing on the first image to be processed and the second image to be processed to obtain the fused image to be processed includes:
performing adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map;
performing the first fusion processing on the first image to be processed and the second image to be processed based on the fusion coefficient feature map to obtain the fused image to be processed.
Optionally, processing the brightness of the first image to be processed among the at least two images to be processed to obtain the second image to be processed includes:
determining brightness values of the at least two images to be processed;
determining the first image to be processed from the at least two images to be processed according to the brightness values;
performing brightening processing on the first image to be processed to obtain the second image to be processed.
Optionally, performing adaptive weight detection on the first image to be processed to obtain the fusion coefficient feature map includes:
determining, based on the pixel value of each pixel in the first image to be processed, the weight value corresponding to each pixel in the first image to be processed, thereby obtaining the fusion coefficient feature map corresponding to the first image to be processed.
Optionally, determining, based on the pixel value of each pixel in the first image to be processed, the weight value corresponding to each pixel in the first image to be processed includes:
performing adaptive weight detection on each pixel in the first image to be processed by the following formula to obtain the weight value corresponding to each pixel in the first image to be processed:
W = ∏_c exp(-(u_c(x) - 0.5)² / (2σ²))
where W represents the weight value corresponding to each pixel in the first image to be processed, exp(-(u_c(x) - 0.5)² / (2σ²)) represents the Gaussian curve, c represents the number of channels, u_c(x) represents the pixel value of the pixel at coordinate x after normalization of the pixel values of the pixels in the first image to be processed, x represents the coordinate of a pixel in the first image to be processed, and σ² represents the variance parameter of the Gaussian distribution.
Optionally, the fusion coefficient feature map is the fusion coefficient feature map corresponding to the first image to be processed, and performing the first fusion processing on the first image to be processed and the second image to be processed based on the fusion coefficient feature map includes:
determining, from the fusion coefficient feature map corresponding to the first image to be processed, the fusion coefficient feature map corresponding to the second image to be processed;
performing weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed, to obtain the fused image to be processed.
Optionally, the weighted average processing includes: performing weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed based on the weighted average formula I = I1 × mask + I2 × (1 - mask), to obtain the fused image to be processed, where I represents the fused image to be processed, I1 represents the first image to be processed, I2 represents the second image to be processed, mask represents the fusion coefficient feature map corresponding to the first image to be processed, and (1 - mask) represents the fusion coefficient feature map corresponding to the second image to be processed.
Optionally, performing the second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed includes:
determining the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed, to obtain the weight map corresponding to the fused image to be processed, and determining the weight corresponding to each pixel in the remaining images to be processed according to the contrast, saturation and exposure of each pixel in the remaining images to be processed, to obtain the weight maps corresponding to the remaining images to be processed;
performing weighted average processing on the fused image to be processed, the weight map corresponding to the fused image to be processed, the remaining images to be processed, and the weight maps corresponding to the remaining images to be processed, to obtain the enhanced fused image.
Optionally, determining the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed includes:
for each pixel, computing the weight corresponding to the pixel by the following formula:
W_ij,k = (C_ij,k)^wC × (S_ij,k)^wS × (E_ij,k)^wE
where C, S and E represent contrast, saturation and well-exposedness respectively, the exponents w represent the weights of contrast, saturation and well-exposedness, and ij, k denote the (i, j)-th pixel of the k-th image.
Optionally, the first image to be processed is the image with the highest brightness among the at least two images to be processed.
In a second aspect, an embodiment of the present disclosure provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of the first aspect.
In a third aspect, an embodiment of the present disclosure provides a computer-readable medium having non-volatile program code executable by a processor, the program code causing the processor to execute the steps of the method according to any one of the first aspect.
In a fourth aspect, an embodiment of the present disclosure further provides a computer program product, the computer program product including a computer program which, when executed by a processor, implements the steps of the method according to any one of the first aspect.
In the embodiments of the present disclosure, an image enhancement method is provided, including: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among the at least two images to be processed to obtain a second image to be processed; then performing first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed; and finally performing second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image. As can be seen from the above description, the present disclosure performs the first fusion processing on the first and second images to be processed, so that the resulting fused image to be processed has a good image effect; the enhanced fused image obtained on the basis of this well-fused image in turn has a good image effect, which alleviates the technical problem that enhanced images obtained by image enhancement methods in the related art are of poor quality.
To describe the technical solutions of the specific embodiments of the present disclosure or the related art more clearly, the drawings required in the description of the specific embodiments or the related art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present disclosure, and those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an electronic device provided by an embodiment of the present disclosure;
FIG. 2 is a flowchart of an image enhancement method provided by an embodiment of the present disclosure;
FIG. 3 is a flowchart of processing the brightness of the first image to be processed among the at least two images to be processed provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first image to be processed provided by an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the fusion coefficient feature map corresponding to the first image to be processed provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the other image to be processed of two images to be processed provided by an embodiment of the present disclosure;
FIG. 7 is a schematic comparison diagram of a fused image without enhancement and an enhanced fused image provided by an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of an image enhancement apparatus provided by an embodiment of the present disclosure.
The technical solutions of the present disclosure will be described clearly and completely below with reference to the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
First, an electronic device 100 for implementing the embodiments of the present disclosure is described with reference to FIG. 1; the electronic device may be used to run the image enhancement methods of the embodiments of the present disclosure.
As shown in FIG. 1, the electronic device 100 includes one or more processors 102, one or more memories 104, an input apparatus 106, an output apparatus 108 and a camera 110, which are interconnected through a bus system 112 and/or other forms of connection mechanisms (not shown). It should be noted that the components and structure of the electronic device 100 shown in FIG. 1 are only exemplary rather than limiting, and the electronic device may have other components and structures as required.
The processor 102 may be implemented in at least one hardware form of a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA) and an application-specific integrated circuit (ASIC). The processor 102 may be a central processing unit (CPU) or another form of processing unit having data processing capability and/or instruction execution capability, and may control other components in the electronic device 100 to perform desired functions.
The memory 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 102 may run the program instructions to implement the client functions (implemented by the processor) in the embodiments of the present disclosure described below and/or other desired functions. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input apparatus 106 may be an apparatus used by a user to input instructions, and may include one or more of a keyboard, a mouse, a microphone, a touch screen, etc.
The output apparatus 108 may output various information (e.g., images or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, etc.
The camera 110 is configured to capture at least two images to be processed, where the at least two images to be processed captured by the camera are processed by the image enhancement method to obtain an enhanced fused image. For example, the camera may capture an image desired by the user (e.g., a photo or a video), which is then processed by the image enhancement method to obtain an enhanced fused image; the camera may also store the captured image in the memory 104 for use by other components.
Exemplarily, the electronic device for implementing the image enhancement method according to the embodiments of the present disclosure may be implemented as a smart mobile terminal such as a smartphone or a tablet computer.
In recent years, research on artificial-intelligence-based technologies such as computer vision, deep learning, machine learning, image processing and image recognition has made important progress. Artificial intelligence (AI) is an emerging science and technology that studies and develops theories, methods, techniques and application systems for simulating and extending human intelligence. AI is a comprehensive discipline involving many kinds of technologies such as chips, big data, cloud computing, the Internet of Things, distributed storage, deep learning, machine learning and neural networks. As an important branch of AI, computer vision specifically enables machines to recognize the world; computer vision technologies usually include face recognition, liveness detection, fingerprint recognition and anti-counterfeiting verification, biometric recognition, face detection, pedestrian detection, object detection, pedestrian recognition, image processing, image recognition, image semantic understanding, image retrieval, optical character recognition, video processing, video content recognition, three-dimensional reconstruction, virtual reality, augmented reality, simultaneous localization and mapping (SLAM), computational photography, and robot navigation and positioning. With the research and progress of AI technology, it has been applied in many fields, such as security, city management, traffic management, building management, park management, face-based access, face attendance, logistics management, warehouse management, robots, intelligent marketing, computational photography, mobile-phone imaging, cloud services, smart homes, wearable devices, unmanned driving, autonomous driving, smart healthcare, face payment, face unlocking, fingerprint unlocking, identity verification, smart screens, smart TVs, cameras, the mobile Internet, live streaming, beautification, makeup, medical cosmetology and intelligent temperature measurement.
According to an embodiment of the present disclosure, an image enhancement method is provided. It should be noted that the steps shown in the flowcharts of the drawings may be executed in a computer system such as a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be executed in an order different from that described herein.
FIG. 2 is a flowchart of an image enhancement method according to an embodiment of the present disclosure. As shown in FIG. 2, the method includes the following steps:
Step S202: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among the at least two images to be processed to obtain a second image to be processed, wherein the at least two images to be processed have different exposure parameters.
In the embodiments of the present disclosure, the target scene may be any scene, and the at least two images to be processed may be images captured of the same target scene, or pre-stored images of the same target scene; the embodiments of the present disclosure do not specifically limit the manner of acquiring the at least two images to be processed.
It should be noted that the at least two images to be processed have the same scale (in the embodiments of the present disclosure, the scale denotes the size of an image, i.e., H*W) and different exposure parameters. In some embodiments, the exposure parameters may be, for example, exposure time, light flux, etc., which is not limited in the embodiments of the present disclosure.
After the at least two images to be processed are obtained, the brightness of the first image to be processed among them is processed to obtain the second image to be processed. The first image to be processed may be determined from the at least two images to be processed according to their brightness values.
Step S204: performing first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed.
The first fusion processing may be weighting the first image to be processed and the second image to be processed to obtain the fused image to be processed.
Step S206: performing second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image.
The enhanced fused image is in fact an enhanced high dynamic range image.
Compared with ordinary images, a High Dynamic Range (HDR) image can provide a wider dynamic range and more image detail: LDR (Low Dynamic Range) images with different exposure times are used, and the LDR image with the best detail for each exposure time is used to synthesize the final HDR image, which better reflects the visual effect of a real environment. In the embodiments of the present disclosure, the final enhanced HDR image is synthesized from the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, and the resulting enhanced high dynamic range image has a good image effect.
In the embodiments of the present disclosure, an image enhancement method is provided, including: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among them to obtain a second image to be processed; then performing first fusion processing on the first and second images to be processed to obtain a fused image to be processed; and finally performing second fusion processing on the fused image to be processed and the remaining images to be processed to obtain an enhanced fused image. As can be seen from the above description, the present disclosure performs the first fusion processing on the first and second images to be processed, so that the resulting fused image to be processed has a good image effect; the enhanced fused image obtained on the basis of this well-fused image in turn has a good image effect, which alleviates the technical problem that enhanced images obtained by image enhancement methods in the related art are of poor quality.
The above content briefly introduces the image enhancement method of the present disclosure; the specific content involved is described in detail below.
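As a rough, non-authoritative illustration of steps S202 to S206, the overall flow can be sketched in Python with NumPy. All names here are ours; the fixed gamma stands in for the adaptive AGCWD curve, and the plain average in the second fusion stands in for the full exposure-fusion weighting described later:

```python
import numpy as np

def enhance(images):
    """End-to-end sketch of the method (steps S202-S206).

    `images` is a list of equally sized float arrays in [0, 1] taken with
    different exposures. Helper behaviours are simplified stand-ins for
    the techniques named in the disclosure.
    """
    # S202: pick the brightest frame and brighten it (fixed gamma < 1)
    brightness = [float(img.mean()) for img in images]
    k = int(np.argmax(brightness))
    first = images[k]
    second = np.clip(first, 0.0, 1.0) ** 0.6
    # S204: first fusion via a Gaussian fusion-coefficient map
    mask = np.exp(-((first - 0.5) ** 2) / (2 * 0.2 ** 2))
    fused = first * mask + second * (1.0 - mask)
    # S206: second fusion - here a plain average with the remaining frames
    rest = [img for i, img in enumerate(images) if i != k]
    return np.mean(np.stack([fused] + rest), axis=0)

out = enhance([np.full((2, 2), 0.3), np.full((2, 2), 0.7)])
```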
In an optional embodiment of the present disclosure, performing the first fusion processing on the first and second images to be processed to obtain the fused image to be processed specifically includes: performing adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map; and performing the first fusion processing on the first and second images to be processed based on the fusion coefficient feature map to obtain the fused image to be processed.
Specifically, a formula may be used to perform adaptive weight detection on the first image to be processed, or an adaptive weight detection model may be used, so as to obtain the fusion coefficient feature map.
In an optional embodiment of the present disclosure, referring to FIG. 3, processing the brightness of the first image to be processed among the at least two images to be processed specifically includes the following steps:
Step S301: determining brightness values of the at least two images to be processed;
Step S302: determining the first image to be processed from the at least two images to be processed according to the brightness values;
Step S303: performing brightening processing on the first image to be processed to obtain the second image to be processed.
The first image to be processed may be the image with the highest brightness among the at least two images to be processed; the embodiments of the present disclosure do not specifically limit the first image to be processed.
Specifically, the adaptive gamma correction with weighting distribution (AGCWD) algorithm may be used to brighten the first image to be processed to obtain the second image to be processed. In the AGCWD algorithm, an adaptive gamma curve (a special tone curve) is designed according to the distribution function to brighten the image. When the gamma value equals 1, the curve is a straight line at 45° to the coordinate axes, meaning that the input and output densities are the same; a gamma value greater than 1 darkens the output, and a gamma value less than 1 brightens it.
The AGCWD algorithm is chosen mainly because it achieves the desired effect while remaining fast.
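The gamma behaviour just described (gamma = 1 leaves tones unchanged, gamma < 1 brightens, gamma > 1 darkens) can be illustrated with a minimal sketch. Note that this applies a fixed gamma chosen by hand; the adaptive, histogram-derived gamma that distinguishes AGCWD is omitted:

```python
import numpy as np

def gamma_brighten(img, gamma):
    """Apply the tone curve out = in**gamma to a float image in [0, 1].

    gamma == 1 leaves the image unchanged; gamma < 1 brightens and
    gamma > 1 darkens. The full AGCWD algorithm additionally derives
    gamma adaptively from the intensity histogram, omitted here.
    """
    return np.clip(img, 0.0, 1.0) ** gamma

img = np.full((2, 2), 0.25)
brighter = gamma_brighten(img, 0.5)   # 0.25 ** 0.5 == 0.5
same = gamma_brighten(img, 1.0)       # identity tone curve
```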
In an optional embodiment of the present disclosure, in step S204 above, performing adaptive weight detection on the first image to be processed to obtain the fusion coefficient feature map specifically includes: determining, based on the pixel value of each pixel in the first image to be processed, the weight value corresponding to each pixel, thereby obtaining the fusion coefficient feature map corresponding to the first image to be processed.
Optionally, adaptive weight detection is performed on each pixel in the first image to be processed by the following formula to obtain the weight value corresponding to each pixel:
W = ∏_c exp(-(u_c(x) - 0.5)² / (2σ²))
where W represents the weight value corresponding to each pixel in the first image to be processed, exp(-(u_c(x) - 0.5)² / (2σ²)) represents the Gaussian curve, c represents the number of channels, u_c(x) represents the pixel value of the pixel at coordinate x after normalization of the pixel values of the pixels in the first image to be processed, x represents the coordinate of a pixel in the first image to be processed, σ² represents the variance parameter of the Gaussian distribution, and 0.5 represents the ideal pixel value. The ideal pixel value generally takes a value between 0 and 1; in some embodiments it may also be set to a value greater or smaller than 0.5 as required. Preferably, σ equals 0.2.
FIG. 4 is a schematic diagram of the first image to be processed; FIG. 5 is a schematic diagram of the fusion coefficient feature map corresponding to the first image to be processed, obtained after performing adaptive weight detection on the first image to be processed of FIG. 4; FIG. 6 is a schematic diagram of the other of the two images to be processed.
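A minimal NumPy sketch of this per-pixel weight computation follows. The function name is ours; it assumes a float image already normalized to [0, 1], combines channels by a product, and uses σ = 0.2 and an ideal value of 0.5 as stated above:

```python
import numpy as np

def adaptive_weight_map(img, sigma=0.2, ideal=0.5):
    """Per-pixel fusion coefficients for a normalized float image.

    For each pixel, a Gaussian curve scores how close each channel's
    value u_c(x) is to the ideal value; per-channel scores are
    multiplied to give the weight W for that pixel.
    """
    img = np.asarray(img, dtype=np.float64)
    scores = np.exp(-((img - ideal) ** 2) / (2 * sigma ** 2))
    return scores.prod(axis=-1)  # combine channels by product

# A mid-gray pixel receives a higher weight than a near-white one.
img = np.array([[[0.5, 0.5, 0.5], [0.95, 0.95, 0.95]]])
w = adaptive_weight_map(img)
```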
In an optional embodiment of the present disclosure, the fusion coefficient feature map is the fusion coefficient feature map corresponding to the first image to be processed, and in step S204 above, performing the first fusion processing on the first and second images to be processed based on the fusion coefficient feature map specifically includes: determining, from the fusion coefficient feature map corresponding to the first image to be processed, the fusion coefficient feature map corresponding to the second image to be processed; and performing weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed, to obtain the fused image to be processed.
This specifically includes: performing the weighted average processing based on the weighted average formula I = I1 × mask + I2 × (1 - mask) to obtain the fused image to be processed, where I represents the fused image to be processed, I1 represents the first image to be processed, I2 represents the second image to be processed, mask represents the fusion coefficient feature map corresponding to the first image to be processed, and (1 - mask) represents the fusion coefficient feature map corresponding to the second image to be processed.
Specifically, in the first fusion processing, the first image to be processed is multiplied by the fusion coefficient feature map while the second image to be processed is multiplied by (1 - fusion coefficient feature map), and the two products are then added.
For example, when the first image to be processed A' is multiplied by the fusion coefficient feature map mask, the pixel value of each pixel in A' is multiplied by the corresponding weight value (i.e., fusion coefficient) in mask.
In the fused image to be processed obtained through the above first fusion processing, the low-light regions come from the second image to be processed and the highlight regions come from the first image to be processed, thereby achieving the effect of low-light enhancement.
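The element-wise blend I = I1 × mask + I2 × (1 - mask) described above can be sketched as follows (the function and variable names are ours, for illustration only):

```python
import numpy as np

def first_fusion(i1, i2, mask):
    """Weighted-average blend I = I1*mask + I2*(1 - mask), element-wise."""
    return i1 * mask + i2 * (1.0 - mask)

i1 = np.full((2, 2), 0.8)                  # stand-in for the first image to be processed
i2 = np.full((2, 2), 0.2)                  # stand-in for the brightened second image
mask = np.array([[1.0, 0.0], [0.5, 0.5]])  # per-pixel fusion coefficients
fused = first_fusion(i1, i2, mask)
```

Where mask is 1 the output copies i1, where it is 0 the output copies i2, and intermediate coefficients blend the two.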
In an optional embodiment of the present disclosure, in step S206 above, performing the second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed specifically includes:
determining the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed, to obtain the weight map corresponding to the fused image to be processed, and determining the weight corresponding to each pixel in the remaining images to be processed according to the contrast, saturation and exposure of each pixel in the remaining images to be processed, to obtain the weight maps corresponding to the remaining images to be processed; and performing weighted average processing on the fused image to be processed, the weight map corresponding to the fused image to be processed, the remaining images to be processed, and the weight maps corresponding to the remaining images to be processed, to obtain the enhanced fused image.
That is, an exposure fusion algorithm is used to perform the second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain the enhanced fused image.
The exposure fusion algorithm fuses multiple frames using three quality measures of an image: contrast, saturation and well-exposedness. It can extract information directly from a sequence of LDR images with different exposures and fuse them into a locally adaptively exposed HDR image (i.e., the enhanced fused image).
Specifically, due to underexposure and overexposure, many images in the sequence contain flat, colorless regions. Such regions should receive less weight, while interesting regions containing bright colors and details should be preserved. To this end, the following measures are adopted:
Contrast: a Laplacian filter is applied to the grayscale version of each image, and the absolute value of the filter response is taken. This yields a simple contrast indicator C, which tends to assign high weights to important elements such as edges and textures.
Saturation: as a photograph undergoes longer exposure, the resulting colors become desaturated and are eventually clipped. Saturated colors are desirable and make the image look vivid. A saturation measure S is included, computed as the standard deviation within the R, G and B channels at each pixel.
Well-exposedness: looking at the raw intensity within a channel reveals how well a pixel is exposed. The intensity should be kept away from 0 (underexposed) and 1 (overexposed). A Gaussian curve is used to weight each intensity i according to its closeness to 0.5: exp(-(i - 0.5)² / (2σ²)),
where σ = 0.2 in practice. To account for multiple color channels, the Gaussian curve is applied to each channel separately and the results are multiplied, yielding the measure E.
For each pixel, the information from the different measures is combined into a scalar weight map using multiplication. A product is chosen rather than a linear combination in order to enforce all the qualities defined by the measures at once (i.e., like an "and" selection, as opposed to an "or" selection). Analogously to the weighting terms of a linear combination, the influence of each measure can be controlled by a power function:
W_ij,k = (C_ij,k)^wC × (S_ij,k)^wS × (E_ij,k)^wE,
where C, S and E represent contrast, saturation and well-exposedness respectively, the exponents w represent their weights, and ij, k denote the (i, j)-th pixel of the k-th image. If an exponent equals 0, the corresponding measure is not taken into account. The final pixel weights are used to guide the fusion process.
The multiple images are fused by computing a weighted average at each pixel, using the weights computed in the above process. To obtain a consistent result, the weights are normalized so that they sum to 1 at each pixel (i, j); a weighted blend of the multiple images then yields the enhanced fused image.
The left image in FIG. 7 shows a fused image without enhancement, and the right image shows the enhanced fused image. As the comparison in FIG. 7 shows, compared with the fused image without enhancement, the highlight regions of the enhanced fused image (the upper boxes in the left and right images, respectively) do not spread and are not overexposed, while the low-light regions (the lower boxes in the left and right images, respectively) are well enhanced; the image effect is good.
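A simplified single-channel sketch of this exposure-fusion weighting and blending follows, assuming grayscale frames in [0, 1]. The simplifications are ours: the Laplacian uses wrap-around neighbours via np.roll for brevity, the saturation measure degenerates to 1 for single-channel input, and a small epsilon avoids all-zero weights:

```python
import numpy as np

def exposure_fusion(gray_imgs, wc=1.0, ws=1.0, we=1.0, sigma=0.2):
    """Fuse grayscale frames with weights W = C**wc * S**ws * E**wE."""
    weights = []
    for g in gray_imgs:
        # Contrast C: absolute response of a 4-neighbour Laplacian filter
        lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
               np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4.0 * g)
        c = np.abs(lap) + 1e-12
        # Saturation S: degenerate (all ones) for single-channel input
        s = np.ones_like(g)
        # Well-exposedness E: Gaussian around 0.5 with sigma = 0.2
        e = np.exp(-((g - 0.5) ** 2) / (2.0 * sigma ** 2))
        weights.append((c ** wc) * (s ** ws) * (e ** we))
    w = np.stack(weights)
    w /= w.sum(axis=0, keepdims=True)  # normalize: weights sum to 1 per pixel
    fused = (np.stack(gray_imgs) * w).sum(axis=0)
    return fused, w

rng = np.random.default_rng(0)
frames = [rng.random((8, 8)) * 0.3, rng.random((8, 8)) * 0.5 + 0.4]
fused, w = exposure_fusion(frames)
```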
Traditional image enhancement methods usually enhance a single image and cannot achieve real-time performance on mobile devices. The image enhancement method of the present disclosure, by contrast, targets high-dynamic-range scenes and uses images to be processed with different exposure times; through adaptive weight detection, the fusion coefficient feature maps of the highlight and low-light regions can be computed quickly, so the enhancement method of the present disclosure can obtain an enhanced high dynamic range image with a good effect, is compatible with low-light scenes with large overexposed areas, and offers good real-time performance.
An embodiment of the present disclosure further provides an image enhancement apparatus, which is mainly used to execute the image enhancement method provided in the above content of the embodiments of the present disclosure. The image enhancement apparatus provided by the embodiments of the present disclosure is specifically introduced below.
FIG. 8 is a schematic diagram of an image enhancement apparatus according to an embodiment of the present disclosure. As shown in FIG. 8, the image enhancement apparatus mainly includes a processing unit 10, a first fusion processing unit 20 and a second fusion processing unit 30, wherein:
the processing unit may be configured to acquire at least two images to be processed of a target scene, and process the brightness of a first image to be processed among the at least two images to be processed to obtain a second image to be processed, wherein the at least two images to be processed have different exposure parameters;
the first fusion processing unit may be configured to perform first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed;
the second fusion processing unit may be configured to perform second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image.
In the embodiments of the present disclosure, an image enhancement apparatus is provided which: acquires at least two images to be processed of a target scene and processes the brightness of a first image to be processed among them to obtain a second image to be processed; then performs first fusion processing on the first and second images to be processed to obtain a fused image to be processed; and finally performs second fusion processing on the fused image to be processed and the remaining images to be processed to obtain an enhanced fused image. As can be seen from the above description, the present disclosure performs the first fusion processing on the first and second images to be processed, so that the resulting fused image to be processed has a good image effect; the enhanced fused image obtained on the basis of this well-fused image in turn has a good image effect, which alleviates the technical problem that enhanced images obtained by image enhancement methods in the related art are of poor quality.
Optionally, the first fusion processing unit may also be configured to: perform adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map; and perform the first fusion processing on the first and second images to be processed based on the fusion coefficient feature map to obtain the fused image to be processed.
Optionally, the processing unit may also be configured to: determine brightness values of the at least two images to be processed; determine the first image to be processed from the at least two images to be processed according to the brightness values; and perform brightening processing on the first image to be processed to obtain the second image to be processed.
Optionally, the first fusion processing unit may also be configured to: determine, based on the pixel value of each pixel in the first image to be processed, the weight value corresponding to each pixel, thereby obtaining the fusion coefficient feature map corresponding to the first image to be processed.
Optionally, the first fusion processing unit may also be configured to: perform adaptive weight detection on each pixel in the first image to be processed by the following formula to obtain the weight value corresponding to each pixel:
W = ∏_c exp(-(u_c(x) - 0.5)² / (2σ²))
where W represents the weight value corresponding to each pixel in the first image to be processed, exp(-(u_c(x) - 0.5)² / (2σ²)) represents the Gaussian curve, c represents the number of channels, u_c(x) represents the pixel value of the pixel at coordinate x after normalization of the pixel values of the pixels in the first image to be processed, x represents the coordinate of a pixel in the first image to be processed, and σ² represents the variance parameter of the Gaussian distribution.
Optionally, the fusion coefficient feature map is the fusion coefficient feature map corresponding to the first image to be processed, and the first fusion processing unit may also be configured to: determine, from the fusion coefficient feature map corresponding to the first image to be processed, the fusion coefficient feature map corresponding to the second image to be processed; and perform weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed, to obtain the fused image to be processed.
Optionally, the first fusion processing unit may also be configured to: perform the weighted average processing based on the weighted average formula I = I1 × mask + I2 × (1 - mask) to obtain the fused image to be processed, where I represents the fused image to be processed, I1 represents the first image to be processed, I2 represents the second image to be processed, mask represents the fusion coefficient feature map corresponding to the first image to be processed, and (1 - mask) represents the fusion coefficient feature map corresponding to the second image to be processed.
Optionally, the second fusion processing unit may also be configured to: determine the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed, to obtain the weight map corresponding to the fused image to be processed; determine the weight corresponding to each pixel in the remaining images to be processed according to the contrast, saturation and exposure of each pixel in the remaining images to be processed, to obtain the weight maps corresponding to the remaining images to be processed; and perform weighted average processing on the fused image to be processed, the weight map corresponding to the fused image to be processed, the remaining images to be processed, and the weight maps corresponding to the remaining images to be processed, to obtain the enhanced fused image.
Optionally, the first image to be processed is the image with the highest brightness among the at least two images to be processed.
The image enhancement apparatus provided by the embodiments of the present disclosure has the same implementation principle and technical effect as the foregoing method embodiments; for brevity, where the apparatus embodiment is silent, reference may be made to the corresponding content in the foregoing method embodiments.
In another embodiment, a computer-readable medium having non-volatile program code executable by a processor is further provided, the program code causing the processor to execute the steps of the method according to any one of the above method embodiments.
In yet another embodiment of the present disclosure, a computer program product is further provided, the computer program product including a computer program which, when executed by a processor, implements the steps of the method according to any one of the above method embodiments.
In addition, in the description of the embodiments of the present disclosure, unless otherwise explicitly specified and defined, the terms "installation", "connected" and "connection" should be interpreted in a broad sense; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical or an electrical connection; and it may be a direct connection, an indirect connection through an intermediary, or internal communication between two elements. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present disclosure according to the specific situation.
In the description of the present disclosure, it should be noted that orientation or positional relationships indicated by terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner" and "outer" are based on the orientation or positional relationships shown in the drawings, and are used only for convenience and simplification of the description, rather than indicating or implying that the referred device or element must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be construed as limiting the present disclosure. Furthermore, the terms "first", "second" and "third" are used for descriptive purposes only and cannot be construed as indicating or implying relative importance.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, apparatuses and methods may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other division manners in actual implementation; as another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling, direct coupling or communication connection shown or discussed may be through some communication interfaces, and the indirect coupling or communication connection between apparatuses or units may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on such an understanding, the technical solution of the present disclosure in essence, or the part contributing to the related art, or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage media include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Finally, it should be noted that the above embodiments are only specific implementations of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person skilled in the art may, within the technical scope disclosed by the present disclosure, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
The present disclosure provides an image enhancement method and an electronic device, including: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among them to obtain a second image to be processed; performing first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed; and performing second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image. By performing the first fusion processing on the first and second images to be processed, the resulting fused image to be processed has a good image effect, and the enhanced fused image obtained on the basis of this well-fused image in turn has a good image effect.
Furthermore, it can be understood that the image enhancement method and electronic device of the present disclosure are reproducible and can be used in a variety of industrial applications. For example, the image enhancement method and electronic device disclosed in the present disclosure can be used in the technical field of image processing.
Claims (15)
- An image enhancement method, characterized by comprising: acquiring at least two images to be processed of a target scene, and processing the brightness of a first image to be processed among the at least two images to be processed to obtain a second image to be processed, wherein the at least two images to be processed have different exposure parameters; performing first fusion processing on the first image to be processed and the second image to be processed to obtain a fused image to be processed; performing second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed, to obtain an enhanced fused image.
- The method according to claim 1, characterized in that the exposure parameters comprise at least one of exposure time and light flux.
- The method according to claim 1 or 2, characterized in that the at least two images to be processed are low dynamic range images, and the enhanced fused image is a high dynamic range image.
- The method according to any one of claims 1 to 3, characterized in that performing the first fusion processing on the first image to be processed and the second image to be processed to obtain the fused image to be processed comprises: performing adaptive weight detection on the first image to be processed to obtain a fusion coefficient feature map; performing the first fusion processing on the first image to be processed and the second image to be processed based on the fusion coefficient feature map to obtain the fused image to be processed.
- The method according to any one of claims 1 to 4, characterized in that processing the brightness of the first image to be processed among the at least two images to be processed to obtain the second image to be processed comprises: determining brightness values of the at least two images to be processed; determining the first image to be processed from the at least two images to be processed according to the brightness values; performing brightening processing on the first image to be processed to obtain the second image to be processed.
- The method according to claim 4 or 5, characterized in that performing adaptive weight detection on the first image to be processed to obtain the fusion coefficient feature map comprises: determining, based on the pixel value of each pixel in the first image to be processed, the weight value corresponding to each pixel in the first image to be processed, thereby obtaining the fusion coefficient feature map corresponding to the first image to be processed.
- The method according to any one of claims 4 to 7, characterized in that the fusion coefficient feature map is the fusion coefficient feature map corresponding to the first image to be processed, and performing the first fusion processing on the first image to be processed and the second image to be processed based on the fusion coefficient feature map comprises: determining, from the fusion coefficient feature map corresponding to the first image to be processed, the fusion coefficient feature map corresponding to the second image to be processed; performing weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed, to obtain the fused image to be processed.
- The method according to claim 8, characterized in that performing weighted average processing on the first image to be processed, the fusion coefficient feature map corresponding to the first image to be processed, the second image to be processed, and the fusion coefficient feature map corresponding to the second image to be processed comprises: performing the weighted average processing based on the weighted average formula I = I1 × mask + I2 × (1 - mask) to obtain the fused image to be processed, wherein I represents the fused image to be processed, I1 represents the first image to be processed, I2 represents the second image to be processed, mask represents the fusion coefficient feature map corresponding to the first image to be processed, and (1 - mask) represents the fusion coefficient feature map corresponding to the second image to be processed.
- The method according to any one of claims 1 to 9, characterized in that performing the second fusion processing on the fused image to be processed and the remaining images to be processed, other than the first image to be processed, among the at least two images to be processed comprises: determining the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed, to obtain the weight map corresponding to the fused image to be processed, and determining the weight corresponding to each pixel in the remaining images to be processed according to the contrast, saturation and exposure of each pixel in the remaining images to be processed, to obtain the weight maps corresponding to the remaining images to be processed; performing weighted average processing on the fused image to be processed, the weight map corresponding to the fused image to be processed, the remaining images to be processed, and the weight maps corresponding to the remaining images to be processed, to obtain the enhanced fused image.
- The method according to claim 10, characterized in that determining the weight corresponding to each pixel in the fused image to be processed according to the contrast, saturation and exposure of each pixel in the fused image to be processed comprises: for each pixel, computing the weight corresponding to the pixel by the following formula: W_ij,k = (C_ij,k)^wC × (S_ij,k)^wS × (E_ij,k)^wE, wherein C, S and E represent contrast, saturation and well-exposedness respectively, the exponents w represent the weights of contrast, saturation and exposure, and ij, k denote the (i, j)-th pixel of the k-th image.
- The method according to any one of claims 1 to 9, characterized in that the first image to be processed is the image with the highest brightness among the at least two images to be processed.
- An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 11.
- A computer-readable medium having non-volatile program code executable by a processor, characterized in that the program code causes the processor to execute the steps of the method according to any one of claims 1 to 11.
- A computer program product, characterized in that the computer program product comprises a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 11.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110955265.7A CN113781370B (zh) | 2021-08-19 | 2021-08-19 | 图像的增强方法、装置和电子设备 |
CN202110955265.7 | 2021-08-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023020201A1 true WO2023020201A1 (zh) | 2023-02-23 |
Family
ID=78838444
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/107425 WO2023020201A1 (zh) | 2021-08-19 | 2022-07-22 | 图像的增强方法和电子设备 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113781370B (zh) |
WO (1) | WO2023020201A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116523775A (zh) * | 2023-04-14 | 2023-08-01 | 海的电子科技(苏州)有限公司 | 高速图像信号的增强优化方法和设备、存储介质 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113781370B (zh) * | 2021-08-19 | 2024-09-13 | 北京旷视科技有限公司 | 图像的增强方法、装置和电子设备 |
CN115293994B (zh) * | 2022-09-30 | 2022-12-16 | 腾讯科技(深圳)有限公司 | 图像处理方法、装置、计算机设备和存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101833754A (zh) * | 2010-04-15 | 2010-09-15 | 青岛海信网络科技股份有限公司 | 图像增强方法及系统 |
US20140365644A1 (en) * | 2013-05-09 | 2014-12-11 | Bay Sensors | Internet traffic analytics for non-internet traffic |
CN107220956A (zh) * | 2017-04-18 | 2017-09-29 | 天津大学 | 一种基于多幅具有不同曝光度的ldr图像的hdr图像融合方法 |
CN107845128A (zh) * | 2017-11-03 | 2018-03-27 | 安康学院 | 一种多尺度细节融合的多曝光高动态图像重建方法 |
CN110728648A (zh) * | 2019-10-25 | 2020-01-24 | 北京迈格威科技有限公司 | 图像融合的方法、装置、电子设备及可读存储介质 |
CN110751608A (zh) * | 2019-10-23 | 2020-02-04 | 北京迈格威科技有限公司 | 一种夜景高动态范围图像融合方法、装置和电子设备 |
CN113781370A (zh) * | 2021-08-19 | 2021-12-10 | 北京旷视科技有限公司 | 图像的增强方法、装置和电子设备 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101394487B (zh) * | 2008-10-27 | 2011-09-14 | 华为技术有限公司 | 一种合成图像的方法与系统 |
CN110136071B (zh) * | 2018-02-02 | 2021-06-25 | 杭州海康威视数字技术股份有限公司 | 一种图像处理方法、装置、电子设备及存储介质 |
CN109712097B (zh) * | 2019-01-04 | 2021-04-30 | Oppo广东移动通信有限公司 | 图像处理方法、装置、存储介质及电子设备 |
CN109903224B (zh) * | 2019-01-25 | 2023-03-31 | 珠海市杰理科技股份有限公司 | 图像缩放方法、装置、计算机设备和存储介质 |
CN110599433B (zh) * | 2019-07-30 | 2023-06-06 | 西安电子科技大学 | 一种基于动态场景的双曝光图像融合方法 |
CN110648290A (zh) * | 2019-09-06 | 2020-01-03 | 西安交通大学 | 一种基于sure参数优化的双核非局部均值图像去噪方法 |
CN110619610B (zh) * | 2019-09-12 | 2023-01-10 | 紫光展讯通信(惠州)有限公司 | 图像处理方法及装置 |
CN110611750B (zh) * | 2019-10-31 | 2022-03-22 | 北京迈格威科技有限公司 | 一种夜景高动态范围图像生成方法、装置和电子设备 |
CN111028190A (zh) * | 2019-12-09 | 2020-04-17 | Oppo广东移动通信有限公司 | 图像处理方法、装置、存储介质及电子设备 |
CN112215875A (zh) * | 2020-09-04 | 2021-01-12 | 北京迈格威科技有限公司 | 图像处理方法、装置和电子系统 |
CN112288664A (zh) * | 2020-09-25 | 2021-01-29 | 北京迈格威科技有限公司 | 高动态范围图像的融合方法、装置和电子设备 |
CN112634183B (zh) * | 2020-11-05 | 2024-10-15 | 北京迈格威科技有限公司 | 图像处理方法及装置 |
CN112598609B (zh) * | 2020-12-09 | 2024-07-19 | 普联技术有限公司 | 一种动态图像的处理方法及装置 |
CN112614064B (zh) * | 2020-12-18 | 2023-04-25 | 北京达佳互联信息技术有限公司 | 图像处理方法、装置、电子设备及存储介质 |
CN112907497B (zh) * | 2021-03-19 | 2022-08-16 | 苏州科达科技股份有限公司 | 图像融合方法以及图像融合装置 |
-
2021
- 2021-08-19 CN CN202110955265.7A patent/CN113781370B/zh active Active
-
2022
- 2022-07-22 WO PCT/CN2022/107425 patent/WO2023020201A1/zh active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101833754A (zh) * | 2010-04-15 | 2010-09-15 | 青岛海信网络科技股份有限公司 | 图像增强方法及系统 |
US20140365644A1 (en) * | 2013-05-09 | 2014-12-11 | Bay Sensors | Internet traffic analytics for non-internet traffic |
CN107220956A (zh) * | 2017-04-18 | 2017-09-29 | 天津大学 | 一种基于多幅具有不同曝光度的ldr图像的hdr图像融合方法 |
CN107845128A (zh) * | 2017-11-03 | 2018-03-27 | 安康学院 | 一种多尺度细节融合的多曝光高动态图像重建方法 |
CN110751608A (zh) * | 2019-10-23 | 2020-02-04 | 北京迈格威科技有限公司 | 一种夜景高动态范围图像融合方法、装置和电子设备 |
CN110728648A (zh) * | 2019-10-25 | 2020-01-24 | 北京迈格威科技有限公司 | 图像融合的方法、装置、电子设备及可读存储介质 |
CN113781370A (zh) * | 2021-08-19 | 2021-12-10 | 北京旷视科技有限公司 | 图像的增强方法、装置和电子设备 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116523775A (zh) * | 2023-04-14 | 2023-08-01 | 海的电子科技(苏州)有限公司 | 高速图像信号的增强优化方法和设备、存储介质 |
CN116523775B (zh) * | 2023-04-14 | 2023-11-07 | 海的电子科技(苏州)有限公司 | 高速图像信号的增强优化方法和设备、存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN113781370B (zh) | 2024-09-13 |
CN113781370A (zh) | 2021-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2023020201A1 (zh) | 图像的增强方法和电子设备 | |
CN110910486B (zh) | 室内场景光照估计模型、方法、装置、存储介质以及渲染方法 | |
CN110663045B (zh) | 用于数字图像的自动曝光调整的方法、电子系统和介质 | |
US9740916B2 (en) | Systems and methods for persona identification using combined probability maps | |
TWI766201B (zh) | 活體檢測方法、裝置以及儲存介質 | |
US11398041B2 (en) | Image processing apparatus and method | |
US9262696B2 (en) | Image capture feedback | |
WO2018005765A1 (en) | Systems and methods for capturing digital images | |
CN116324878A (zh) | 针对图像效果的分割 | |
WO2022160895A1 (zh) | 图像处理方法、图像处理装置、电子系统及可读存储介质 | |
WO2021057536A1 (zh) | 一种图像处理方法、装置、计算机设备以及存储介质 | |
JP2023521270A (ja) | 多様なポートレートから照明を学習すること | |
US20240296531A1 (en) | System and methods for depth-aware video processing and depth perception enhancement | |
US20160140748A1 (en) | Automated animation for presentation of images | |
CN114372931A (zh) | 一种目标对象虚化方法、装置、存储介质及电子设备 | |
CN111836058B (zh) | 用于实时视频播放方法、装置、设备以及存储介质 | |
CN115131419A (zh) | 一种形成丁达尔光效的图像处理方法及电子设备 | |
CN116797504A (zh) | 图像融合方法、电子设备及存储介质 | |
CN113920023B (zh) | 图像处理方法及装置、计算机可读介质和电子设备 | |
CN116055895B (zh) | 图像处理方法及其装置、芯片系统和存储介质 | |
Jung et al. | High dynamic range imaging on mobile devices using fusion of multiexposure images | |
US20220261970A1 (en) | Methods, systems and computer program products for generating high dynamic range image frames | |
CN112995635B (zh) | 图像的白平衡处理方法、装置、电子设备和存储介质 | |
CN112950641B (zh) | 图像处理方法及装置、计算机可读存储介质和电子设备 | |
JP2023078061A (ja) | イメージングにおける露出制御方法、装置、デバイス及び記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22857514 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22857514 Country of ref document: EP Kind code of ref document: A1 |