CN111986129B - HDR image generation method, equipment and storage medium based on multi-shot image fusion - Google Patents
- Publication number
- CN111986129B CN111986129B CN202010617015.8A CN202010617015A CN111986129B CN 111986129 B CN111986129 B CN 111986129B CN 202010617015 A CN202010617015 A CN 202010617015A CN 111986129 B CN111986129 B CN 111986129B
- Authority
- CN
- China
- Prior art keywords
- image
- exposure
- fusion
- hdr
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
- G06T7/38: Registration of image sequences
- G06T7/90: Determination of colour characteristics
- G06T2207/20016: Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; pyramid transform
- G06T2207/20208: High dynamic range [HDR] image processing
- G06T2207/20221: Image fusion; image merging
(All under section G, Physics; class G06T, image data processing or generation.)
Abstract
The invention discloses an HDR image generation method based on multi-camera image fusion, together with a terminal device and a computer storage medium. The method acquires, from multiple cameras, several images of the same scene captured simultaneously at different exposure levels, where each camera's current exposure parameters are derived from the previous frame's result. At least one abnormally exposed image is then aligned against the normally exposed image, and finally the luminance and chrominance channels of the aligned images are fused separately to generate an HDR image. This technical scheme generates HDR images with multiple cameras and avoids the ghosting and color-cast problems that successive exposures from a single camera can cause.
Description
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an HDR image generating method, apparatus, and storage medium based on multi-shot image fusion.
Background
Real-world scenes often span a very large dynamic range of brightness, while the dynamic range a sensor in a conventional digital imaging device (e.g., a camera) can perceive is much smaller. A single exposure therefore cannot reveal all detail in a high-dynamic-range scene: very bright or very dark regions become overexposed or underexposed. To make the imaging result rich in color detail and tonal levels and better match how the human eye perceives real-world scenes, high dynamic range (HDR) imaging has become an increasingly popular technique in digital imaging devices. The image obtained by HDR imaging is called an HDR image, as opposed to a low dynamic range (LDR) image; HDR images preserve detail from the darkest to the brightest areas of a scene, and video composed of HDR images is called HDR video.
Because no mature HDR sensor yet exists that can capture an HDR image in a single shot, current HDR video generation methods share the same principle: a single camera exposes the same scene multiple times with different exposure amounts so that the full brightness range of the scene is covered, the differently exposed images are merged into one HDR image, and a sequence of such HDR images forms the HDR video.
In the prior art, for example, patent CN105163047A proposes a method, system, and shooting terminal for generating an HDR image based on color-space conversion: at least three successively captured images with different exposure levels are converted into luminance and color components, the luminance components undergo camera-response-function mapping, weighted-sum fusion, and tone mapping, the color components are fused, and the HDR image is obtained by inverse conversion. However, when the scene contains a moving object, that object appears at different positions in the different photos, and directly fusing luminance produces ghosting, which shows up as double images, blurring, or trailing in the composite image. Moreover, fusing the two color components (the U and V channels in YUV space, or the A and B channels in LAB space) separately can break the difference relationship between the original color components, producing a color cast in the picture.
Disclosure of Invention
The invention provides an HDR image generation method, device, and storage medium based on multi-shot image fusion that generate an HDR image with multiple cameras and avoid the ghosting and color-cast problems that successive exposures from a single camera can cause.
To solve this technical problem, the invention provides an HDR image generation method based on multi-shot image fusion, comprising the following steps:
acquiring, from multiple cameras, several images of the same scene captured simultaneously at different exposure levels; the differently exposed images comprise one normally exposed image and at least one abnormally exposed image; the exposure parameters of the cameras capturing the abnormally exposed images are calculated from the image captured by the first camera in the previous frame; the first camera is the camera capturing the normally exposed image;
aligning the at least one abnormally exposed image using the normally exposed image as a reference;
and fusing the luminance channel and the chrominance channel of the aligned images separately to generate an HDR image.
As an improvement of the above solution, the exposure parameters of the cameras capturing the abnormally exposed images are calculated from the image captured by the first camera in the previous frame, specifically:
the normally exposed image captured by the first camera in the previous frame is taken as a first image;
the first image is downsampled, and an abnormal-exposure binary image is generated from the luminance-channel pixel values of the downsampled first image at every position;
from the abnormal-exposure binary image, first exposure parameters for the next frame are calculated for each camera other than the first camera, and these first exposure parameters are used as the exposure parameters of the cameras capturing the abnormally exposed images.
As an improvement of the above solution, calculating from the abnormal-exposure binary image the first exposure parameters for the next frame of the cameras other than the first camera comprises:
adjusting the first exposure parameters of the other cameras for the next frame according to the proportion of abnormally exposed pixels in the binary image to the total number of pixels, in combination with preset proportion intervals, so that the exposure parameters of all cameras form a ladder centered on the first camera, with the overall exposure parameters of the other cameras increasing and decreasing stepwise.
As an improvement of the above solution, when all cameras use the same sensor model, the exposure parameter is the exposure time.
As an improvement of the above solution, aligning the at least one abnormally exposed image using the normally exposed image as a reference specifically comprises:
aligning the at least one abnormally exposed image against the normally exposed image using threshold-bitmap alignment, feature-point alignment, or block-matching alignment.
As an improvement of the above solution, fusing the luminance channel and the chrominance channel of the aligned images to generate an HDR image specifically comprises:
fusing the luminance channels of the aligned images using a multi-exposure fusion method based on Laplacian-pyramid decomposition;
fusing the chrominance channels of the aligned images by taking, at each position, the chroma value of whichever image has the largest absolute color difference there;
and generating the HDR image from the images obtained by fusing the luminance channel and the chrominance channel.
As an improvement of the above solution, fusing the luminance channels of the aligned images using a multi-exposure fusion method based on Laplacian-pyramid decomposition specifically comprises:
calculating a weight map for each image to be fused;
performing an L-level Laplacian-pyramid decomposition of the luminance channel of each image to be fused, and an L-level Gaussian-pyramid decomposition of each weight map, where L is a positive integer;
and calculating the fused Laplacian component of each layer and reconstructing to obtain the luminance-fused image.
An embodiment of the invention also provides an HDR video generation method based on multi-shot image fusion, which generates multiple HDR images and generates an HDR video from the generated HDR images.
An embodiment of the invention also provides a terminal device comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor; the HDR image generation method based on multi-shot image fusion is implemented when the processor executes the computer program.
An embodiment of the invention also provides a computer-readable storage medium comprising a stored computer program which, when run, controls the device on which the storage medium resides to execute any of the HDR image generation methods based on multi-shot image fusion described above.
The embodiment of the invention has the following beneficial effects:
the invention provides an HDR image generation method based on multi-shot image fusion, terminal equipment and a computer storage medium, wherein the method acquires a plurality of images with different exposure degrees, which are shot simultaneously in the same scene, from a plurality of cameras, and the current exposure parameters of the cameras are generated by the last frame result; then, aligning at least one abnormal exposure image by taking the normal exposure image as a reference; and finally, respectively fusing the brightness channel and the chromaticity channel of the aligned images to generate an HDR image. Compared with the prior art that images with different exposure degrees are continuously photographed by using a single camera, and then the HDR images are generated by fusion, the technical scheme of the invention can realize that the HDR images are generated by using multiple cameras, and the problems of ghosting and color cast possibly caused by continuous exposure of the single camera are avoided.
Furthermore, when the invention performs chromaticity fusion, the corresponding chromaticity value with the largest absolute value of the chromatic aberration is selected, so that the difference relation among color channels is maintained, and the color cast problem in the prior art is further avoided.
Furthermore, in the invention, when the brightness is fused, downsampling is firstly carried out and then Gaussian blur is carried out, so that the data processing time can be shortened and the fusion efficiency can be improved.
Drawings
FIG. 1 is a schematic flow chart of one embodiment of a HDR image generation method based on multi-shot image fusion provided by the present invention;
FIG. 2 is a schematic diagram of one embodiment of the filtering template provided by the present invention;
FIG. 3 is a flowchart illustrating an embodiment of luminance channel fusion according to the present invention;
fig. 4 is a schematic structural diagram of an embodiment of a terminal device provided by the present invention.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings, which show some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art from these embodiments without inventive effort fall within the scope of the invention.
Referring to fig. 1, a schematic flowchart of an embodiment of the HDR image generation method based on multi-shot image fusion provided by the invention, the method includes steps 101 to 103, as follows:
step 101: acquiring a plurality of images with different exposure degrees, which are shot simultaneously in the same scene, from a plurality of cameras; wherein the different exposure images include: a normally exposed image and at least one abnormally exposed image; the exposure parameters corresponding to the cameras for shooting the abnormal exposure images are obtained by calculation according to the image shot by the first camera in the last frame.
In this embodiment, a plurality of cameras are used to shoot the same scene at the same time, and the current exposure parameter of each camera is generated by the last frame result. The photographed images are images with different exposure degrees, and include a normal exposure image and at least one abnormal exposure image. The different exposure levels are classified into overexposure, underexposure and normal exposure. The camera shooting normal exposure is a first camera, and other cameras shoot abnormal exposure images.
In this embodiment, the exposure parameters of a camera capturing an abnormally exposed image are calculated from the image captured by the first camera in the previous frame, specifically through the following steps:
Step 1011: take the normally exposed image captured by the first camera in the previous frame as the first image.
Step 1012: downsample the first image and generate an abnormal-exposure binary image from the luminance-channel pixel values of the downsampled first image at every position.
Step 1013: from the abnormal-exposure binary image, calculate the first exposure parameters for the next frame of each camera other than the first camera, and use them as the exposure parameters of the cameras capturing the abnormally exposed images.
To better illustrate the principle of this step, an example follows.
1) The normally exposed image captured by the first camera in the previous frame is the first image, and the abnormally exposed images captured by the other cameras in the previous frame are the second images. Here the second images stand for all abnormally exposed images from the other cameras (at least one image), so the first and second images together comprise K images: the normally exposed image img_normal and the abnormally exposed images img_abnormal_k (k = 1, ..., K-1). The images are in YUV format; if an image is not in YUV format, it is converted to YUV first.
2) Downsample the normally exposed image img_normal to obtain img_normal_down, whose height and width are 1/n1 and 1/n2 of the original, where n1 and n2 are positive integers (e.g., n1 = 8, n2 = 8). Examine the luminance-channel value y_{i,j} of img_normal_down at each position (i, j): if y_{i,j} is less than a threshold T1 (e.g., 25) or greater than a threshold T2 (e.g., 230), the pixel at that position is abnormally exposed and the value at (i, j) is set to y_max (e.g., 1); if T1 <= y_{i,j} <= T2, the pixel is normally exposed and the value at (i, j) is set to 0. After all pixels have been tested and set, the abnormal-exposure binary image img_mask is obtained.
3) Before the next step, a morphological erosion can be applied to the abnormal-exposure binary image img_mask to remove noise, yielding an image img_mask2. The erosion can be implemented by filtering the image with a 3x3 template, as shown in fig. 2, whose entries are the weights 0 and 1, with 0 at the current position: the filter is centered on the pixel at the 0-weight position, and the filter response is the sum of the surrounding pixels. If the response at a pixel position does not equal 8 * y_max, the pixel value at that position is set to 0; otherwise it keeps the value y_max.
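Steps 2) and 3) can be sketched in NumPy as follows. This is an illustrative sketch, not the patent's implementation: the function names are hypothetical, nearest-neighbour downsampling stands in for whatever downsampler is actually used, and the erosion follows the 8-neighbour-sum rule of the 3x3 template described above (border pixels are simply cleared).

```python
import numpy as np

def abnormal_exposure_mask(y, n1=8, n2=8, t1=25, t2=230, y_max=1):
    """Downsample the luminance channel by (n1, n2), then mark pixels
    outside [t1, t2] as abnormally exposed (value y_max), others as 0."""
    down = y[::n1, ::n2]  # nearest-neighbour downsampling sketch
    return np.where((down < t1) | (down > t2), y_max, 0).astype(np.uint8)

def erode_mask(mask, y_max=1):
    """3x3 erosion sketch: a pixel keeps y_max only if the sum of its
    8 neighbours equals 8 * y_max, matching the template rule above."""
    out = np.zeros_like(mask)
    h, w = mask.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            s = mask[i - 1:i + 2, j - 1:j + 2].sum() - mask[i, j]
            out[i, j] = y_max if s == 8 * y_max else 0
    return out
```

For example, a frame whose left half is blown out produces a mask that is 1 on the left and 0 on the right after downsampling, and the erosion then removes isolated single-pixel responses.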
4) After noise removal, step 1013 specifically becomes: adjust the first exposure parameters of the other cameras for the next frame according to the proportion of abnormally exposed pixels in img_mask2 to the total number of pixels, combined with preset proportion intervals, so that the exposure parameters of all cameras form a ladder centered on the first camera, with the overall exposure parameters of the other cameras increasing and decreasing stepwise.
The exposure parameters include exposure time, exposure gain, aperture, and so on; since sensors of different lens models differ, the adjustment may differ too. The goal is that the adjusted parameters yield a stepped set of exposures, e.g., five images ranging from maximally overexposed through slightly overexposed, normally exposed, and slightly underexposed to maximally underexposed, forming the ladder described above.
If all cameras use the same sensor model, the adjusted parameter may be, but is not limited to, the exposure time. For example, when abnormally exposed pixels make up 1/2 of the image and the normally exposed lens uses exposure time t, the most overexposed lens uses 2t for the next frame and the remaining overexposure lenses are spaced proportionally within [t, 2t], while the underexposure lenses are spaced proportionally within [t/2, t]. When the proportion is 1/3, the factor is 1 - 1/3 = 2/3, giving the interval [t, 3t/2] for the overexposure lenses and [2t/3, t] for the underexposure lenses.
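The exposure-time ladder above can be sketched as follows. The endpoint formulas t/(1-ratio) and (1-ratio)*t are inferred from the two worked examples in the text (ratio 1/2 gives [t/2, 2t], ratio 1/3 gives [2t/3, 3t/2]); the even spacing inside each interval is an assumption, since the text only says the lenses are "spaced proportionally".

```python
def exposure_times(t, ratio, n_over, n_under):
    """Stepped exposure-time ladder sketch. `t` is the normally exposed
    lens's time and `ratio` the fraction of abnormally exposed pixels in
    the previous frame. Returns evenly spaced times for the n_over
    overexposure lenses in (t, t/(1-ratio)] and the n_under
    underexposure lenses in [(1-ratio)*t, t)."""
    hi = t / (1.0 - ratio)   # most overexposed time, e.g. 2t at ratio 1/2
    lo = t * (1.0 - ratio)   # most underexposed time, e.g. t/2 at ratio 1/2
    over = [t + (hi - t) * k / n_over for k in range(1, n_over + 1)]
    under = [t - (t - lo) * k / n_under for k in range(1, n_under + 1)]
    return over, under
```

With t = 100 and ratio = 0.5, two overexposure lenses get 150 and 200 while two underexposure lenses get 75 and 50, reproducing the [t, 2t] and [t/2, t] intervals of the example.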
The invention shoots the next frame's exposures with the adjusted exposure parameters, so that the overall control and capture become self-adaptive and self-adjusting, providing a workable way for multiple cameras to generate HDR images.
Step 102: at least one of the non-normally exposed images is aligned with respect to the normally exposed image.
In this embodiment, step 102 specifically includes: and aligning at least one abnormal exposure image by taking the normal exposure image as a reference according to a threshold bitmap alignment, feature point alignment or block matching alignment method.
Taking the block matching alignment method as an example, the alignment method includes the steps of:
step 1021: the normal exposure image img_normal is divided into m×n blocks. m and n are positive integers.
Step 1022: at each of the different exposure images img_abnormal k (k=1..a., K-1) find every block_normal with img_normal j (j=1, …, m×n) best match block_abnormal j,k (j=1,..m×n), and the block is combined with block_normal j The length and the width are consistent. The best match may be with img_abnormal k In comparison with other blocks of uniform length, which are block normal j The sum of the absolute values of the differences of the corresponding pixels is minimal. When searching for a matching block, the matching calculation can be performed according to the following formula:
wherein the parameter n 1 ,n 2 Pixel coordinates, n, within a block b of the img normal image 1 +d 1 ,n 2 +d 2 For img_abnormal image pixel coordinates within a block b', where there is an integral d in position for both blocks 1 ,d 2 Positional deviation of N 1 ,N 2 The length and width of the blocks are the same, and the length and width of the blocks b and b' are the same.
Step 1023: by the method in img_abnormal k Search for each block_normal j Can be matched with img_abnormal k Match graph img_abnormal_align with consistent length and width k Thereby achieving alignment of the pictures.
In this embodiment, the threshold bitmap alignment and the feature point alignment are in the prior art, and are not described herein.
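The SAD search of step 1022 can be sketched as a brute-force loop. This is an illustrative sketch under stated assumptions: the search window `search` and the function name are hypothetical, and real implementations restrict and accelerate this search rather than testing every displacement.

```python
import numpy as np

def best_match(ref_block, target, top, left, search=4):
    """For a reference block whose top-left corner is (top, left) in the
    normally exposed image, search the abnormally exposed image `target`
    within +/- `search` pixels and return the displacement (d1, d2) that
    minimises the sum of absolute differences (SAD), per the formula above."""
    h, w = ref_block.shape
    best, best_sad = (0, 0), np.inf
    for d1 in range(-search, search + 1):
        for d2 in range(-search, search + 1):
            r0, c0 = top + d1, left + d2
            if r0 < 0 or c0 < 0 or r0 + h > target.shape[0] or c0 + w > target.shape[1]:
                continue  # candidate block falls outside the image
            cand = target[r0:r0 + h, c0:c0 + w].astype(np.int32)
            sad = np.abs(ref_block.astype(np.int32) - cand).sum()
            if sad < best_sad:
                best_sad, best = sad, (d1, d2)
    return best, best_sad
```

For a target that is simply a shifted copy of the reference, the returned displacement recovers the shift with SAD 0.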
Step 103: and respectively fusing the brightness channel and the chromaticity channel of the aligned images to generate an HDR image.
In this embodiment, step 103 includes steps 1031 to 1033, and each step is specifically as follows:
step 1031: and (3) performing brightness channel fusion on the aligned images by adopting a multi-exposure fusion method of Laplacian pyramid decomposition.
Step 1032: and carrying out chroma channel fusion on the aligned images according to the corresponding chroma value when the absolute value of the chromatic aberration of the plurality of images on the same position is maximum.
Step 1033: and generating the HDR image according to the image obtained by fusing the brightness channel and the chromaticity channel.
In this embodiment, step 1031 is specifically: calculating a weight map of each image to be fused; l-level Laplacian pyramid decomposition is carried out on the brightness channel of each image to be fused, and L-level Gaussian pyramid decomposition is carried out on each weight graph; l is a positive integer; and calculating the Laplace component of each layer after fusion, and reconstructing to obtain the image after brightness fusion.
To better illustrate step 1031, please refer to fig. 3, a schematic flowchart of an embodiment of luminance-channel fusion provided by the invention. As shown in fig. 3:
a) For the K aligned images, calculate the weight map W_k of each image to be fused. W_k here assigns a weight to each position of the luminance channel Y of image k, computed by the following method:
where y_k(i, j) is the luminance value at position (i, j), with y_k(i, j) in [0, 255].
After the weight at each position has been computed, the weights are normalized per position: W_k(i, j) is divided by the sum W_1(i, j) + ... + W_K(i, j), so that the K weights at each position sum to 1.
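A sketch of the weight-map step follows. The patent's own weight formula is given by the equation above; the Gaussian well-exposedness weight around mid-grey used here is an ASSUMED stand-in, not taken from the source, while the per-position normalisation matches the text.

```python
import numpy as np

def weight_maps(y_list, sigma=0.2):
    """Per-image luminance weights plus per-position normalisation.
    The exp(-(y/255 - 0.5)^2 / (2 sigma^2)) well-exposedness weight is an
    assumed stand-in for the patent's weight formula; the final division
    makes the K weights at each pixel sum to 1, as the text requires."""
    ws = []
    for y in y_list:
        yn = y.astype(np.float64) / 255.0       # y_k(i, j) mapped to [0, 1]
        ws.append(np.exp(-((yn - 0.5) ** 2) / (2.0 * sigma ** 2)))
    total = np.sum(ws, axis=0) + 1e-12          # guard against divide-by-zero
    return [w / total for w in ws]
```

With this stand-in weight, a mid-grey pixel outweighs a blown-out one, which is the qualitative behaviour multi-exposure fusion needs.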
b) Perform an L-level Laplacian-pyramid decomposition of the luminance channel of each of the K images, and an L-level Gaussian-pyramid decomposition of each weight map.
i. First compute, for each image, the L Laplacian components on the luminance channel, lpY_1(0) ~ lpY_1(L-1), ..., lpY_K(0) ~ lpY_K(L-1), together with the top-level Gaussian components gsY_1(L), ..., gsY_K(L). The calculation is as follows:
Starting from the M x N Y-channel image g_0 of each image, downsample by 2 and then pass the downsampled image through a symmetric low-pass filter w (e.g., 3 x 3) to obtain the (M/2) x (N/2) Gaussian image g_1; repeating this L times yields a series of successively halved images g_0, ..., g_L. Upsampling g_1 by a factor of 2 gives an M x N image EXPAND(g_1), where EXPAND(.) denotes the 2x upsampling operation. The layer-0 Laplacian is lpY(0) = g_0 - EXPAND(g_1); applying this L-1 more times gives the Laplacian component of each layer, lpY(i) = g_i - EXPAND(g_{i+1}) (i = 1, ..., L-1), and the top-level Gaussian component gsY(L) = g_L. The low-pass filter w is a symmetric 3 x 3 kernel.
then, L+1 Laplace components gsW in weight are calculated for each graph 1 (0)~gsW 1 (L-1),……,gsW K (0)~gsW K (L-1). The calculation method comprises the following steps: the weight map W (i.e., gsW (0)) for each map is 1/2 downsampled to a size mxn, and then the downsampled image is passed through a symmetrical low pass filter ω (e.g., 3 x 3) to obtain a gaussian image gsW (1), which is repeated L times to obtain a series of progressively halved size images gsW (0),. GsW (L).
In this step, downsampling before the Gaussian blur cuts the computation time by more than half. In addition, a 3 x 3 Gaussian template is used where 5 x 5 is usual: once the pyramid has enough levels the two give no obvious difference in effect, while the 3 x 3 template computes roughly five times faster than the 5 x 5 one.
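The decomposition of step b) can be sketched as follows. Assumptions: the 3x3 kernel is taken to be the separable [1, 2, 1]/4 binomial filter (the text only says "symmetric 3x3"), EXPAND is implemented as nearest-neighbour 2x upsampling followed by the same blur, and the downsample-then-blur order follows the text's speed argument.

```python
import numpy as np

KERNEL = np.outer([1, 2, 1], [1, 2, 1]) / 16.0  # assumed symmetric 3x3 low-pass

def blur3(img):
    """3x3 convolution with reflected borders."""
    p = np.pad(img, 1, mode='reflect')
    out = np.zeros(img.shape, dtype=np.float64)
    for di in range(3):
        for dj in range(3):
            out += KERNEL[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

def expand(img):
    """EXPAND(.) sketch: 2x nearest-neighbour upsampling, then low-pass."""
    return blur3(img.repeat(2, axis=0).repeat(2, axis=1))

def laplacian_pyramid(y, levels):
    """Downsample first, then blur (the order the text advocates),
    producing lpY(0..levels-1) and the top Gaussian gsY(levels)."""
    g = [y.astype(np.float64)]
    for _ in range(levels):
        g.append(blur3(g[-1][::2, ::2]))
    lp = [g[i] - expand(g[i + 1]) for i in range(levels)]
    return lp, g[levels]
```

Because each Laplacian level stores exactly the residual g_i - EXPAND(g_{i+1}), adding the expanded top level back down the pyramid recovers the input exactly.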
c) Calculate the fused Laplacian component of each layer.
Use gsW_1(0) ~ gsW_1(L-1), ..., gsW_K(0) ~ gsW_K(L-1) to weight the corresponding levels of lpY_1(0) ~ lpY_1(L-1), gsY_1(L), ..., lpY_K(0) ~ lpY_K(L-1), gsY_K(L). Let lpF(i)(x, y) denote the value of the fused layer-i Laplacian decomposition at coordinates (x, y) and gsF(L)(x, y) the value of the fused top-level Gaussian decomposition at (x, y); the weighted fusion is lpF(i)(x, y) = sum over k of gsW_k(i)(x, y) * lpY_k(i)(x, y), and likewise gsF(L)(x, y) = sum over k of gsW_k(L)(x, y) * gsY_k(L)(x, y).
d) Reconstruct to obtain the fused luminance channel. The fused Laplacian pyramid is collapsed recursively from the top layer down, layer by layer: G_L = gsF(L), and G_i = lpF(i) + EXPAND(G_{i+1}) for i = L-1, ..., 0, where EXPAND(.) denotes the 2x upsampling operation. When the recursion finishes, G_0 is the final luminance fusion result.
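Steps c) and d) together can be sketched as a weighted per-level sum followed by a top-down collapse. This sketch assumes the Laplacian and weight pyramids were built with the same EXPAND used here; for brevity EXPAND is plain nearest-neighbour upsampling without the low-pass, which is a simplification.

```python
import numpy as np

def expand(img):
    """EXPAND(.) sketch: 2x nearest-neighbour upsampling. Must match the
    EXPAND used when the pyramids were built for exact reconstruction."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def fuse_and_collapse(lp_pyramids, gs_tops, w_pyramids, w_tops):
    """lpF(i) = sum_k gsW_k(i) * lpY_k(i) and gsF = sum_k wTop_k * gsY_k(L),
    then G_L = gsF and G_i = lpF(i) + EXPAND(G_{i+1}) down to G_0."""
    levels = len(lp_pyramids[0])
    lpF = [sum(w[i] * lp[i] for w, lp in zip(w_pyramids, lp_pyramids))
           for i in range(levels)]
    g = sum(wt * gt for wt, gt in zip(w_tops, gs_tops))  # fused top Gaussian
    for i in range(levels - 1, -1, -1):                  # collapse top-down
        g = lpF[i] + expand(g)
    return g
```

With a single image and all-ones weights, collapsing its own pyramid returns the original luminance, which is a useful sanity check on the recursion.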
In this embodiment, the chroma fusion of step 1032 is specifically: for the K aligned images, the fused chroma values (u_f(x, y), v_f(x, y)) at coordinates (x, y) are the chroma values of whichever image has the largest absolute color difference at that position. In YUV space the fused chroma at (x, y) is computed as follows:
pos = argmax over i of [ abs(u_i(x, y)) + abs(v_i(x, y)) ], and (u_f(x, y), v_f(x, y)) = (u_pos(x, y), v_pos(x, y)),
where u_i(x, y) and v_i(x, y) are the u and v values at coordinates (x, y) of the i-th of the K aligned images (expressed as signed color differences), abs is the absolute-value operation, and argmax returns the value of the parameter i at which the result is maximal. The first equation therefore compares the values of the K images at (x, y), takes the value of i at the maximum, and assigns it to pos; in the second equation, (u_f(x, y), v_f(x, y)) is the fused value.
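A sketch of the chroma selection follows. The scoring abs(u - 128) + abs(v - 128) is an ASSUMED reading of "largest absolute colour difference" for 8-bit YUV, where 128 is the neutral chroma value (equivalent to abs of the signed colour-difference signals); the source may combine the two channels differently.

```python
import numpy as np

def fuse_chroma(u_stack, v_stack):
    """Per-pixel chroma selection sketch: from the K aligned images,
    pick the (u, v) pair whose distance from neutral chroma (128 in
    8-bit YUV) is largest, and copy both channels from that image."""
    u = np.asarray(u_stack, dtype=np.int32)   # shape (K, H, W)
    v = np.asarray(v_stack, dtype=np.int32)
    score = np.abs(u - 128) + np.abs(v - 128)
    pos = np.argmax(score, axis=0)            # per-pixel winning image index
    rows, cols = np.indices(pos.shape)
    return u[pos, rows, cols], v[pos, rows, cols]
```

Copying u and v as a pair from the winning image, rather than maximising each channel independently, is what preserves the difference relationship between the chroma channels.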
Finally, in step 1033, combining the fusion results of the luminance channel and the chrominance channel yields an HDR image with the same height and width as the normally exposed image.
From the above, the invention provides an HDR image generation method based on multi-shot image fusion, which acquires a plurality of images with different exposure degrees shot simultaneously by the same scene from a plurality of cameras, and the current exposure parameters of the cameras are generated by the previous frame result; then, aligning at least one abnormal exposure image by taking the normal exposure image as a reference; and finally, respectively fusing the brightness channel and the chromaticity channel of the aligned images to generate an HDR image. Compared with the prior art that images with different exposure degrees are continuously photographed by using a single camera, and then the HDR images are generated by fusion, the technical scheme of the invention can realize that the HDR images are generated by using multiple cameras, and the problems of ghosting and color cast possibly caused by continuous exposure of the single camera are avoided.
Furthermore, when performing chromaticity fusion, the invention selects the chromaticity value corresponding to the largest color-difference absolute value, so that the difference relationship among the color channels is maintained, further avoiding the color cast problem of the prior art.
Furthermore, when fusing luminance, the invention downsamples first and then applies Gaussian blur, which shortens data processing time and improves fusion efficiency.
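To make the downsample-then-blur point concrete, the sketch below implements a small Laplacian-pyramid luminance fusion in NumPy: each pyramid level is blurred and then decimated, so every level holds a quarter of the previous level's pixels and per-pixel work shrinks rapidly down the pyramid. The 5-tap kernel, the three levels and the simple well-exposedness weight are illustrative choices, not parameters taken from the patent:

```python
import numpy as np

K5 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0  # 5-tap binomial (Gaussian-like) kernel

def blur(img):
    """Separable 5-tap blur with edge replication."""
    p = np.pad(img, 2, mode="edge")
    t = sum(w * p[:, k:k + img.shape[1]] for k, w in enumerate(K5))
    return sum(w * t[k:k + img.shape[0], :] for k, w in enumerate(K5))

def down(img):
    # blur then decimate: the next level has 1/4 of this level's pixels
    return blur(img)[::2, ::2]

def up(img, shape):
    out = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return blur(out[:shape[0], :shape[1]])

def fuse_luma(lumas, levels=3):
    """Fuse K aligned luminance planes (values in [0, 1]) at full resolution."""
    # simple well-exposedness weight, normalized across the K images
    w = [np.exp(-((y - 0.5) ** 2) / 0.08) for y in lumas]
    wsum = np.sum(w, axis=0) + 1e-12
    w = [wi / wsum for wi in w]
    # Gaussian pyramids of the weights, Laplacian pyramids of the images
    fused_levels = []
    gw, gy = w, lumas
    for _ in range(levels):
        gw_next = [down(wi) for wi in gw]
        gy_next = [down(yi) for yi in gy]
        lap = [yi - up(yn, yi.shape) for yi, yn in zip(gy, gy_next)]
        fused_levels.append(sum(wi * li for wi, li in zip(gw, lap)))
        gw, gy = gw_next, gy_next
    base = sum(wi * yi for wi, yi in zip(gw, gy))
    # reconstruct from the coarsest level back to full resolution
    out = base
    for lvl in reversed(fused_levels):
        out = up(out, lvl.shape) + lvl
    return out
```

fuse_luma returns a fused plane at the same resolution as its inputs; the weight maps themselves only ever need blurring at the reduced pyramid resolutions, which is the efficiency point made above.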
Correspondingly, the invention also provides an HDR video generation method based on multi-shot image fusion, which comprises: generating a plurality of HDR images according to the HDR image generation method above, and generating an HDR video from the generated HDR images. Generating a video from a sequence of images is prior art and is not described here.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of a terminal device provided by the present invention.
The terminal device provided by the embodiment of the invention comprises a processor 71, a memory 72 and a computer program stored in the memory 72 and configured to be executed by the processor 71, wherein the steps of the HDR image generation method based on multi-shot image fusion in the embodiment are realized when the processor 71 executes the computer program, for example, all the steps of the HDR image generation method shown in fig. 1.
In addition, an embodiment of the present invention further provides a computer readable storage medium, where the computer readable storage medium includes a stored computer program, and when the computer program runs, the device in which the computer readable storage medium is located is controlled to execute the HDR image generation method based on multi-shot image fusion according to any one of the embodiments above.
It will be appreciated by those skilled in the art that the schematic diagram is merely an example of a terminal device and does not constitute a limitation of the terminal device, which may include more or fewer components than illustrated, combine certain components, or have different components; for example, the terminal device may further include an input-output device, a network access device, a bus, and the like.
The processor 71 may be a central processing unit (Central Processing Unit, CPU), but may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general purpose processor may be a microprocessor, or any conventional processor; the processor 71 is the control center of the terminal device and connects the various parts of the entire terminal device using various interfaces and lines.
The memory 72 may be used to store the computer program and/or module, and the processor 71 implements the various functions of the terminal device by running or executing the computer program and/or module stored in the memory 72 and invoking data stored in the memory 72. The memory 72 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the terminal device, and the like. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The integrated modules/units of the terminal device, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the above method embodiments through a computer program instructing the relevant hardware, where the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of the invention, and such changes and modifications are also intended to be within the scope of the invention.
Those skilled in the art will appreciate that implementing all or part of the above-described embodiments may be accomplished by way of computer programs, which may be stored on a computer readable storage medium, which when executed may comprise the steps of the above-described embodiments. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), or the like.
Claims (8)
1. An HDR image generation method based on multi-shot image fusion, comprising:
acquiring a plurality of images with different exposure degrees, which are shot simultaneously in the same scene, from a plurality of cameras; wherein the different exposure degree images include: a normally exposed image and at least one abnormally exposed image; the exposure parameters corresponding to the cameras for shooting the abnormal exposure images are obtained by calculation according to the image shot by the first camera in the previous frame; the first camera is a camera for shooting the normal exposure image;
aligning the at least one abnormal exposure image with the normal exposure image as a reference;
respectively fusing a brightness channel and a chromaticity channel of the aligned images to generate an HDR image;
the exposure parameters corresponding to the cameras for shooting the abnormal exposure images are obtained by calculating according to the image shot by the first camera in the previous frame, and specifically include:
the normal exposure image shot by the first camera in the previous frame is a first image;
downsampling the first image, and generating an abnormal exposure binarization image according to pixel values of brightness channels of the downsampled first image at all position points;
according to the abnormal exposure binarized image, respectively calculating first exposure parameters corresponding to other cameras except the first camera in the next frame, and taking the first exposure parameters as exposure parameters corresponding to the camera for shooting the abnormal exposure image;
and according to the proportion of the number of abnormal exposure pixels to the total number of pixels in the abnormal exposure binarized image, and in combination with a preset proportion interval, adjusting the first exposure parameters corresponding to the other cameras in the next frame, so that, with the first camera as the center, the exposure parameters of the other cameras as a whole increase or decrease in a stepwise manner.
2. The HDR image generation method based on multi-shot image fusion of claim 1, wherein, when the sensor models of all the cameras are identical, the exposure parameter is the exposure time.
3. The HDR image generation method based on multi-shot image fusion according to claim 1 or 2, wherein the aligning the at least one abnormal exposure image with respect to the normal exposure image is specifically:
and aligning the at least one abnormal exposure image by taking the normal exposure image as a reference according to a threshold bitmap alignment, feature point alignment or block matching alignment method.
4. The HDR image generation method based on multi-shot image fusion of claim 3, wherein the fusing of the luminance channel and the chrominance channel is performed on the aligned images respectively to generate the HDR image, specifically:
performing brightness channel fusion on the aligned images by adopting a multi-exposure fusion method of Laplacian pyramid decomposition;
according to the corresponding chromaticity value when the color difference absolute value of the plurality of images at the same position is maximum, performing chromaticity channel fusion on the aligned images;
and generating the HDR image according to the image obtained by fusing the brightness channel and the chromaticity channel.
5. The HDR image generation method based on multi-shot image fusion of claim 4, wherein the multi-exposure fusion method using laplacian pyramid decomposition performs luminance channel fusion on the aligned images, specifically:
calculating a weight map of each image to be fused;
performing L-level Laplacian pyramid decomposition on the luminance channel of each image to be fused, and performing L-level Gaussian pyramid decomposition on each weight map; wherein L is a positive integer;
and calculating the Laplace component of each layer after fusion, and reconstructing to obtain the image after brightness fusion.
6. An HDR video generation method based on multi-shot image fusion, wherein a plurality of HDR images are generated according to the HDR image generation method of any one of claims 1 to 5, and an HDR video is generated according to the generated HDR images.
7. A terminal device comprising a processor, a memory and a computer program stored in the memory and configured to be executed by the processor, the processor implementing the HDR image generation method based on multi-shot image fusion as claimed in any one of claims 1-5 when executing the computer program.
8. A computer readable storage medium, characterized in that the computer readable storage medium comprises a stored computer program, wherein the computer program when run controls a device in which the computer readable storage medium is located to perform the HDR image generation method based on multi-shot image fusion as claimed in any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010617015.8A CN111986129B (en) | 2020-06-30 | 2020-06-30 | HDR image generation method, equipment and storage medium based on multi-shot image fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111986129A CN111986129A (en) | 2020-11-24 |
CN111986129B true CN111986129B (en) | 2024-03-19 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105163047A (en) * | 2015-09-15 | 2015-12-16 | 厦门美图之家科技有限公司 | HDR (High Dynamic Range) image generation method and system based on color space conversion and shooting terminal |
CN105578042A (en) * | 2015-12-18 | 2016-05-11 | 深圳市金立通信设备有限公司 | Image data transmission method and terminal |
CN107395998A (en) * | 2017-08-24 | 2017-11-24 | 维沃移动通信有限公司 | A kind of image capturing method and mobile terminal |
US9955084B1 (en) * | 2013-05-23 | 2018-04-24 | Oliver Markus Haynold | HDR video camera |
CN108063902A (en) * | 2018-01-08 | 2018-05-22 | 信利光电股份有限公司 | HDR image pickup methods, filming apparatus and the mobile terminal and readable storage medium storing program for executing of multi-cam |
WO2018137267A1 (en) * | 2017-01-25 | 2018-08-02 | 华为技术有限公司 | Image processing method and terminal apparatus |
WO2020038069A1 (en) * | 2018-08-22 | 2020-02-27 | Oppo广东移动通信有限公司 | Exposure control method and device, and electronic apparatus |
CN110930440A (en) * | 2019-12-09 | 2020-03-27 | Oppo广东移动通信有限公司 | Image alignment method and device, storage medium and electronic equipment |
CN111147755A (en) * | 2020-01-02 | 2020-05-12 | 普联技术有限公司 | Zoom processing method and device for double cameras and terminal equipment |
Non-Patent Citations (6)
Title |
---|
High dynamic range imaging using camera arrays; Kalpana Seshadrinathan et al.; 2017 IEEE International Conference on Image Processing; 725-729 *
Multi-camera platform for panoramic real-time HDR video construction and rendering; Vladan Popovic et al.; Journal of Real-Time Image Processing; 697-708 *
Multi-exposure image fusion for dynamic scenes without ghost effect; Ashish V. Vanmali et al.; 2015 Twenty First National Conference on Communications; 1-6 *
A multi-exposure HDR image generation method with an improved pyramid; Zhou Yanqin et al.; Modern Computer, No. 15; 130-136 *
A synthesis method for high dynamic range images based on YUV space; Xu Guizhong et al.; Journal of Communication University of China (Science and Technology), Vol. 24, No. 3; 11-13 *
Multi-exposure image fusion algorithm based on quality measurement and color correction; Du Yongsheng et al.; Journal of Electronic Measurement and Instrumentation, Vol. 33, No. 1; 90-98 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||