CN110868544B - Shooting method and electronic equipment - Google Patents


Info

Publication number
CN110868544B
CN110868544B
Authority
CN
China
Prior art keywords
image
images
mapping
fused
curve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911165774.9A
Other languages
Chinese (zh)
Other versions
CN110868544A (en)
Inventor
施嘉察
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN201911165774.9A
Publication of CN110868544A
Application granted
Publication of CN110868544B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/80: Camera processing pipelines; Components thereof

Abstract

The invention provides a shooting method and an electronic device. The method includes: acquiring at least three images with different exposure values, the at least three images including an image obtained by time-lapse photography; performing image fusion on the at least three images to obtain a fused image; performing a first curve mapping on the fused image to obtain a first mapping image; performing a second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, where I is a positive integer; and adjusting the gray values of the target mapping image to obtain a shot image. By performing curve mapping on the fused image of the at least three images and adjusting the gray values of the target mapping image, the method improves the time-lapse effect of the shot image.

Description

Shooting method and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a shooting method and an electronic device.
Background
Time-lapse photography, also called time-compressed photography, is a technique that compresses time. Because of this, slowly changing scenes appear to play back at high speed.
Time-lapse photography is usually implemented by video frame extraction. For example, to present a 30× speed-up, a video is recorded first, and a 30-to-1 frame-extraction operation is performed on the video during or after recording to obtain the time-lapse video.
With current time-lapse photography, the images in the resulting video are often of poor quality.
Disclosure of Invention
Embodiments of the invention provide a shooting method and an electronic device, aiming to solve the problem that images in videos obtained by existing time-lapse photography are of poor quality.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a shooting method applied to an electronic device, including:
acquiring at least three images with different exposure values, wherein the at least three images comprise images obtained by time-lapse photography;
performing image fusion on the at least three images to obtain a fused image;
performing first curve mapping on the fused image to obtain a first mapping image;
performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer;
and adjusting the gray value of the target mapping image to obtain a shot image.
In a second aspect, an embodiment of the present invention further provides an electronic device, including:
a first acquisition module, configured to acquire at least three images with different exposure values, wherein the at least three images comprise images obtained by time-lapse photography;
the second acquisition module is used for carrying out image fusion on the at least three images to obtain a fused image;
the third acquisition module is used for carrying out first curve mapping on the fusion image to obtain a first mapping image;
a fourth obtaining module, configured to perform second curve mapping on I first local block regions of the first mapping image to obtain a target mapping image, where I is a positive integer;
and the fifth acquisition module is used for adjusting the gray value of the target mapping image to obtain a shot image.
In the embodiment of the invention, at least three images with different exposure values are acquired, wherein the at least three images comprise images obtained by time-lapse photography; performing image fusion on the at least three images to obtain a fused image; performing first curve mapping on the fused image to obtain a first mapping image; performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer; and adjusting the gray value of the target mapping image to obtain a shot image. Therefore, the time-delay shooting effect of the shot image can be improved by performing curve mapping on the fused image of at least three images and performing gray value adjustment on the target mapping image.
Drawings
Fig. 1 is a flowchart of a photographing method according to an embodiment of the present invention;
FIG. 2 is a second flowchart of a photographing method according to an embodiment of the present invention;
fig. 3 is a third flowchart of a photographing method according to an embodiment of the present invention;
FIG. 4 is a block diagram of an electronic device provided by an embodiment of the invention;
fig. 5 is a block diagram of an electronic device according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a shooting method according to an embodiment of the present invention, and as shown in fig. 1, the embodiment provides a shooting method applied to an electronic device, including the following steps:
step 101, acquiring at least three images with different exposure values, wherein the at least three images comprise images obtained by time-lapse photography.
The at least three images include a first image, a second image, and a third image, with the first image having the largest exposure value. For example, the exposure value (EV) of the first image is 0 EV, that of the second image is -3 EV, and that of the third image is -6 EV. Here 0 EV is the reference EV and changes smoothly with auto exposure (AE); the -3 EV and -6 EV images can be used to compensate for overexposure in the 0 EV image.
The electronic device acquires the first image more frequently than the second image (or the third image), for example, the electronic device acquires 30 frames of the first image in 1 second and 1 frame of the second image (or the third image) in 1 second. The first image is captured at normal video magnification and the second image (or third image) is captured at delayed photographic magnification. The second image is an image obtained by time-lapse photography, that is, an image acquired by time-lapse photography magnification.
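The acquisition schedule above — the reference frame at the normal video rate, the bracketed frames at the time-lapse rate — can be sketched as follows. This is a minimal illustration using the example rates from the text (30 fps for 0 EV, 1 fps for the brackets); the helper name and structure are hypothetical, not from the patent.

```python
def capture_schedule(seconds, base_fps=30, bracket_fps=1):
    """Return (frame_index, exposures) pairs: every frame grabs a 0 EV image;
    once per second the -3 EV / -6 EV brackets are grabbed as well."""
    schedule = []
    step = base_fps // bracket_fps  # bracketed capture every `step` frames
    for i in range(seconds * base_fps):
        exposures = [0]             # reference frame at normal video magnification
        if i % step == 0:
            exposures += [-3, -6]   # brackets at time-lapse magnification
        schedule.append((i, exposures))
    return schedule

plan = capture_schedule(1)
# 30 frames total; only one of them carries the extra -3 EV / -6 EV brackets
```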
And 102, carrying out image fusion on the at least three images to obtain a fused image.
In the image fusion, a weight may be set for the gray level value of each of the at least three images. For example, the weight set for the gray value of the first image is a constant maximum distribution on the left half, and a gaussian distribution on the right half; the weight set for the gray value of the second image is a complete gaussian distribution, the weight set for the gray value of the third image is a gaussian distribution on the left half and a constant maximum distribution on the right half. And performing single-scale image fusion according to the at least three images and the weight to obtain a fused image.
To ensure the accuracy of image fusion, a high bitmap can be used to record the gray values of the fused image, while each of the at least three images is a low bitmap. High bitmap and low bitmap are relative concepts: the number of bits used to store one pixel gray value in the high bitmap is larger than in the low bitmap. For example, 16 bits are used to store one pixel gray value in the high bitmap and 8 bits in the low bitmap.
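The weighted single-scale fusion into a 16-bit high bitmap can be sketched as below. The shape of each weight curve (constant maximum on one half, Gaussian on the other, or a full Gaussian) follows the text, but the Gaussian parameters, the half-range split at gray value 127, and the 8-bit-to-16-bit scaling are assumptions for illustration only.

```python
import numpy as np

def gaussian(x, mu=127.5, sigma=50.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

x = np.arange(256, dtype=np.float64)
w0 = np.where(x <= 127, 1.0, gaussian(x))   # 0 EV: constant max left, Gaussian right
w3 = gaussian(x)                            # -3 EV: full Gaussian
w6 = np.where(x <= 127, gaussian(x), 1.0)   # -6 EV: Gaussian left, constant max right

def fuse(img0, img3, img6):
    """Fuse three 8-bit exposures into one 16-bit 'high bitmap'."""
    ws = [w0[img0], w3[img3], w6[img6]]      # per-pixel weight lookup by gray value
    total = ws[0] + ws[1] + ws[2] + 1e-12    # avoid division by zero
    imgs = [im.astype(np.float64) for im in (img0, img3, img6)]
    fused = (ws[0] * imgs[0] + ws[1] * imgs[1] + ws[2] * imgs[2]) / total
    # scale the 8-bit range onto 16 bits to keep fusion precision
    return np.clip(np.rint(fused * 257.0), 0, 65535).astype(np.uint16)
```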
And 103, performing first curve mapping on the fused image to obtain a first mapped image.
The first curve mapping is a tone mapping, which includes global sigmoid (S-shaped) curve mapping, inverse sigmoid curve mapping, and so on. The curves included in the tone mapping can be displayed on the electronic device so that the user can select the specific mapping curve used, allowing the user to customize stylized time-lapse images.
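A global S-shaped tone mapping from the 16-bit fused image down to 8 bits could look like the sketch below. The sigmoid gain `k`, the midpoint, and the rescaling to hit the full output range are assumptions; the patent only specifies that the curve is a global sigmoid.

```python
import numpy as np

def sigmoid_tone_map(img16, k=8.0):
    """Map a 16-bit image to 8 bits through a normalized S-shaped curve."""
    x = img16.astype(np.float64) / 65535.0      # normalize to [0, 1]
    mid = 0.5
    s = 1.0 / (1.0 + np.exp(-k * (x - mid)))    # raw sigmoid
    lo = 1.0 / (1.0 + np.exp(k * mid))          # sigmoid value at x = 0
    hi = 1.0 / (1.0 + np.exp(-k * (1 - mid)))   # sigmoid value at x = 1
    y = (s - lo) / (hi - lo)                    # rescale so 0 -> 0 and 1 -> 1
    return np.rint(y * 255).astype(np.uint8)
```

An inverse S-curve (for the stylized effects described later) would simply invert the sign of `k`.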
And 104, performing second curve mapping on the I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer.
In this step, the first mapping image is divided into a plurality of first local block areas; the division may be chosen according to the actual situation and is not limited here. For example, the first mapping image is divided into 9 blocks or 16 blocks, and the first local block areas may be of equal or unequal size. A second curve mapping is then performed on I of the first local block areas. Different first local block areas may use the same second curve mapping or different second curve mappings.
Steps 103-104 can implement a mapping from the high bitmap to the low bitmap; that is, the number of bits used to store one pixel gray value in the target mapping image is smaller than in the fused image.
And 105, adjusting the gray value of the target mapping image to obtain a shot image.
After the target mapping image is obtained, the gray value of the target mapping image is adjusted to improve the display effect of the shot image.
In the embodiments of the invention, the electronic device may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
According to the shooting method provided by the embodiment of the invention, at least three images with different exposure values are obtained, wherein the at least three images comprise images obtained by time-delay shooting; performing image fusion on the at least three images to obtain a fused image; performing first curve mapping on the fused image to obtain a first mapping image; performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer; and adjusting the gray value of the target mapping image to obtain a shot image. Therefore, the time-delay shooting effect of the shot image can be improved by performing curve mapping on the fused image of at least three images and performing gray value adjustment on the target mapping image.
Referring to fig. 2, fig. 2 is a second flowchart of a shooting method according to an embodiment of the present invention, and as shown in fig. 2, the embodiment provides a shooting method applied to an electronic device, including the following steps:
step 201, at least two images with different exposure values are acquired, wherein the at least two images comprise images obtained by time-lapse photography.
Step 202, performing image fusion on the at least two images to obtain a fused image.
And 203, performing first curve mapping on the fused image to obtain a first mapped image.
Steps 201 to 203 are consistent with the records of steps 101 to 103, and are not described herein. And 204, performing second curve mapping on the I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer.
In this step, the first mapping image is divided into a plurality of first local block areas; the division may be chosen according to the actual situation and is not limited here. For example, the first mapping image is divided into 9 blocks or 16 blocks, and the first local block areas may be of equal or unequal size. A second curve mapping is then performed on I of the first local block areas. Different first local block areas may use the same second curve mapping or different second curve mappings.
Specifically, step 204, performing second curve mapping on the I first local block regions of the first mapping image to obtain a target mapping image, may include:
acquiring gray level histograms of the I first local block regions of the first mapping image;
and performing second curve mapping on the I first local block areas according to the distribution characteristics of the I gray level histograms to obtain a target mapping image.
Specifically, for each first local block region, gray value statistics is performed on the first local block region to obtain a gray histogram, and corresponding shape curve mapping is performed according to the distribution characteristics of the gray histogram, that is, the second curve mapping is shape curve mapping. For example, in the gray histogram, the bright area pile-up is mapped by using a downward convex curve, and the dark area pile-up is mapped by using an upward convex curve. And performing second curve mapping on the first local block area corresponding to each gray level histogram according to the distribution characteristics of the gray level histograms to obtain a target mapping image. The target mapping image is an image obtained by performing second curve mapping on the I first local block areas in the first mapping image.
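The histogram-driven shape curve above can be sketched with simple gamma curves: a gamma below 1 bows upward (lifting a dark pile-up), and a gamma above 1 bows downward (taming a bright pile-up). The pile-up thresholds (more than half the pixels in the lowest or highest quarter of the gray range) and the specific gamma values are assumptions for illustration.

```python
import numpy as np

def block_gamma(block):
    """Choose a shape-curve exponent from where the block's histogram mass sits."""
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    total = hist.sum()
    if hist[:64].sum() > 0.5 * total:
        return 0.7      # dark pile-up: upward-convex curve (lift shadows)
    if hist[192:].sum() > 0.5 * total:
        return 1.4      # bright pile-up: downward-convex curve (tame highlights)
    return 1.0          # balanced block: leave unchanged

def map_block(block, gamma):
    """Apply the chosen shape curve to one local block."""
    x = block.astype(np.float64) / 255.0
    return np.rint((x ** gamma) * 255).astype(np.uint8)
```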
The first curve mapping of the fused image can guarantee the contrast of a middle brightness area of the image, and the second curve mapping of the first local block area can guarantee the detail presentation of a bright area and a dark area of the image.
Step 205, performing statistics on the variance of the gray values of a local block region set, where the local block region set includes a first region among a plurality of second local block regions of the target mapping image and the regions adjacent to that first region.
The second partial block area may be divided in the same manner as or different from the first partial block area, and is not limited herein. Preferably, the second partial block region is divided in the same manner as the first partial block region.
The target mapping image is divided into a plurality of second local block areas, and gray value statistics are performed on each second local block area together with its adjacent areas to obtain the variance of the gray values. Preferably, the first region has 8 adjacent regions.
And step 206, if the variance is smaller than a preset threshold, adjusting the gray value of the first area in the target mapping image to obtain a shot image.
If the variance is smaller than a preset threshold value, adjusting the gray value of the first area in the target mapping image, namely adjusting the transformation degree of the second curve mapping of the first area to enable the mapped curve to be closer to linearity. If the variance is not less than the preset threshold, the gray value of the first area is not adjusted.
By adjusting the first region in this way, areas of the shot image are differentiated: where the variance is small (flat areas), the mapping curve is nearly linear; where the variance is large (textured areas), the mapping curve follows the histogram distribution characteristics.
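One way to realize this variance gating is to blend each block's curve toward the identity mapping when the variance falls below the threshold, as sketched below. The linear blending rule and the threshold value are assumptions; the patent only requires that low-variance blocks end up closer to a linear mapping.

```python
import numpy as np

def smooth_block_mapping(block, curve, var_threshold=100.0):
    """Blend the second curve mapping toward linear on flat blocks.
    `curve` is a 256-entry lookup table from the second curve mapping."""
    x = block.astype(np.float64)
    mapped = curve[block].astype(np.float64)
    # alpha -> 0 on flat blocks (near-linear), -> 1 on textured blocks
    alpha = min(x.var() / var_threshold, 1.0)
    out = (1.0 - alpha) * x + alpha * mapped
    return np.rint(out).astype(np.uint8)
```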
Steps 205-206 are a specific implementation of step 105. Because the first mapping image undergoes operations that are inconsistent across local blocks (that is, the first local block areas are processed separately), flat areas in the target mapping image are likely to become uneven. Steps 205-206 make flat areas uniform and improve the quality of the shot image.
If the first curve mapping adopts an inverse S-shaped curve and the second curve mapping is adjusted to approximate a straight line, the mapping from the high bitmap to the low bitmap becomes a simple scale-reduction from a large range to a small range; the brightness within the same area or plane of the image (video) is compressed and light-and-shadow changes are lost, producing a cartoon-like time-lapse effect. Alternatively, if the histogram of each local block is mapped to a uniform distribution, piled-up picture brightness is split and distributed across all brightness tones, producing an exaggerated time-lapse effect.
When tone mapping (i.e., the first curve mapping) is performed, it is usually necessary to raise the brightness of shadow regions and lower the brightness of highlight regions so that shadow and highlight details are clearly visible. In video, however, if the degree of brightening and dimming is not smoothly controlled, flicker is likely. Optionally, to solve this problem, the present application provides the following two ways.
In a first mode, after the image fusion is performed on the at least three images to obtain a fused image, and before the curve mapping is performed on the fused image to obtain a target mapping image, the method further includes:
if N continuous fusion images are obtained, N pieces of first brightness information of the N fusion images are obtained, wherein N is a positive integer larger than 1;
filtering the N pieces of first brightness information as a whole to obtain first filtered brightness information, wherein the first filtered brightness information comprises second brightness information of the N fused images;
and respectively carrying out brightness adjustment on the N fused images according to the second brightness information of the N fused images.
Specifically, the first luminance information may be luminance information over the full range of gray values of the fused image, or luminance information of a designated gray-value interval of the fused image, for example, the luminance information of pixels with gray values in the range 0-25 (a dark region) or 230-255 (a bright region). The luminance information may be an average gray value, a median, or the like.
For each fused image, there is corresponding first luminance information. The first luminance information may include a plurality of information, for example, the first luminance information includes both a gray scale average value of a dark region and a gray scale average value of a bright region, the gray scale value of the bright region being greater than the gray scale value of the dark region.
And filtering the first brightness information of the N fused images, and filtering out high-frequency brightness information which changes rapidly to obtain first filtering brightness information, wherein the first filtering brightness information comprises second brightness information of the N fused images. If the first luminance information includes a plurality of pieces of information, when the first luminance information of the N fused images is filtered, each piece of information of the N fused images is filtered. For example, if the first luminance information includes the gray-scale mean value of the dark area and the gray-scale mean value of the bright area, the gray-scale mean values of the dark area of the N fused images are filtered, the gray-scale mean values of the bright area of the N fused images are filtered, and the obtained second luminance information includes new gray-scale mean values of the dark area and the bright area.
After the second brightness information of the N fused images is obtained, brightness adjustment is carried out on the fused images according to the second brightness information, so that the realization of subsequent first curve mapping (tone mapping) operation is smooth, and delayed photographic images (namely, shot images) cannot flicker when being played.
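The first way — low-pass filtering one luminance statistic per fused frame — can be sketched as below. A moving-average filter stands in for the "low-pass digital filter"; the window size, the edge padding, and the dark/bright gray-value intervals are assumptions (the intervals follow the 0-25 / 230-255 example in the text).

```python
import numpy as np

def smooth_luminance(series, window=5):
    """Moving-average low-pass over one per-frame luminance statistic,
    removing fast (high-frequency) changes while keeping the slow trend."""
    s = np.asarray(series, dtype=np.float64)
    pad = window // 2
    padded = np.pad(s, pad, mode='edge')         # repeat edges to keep length
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode='valid')

def frame_stats(fused):
    """First luminance information: mean gray of the dark and bright regions."""
    dark = fused[fused <= 25]
    bright = fused[fused >= 230]
    return (dark.mean() if dark.size else 0.0,
            bright.mean() if bright.size else 255.0)
```

The smoothed values (the second luminance information) would then drive the brightness adjustment of each fused frame before tone mapping.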
In a second way, a first image of the at least three images has the maximum exposure value among the at least three images.
before the image fusion is performed on the at least three images to obtain a fused image, the method further includes:
acquiring an image set, wherein the image set comprises M continuous initial images with the same exposure value, the first image is one of the M initial images, and M is a positive integer greater than 1;
acquiring M pieces of third brightness information of M initial images;
filtering the M pieces of third brightness information as a whole to obtain second filtered brightness information, wherein the second filtered brightness information comprises fourth brightness information of the M initial images;
and respectively carrying out brightness adjustment on the M initial images according to the fourth brightness information of the M initial images.
Specifically, the first image may be a 0EV image, and the third luminance information may be luminance information of a gradation value of the initial image. The third luminance information may be an average value of gray values, or an intermediate value, etc.
For each initial image, there is corresponding third luminance information, and the third luminance information is preferably an average value of gray values.
And filtering the third brightness information of the M initial images, and filtering out high-frequency brightness information which changes rapidly to obtain second filtered brightness information, wherein the second filtered brightness information comprises fourth brightness information of the M initial images.
And after fourth brightness information of the M initial images is obtained, brightness adjustment is carried out on the initial images according to the fourth brightness information. For example, comparing the third luminance information and the fourth luminance information of the initial image, and if the value of the third luminance information is greater than that of the fourth luminance information, not processing the initial image; and if the value of the third brightness information is not greater than the value of the fourth brightness information, adjusting the initial image according to the fourth brightness information to compensate the reference brightness of the initial image. In this way, during the subsequent first curve mapping (tone mapping), the brightness of the picture is only smoothly changed along with the time, and the sudden brightness flicker is avoided.
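The comparison-and-compensation rule just described can be sketched as below. The additive offset used to lift a frame up to its filtered mean is an assumption; the patent only says the initial image is adjusted according to the fourth luminance information when its original mean does not exceed the filtered value.

```python
import numpy as np

def compensate_reference(frames, filtered_means):
    """Lift each 0 EV frame whose mean gray fell below the low-pass-filtered
    (fourth) luminance; leave brighter-than-baseline frames untouched."""
    out = []
    for frame, target in zip(frames, filtered_means):
        mean = frame.astype(np.float64).mean()   # third luminance information
        if mean > target:
            out.append(frame)                    # above baseline: no processing
        else:
            lifted = frame.astype(np.float64) + (target - mean)
            out.append(np.clip(lifted, 0, 255).astype(np.uint8))
    return out
```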
The shooting method provided by the present application is described in detail below using an example in which exactly three images are acquired. Fig. 3 is a schematic flowchart of the shooting method; as shown in Fig. 3, high-dynamic-range time-lapse shooting includes three parts, described below.
Part 1: acquiring the shot image.
First, three images are acquired according to the exposure strategy: one at each of the fixed exposures 0 EV, -3 EV, and -6 EV, where 0 EV is the reference EV and changes smoothly with AE, and the -3 EV and -6 EV images are used to compensate for overexposure in the 0 EV frame. The 0 EV images are captured at the normal video magnification, while the -3 EV and -6 EV images are captured at the time-lapse magnification.
Second, a high bitmap is synthesized to obtain the fused image. The weight distribution of the 0 EV image (third image) is a constant maximum on the left half and a Gaussian on the right half; the weight distribution of the -3 EV image (fourth image) is a complete Gaussian; and the weight distribution of the -6 EV image (fifth image) is a Gaussian on the left half and a constant maximum on the right half.
Single-scale image fusion is performed according to the three images and their weights, and the high bitmap is recorded with 16 bits to ensure numerical precision.
Third, the first curve mapping (tone mapping) is performed. A global S-shaped curve is mapped on the high bitmap; the mapped image (i.e., the first mapping image) is divided into local blocks (i.e., first local block areas); histogram statistics are computed for each block; and a shape curve is mapped according to the histogram's distribution characteristics, for example: bright-area pile-up uses a downward-convex curve, and dark-area pile-up uses an upward-convex curve. The global curve guarantees contrast in the middle-brightness area of the image, while the local curves guarantee detail in bright and dark areas according to local characteristics.
Because operations inconsistent across local blocks are performed on the image (i.e., the local blocks are processed separately), flat areas of the image (i.e., the target mapping image) tend to become uneven. This can be resolved as follows.
Smooth transition between local image blocks: for each local block (i.e., the first region among the second local block areas), the histograms of the eight adjacent local blocks are counted at the same time, which guarantees spatial continuity of the statistics and continuous interpolated transitions between blocks when the brightness is mapped.
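The continuous interpolation between blocks can be sketched, for the one-dimensional case, as blending the lookup tables of the two nearest block centers by distance — the same idea that adaptive histogram equalization uses bilinearly over four blocks. The 1-D simplification and the linear blend are assumptions for brevity.

```python
import numpy as np

def interp_map(row, curves, block_w):
    """Map a 1-D row of gray values, blending each pixel between the lookup
    tables (`curves`, one 256-entry LUT per block) of the two nearest block
    centers so that block boundaries transition continuously."""
    out = np.empty(len(row), dtype=np.float64)
    centers = [block_w * (b + 0.5) for b in range(len(curves))]
    for i, v in enumerate(row):
        b = min(i // block_w, len(curves) - 1)          # block the pixel lies in
        nb = b + 1 if i >= centers[b] else b - 1        # neighbor it leans toward
        nb = min(max(nb, 0), len(curves) - 1)           # clamp at the image edge
        t = abs(i - centers[b]) / block_w if nb != b else 0.0
        out[i] = (1 - t) * curves[b][v] + t * curves[nb][v]
    return np.rint(out).astype(np.uint8)
```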
Smooth mapping within local image blocks: for each local block, the variance is computed and used as a variable in a linear function that corrects the curve mapping, so that: where the variance is small (flat areas), the mapping curve is nearly linear; where the variance is large (textured areas), the mapping curve follows the histogram distribution characteristics.
In addition, by providing an interactive option on the interface of the electronic device, the user can select the tone-mapping curve and thereby customize stylized time-lapse works.
For example, if the global S-shaped curve is changed to an inverse S-shaped curve and the local mapping curves are adjusted to be approximately linear, the mapping from the high bitmap to the low bitmap becomes a simple scale-reduction from a large range to a small range; the brightness within the same area or plane of the image (video) is compressed and light-and-shadow changes are lost, producing a cartoon-like time-lapse effect. If the histogram of each local block is mapped to a uniform distribution, piled-up picture brightness is split and distributed across all brightness tones, producing an exaggerated time-lapse effect.
A second part: time domain filtering
Fourth, the images are buffered. The 0 EV images keep the capture rate of normal video magnification, while the -3 EV and -6 EV images are captured at the time-lapse magnification. A queue is established, and the 0 EV images are pushed into it in order and buffered. The composite images (i.e., the fused images) obtained in the second step are buffered in another queue. In Fig. 3, label A is the first queue, which stores the third image and the sixth image (the third image participates in image fusion, and both are acquired at 0 EV); label B is the second queue, which stores the fused images.
Fifth, time-domain filtering is performed on the luminance information. Luminance statistics are computed on the buffered images (including the 0 EV images and the fused images), and a low-pass digital filter removes high-frequency luminance changes while keeping the low-frequency components.
When tone mapping is performed, the brightness of shadow regions generally needs to be raised and the brightness of highlight regions lowered so that shadow and highlight details are clearly visible. In video, however, if the degree of brightening and dimming is not smoothly controlled, flicker is likely.
In this step, flicker can be resolved as follows. Dark-region (e.g., gray values 0-25) and bright-region (e.g., gray values 230-255) statistics (e.g., mean luminance, median luminance) are computed on the fused images of consecutive frames; a digital filter removes rapidly changing high-frequency luminance while keeping the smoothly changing luminance; and the resulting low-frequency luminance information is used in each frame's tone-mapping operation (see the description of the first way above), guaranteeing that tone mapping is implemented as a smooth operation. Thus the bright and dark areas of the high-dynamic-range time-lapse picture do not flicker.
In this step, the flicker problem can also be solved in the following manner. The luminance information of consecutive 0EV frames is filtered and its low-frequency components are retained. After the fused image is obtained and before tone mapping is performed on it, the filtered low-frequency component is compared with the original 0EV luminance, and the reference brightness of the high-bit-depth image is compensated accordingly (see the description of the second mode above). During tone mapping, the compensated reference brightness is taken as the center, so that the luminance remains consistent before and after tone mapping. In this way, the picture brightness changes only smoothly over time, and sudden brightness flicker is avoided.
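A sketch of this second approach: the raw 0EV mean luminance is replaced by its low-frequency version, and the reference brightness that centers tone mapping is scaled accordingly. The ratio-based compensation formula and the filter constant are assumptions; the patent states only that the reference brightness is compensated.

```python
def low_pass(values, alpha=0.1):
    """First-order IIR low-pass filter: retains slow drift, suppresses jumps."""
    out, state = [], values[0]
    for v in values:
        state = alpha * v + (1 - alpha) * state
        out.append(state)
    return out

def compensated_reference(raw_means, hdr_reference):
    """Per-frame reference brightness for tone mapping: the raw 0EV mean is
    replaced by its low-frequency version, and the high-bit-depth reference is
    scaled by the smoothed/raw ratio (an assumed form of the compensation)."""
    smoothed = low_pass(raw_means)
    return [hdr_reference * s / max(r, 1e-6) for r, s in zip(raw_means, smoothed)]

# A step change in the raw 0EV luminance becomes a gradual change in the
# reference brightness, so tone mapping stays centred on a smooth value.
raw = [100.0] * 10 + [140.0] * 10
refs = compensated_reference(raw, hdr_reference=512.0)
```

With this gain, an abrupt exposure change in the scene is absorbed gradually over several frames instead of appearing as a one-frame brightness jump.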
The third part: electronic image stabilization (EIS).
Sixth, align the images. Inter-frame alignment is performed sequentially on each pair of consecutive images buffered in the fourth step, and the center point position is recorded after alignment.
Seventh, perform temporal filtering on the image positions. The center point positions of the aligned images are passed through a low-pass digital filter, which removes high-frequency positional changes and retains the low-frequency components; the image positions are then remapped according to the low-frequency components, completing electronic image stabilization. Finally, the stabilized image sequence is output to a screen for display, or fed to video encoding hardware to obtain a video.
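Steps six and seven can be sketched with a moving-average low-pass filter over the recorded center points; the correction shift for each frame is the difference between the smoothed path and the measured one. The window size is an assumed value, and the zero-padded borders of `np.convolve` are left unhandled for brevity.

```python
import numpy as np

def stabilize_positions(centers, window=5):
    """Low-pass the per-frame center points with a moving average and return
    the correction shift to apply when remapping each frame.  A real
    implementation would handle the zero-padded borders of np.convolve."""
    centers = np.asarray(centers, dtype=float)
    kernel = np.ones(window) / window
    smoothed = np.stack([np.convolve(centers[:, 0], kernel, mode="same"),
                         np.convolve(centers[:, 1], kernel, mode="same")], axis=1)
    return smoothed - centers  # shifting each frame by this follows the smooth path

# Jittery center points around (100, 100); applying the shifts cancels most
# of the high-frequency motion while keeping the slow camera path.
rng = np.random.default_rng(1)
centers = np.array([100.0, 100.0]) + rng.normal(0.0, 3.0, size=(50, 2))
shifts = stabilize_positions(centers)
stabilized = centers + shifts
```

The same structure (statistic, low-pass, reapply) is thus used twice in the method: once on luminance to remove brightness flicker, and once on position to remove frame jitter.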
This shooting method improves ordinary time-lapse photography in scenes with a large lighting ratio, solves the brightness flicker that time-lapse photography suffers from, stabilizes picture jitter when shooting a time-lapse in motion, and lets the user select the tone mapping curve, so that the electronic device can capture time-lapse works in a variety of styles.
Referring to fig. 4, fig. 4 is a block diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 4, the electronic device 400 includes:
a first obtaining module 401, configured to obtain at least three images with different exposure values, where the at least three images include an image obtained by time-lapse photography;
a second obtaining module 402, configured to perform image fusion on the at least three images to obtain a fused image;
a third obtaining module 403, configured to perform first curve mapping on the fused image to obtain a first mapped image;
a fourth obtaining module 404, configured to perform second curve mapping on I first local block regions of the first mapping image to obtain a target mapping image, where I is a positive integer;
a fifth obtaining module 405, configured to adjust the gray scale value of the target mapping image to obtain a captured image.
Further, the fourth obtaining module 404 includes:
a first obtaining sub-module, configured to obtain I grayscale histograms of the I first local block regions of the first mapping image;
and the second acquisition submodule is used for carrying out second curve mapping on the I first local block areas according to the distribution characteristics of the I gray level histograms to obtain a target mapping image.
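As one concrete instance of a histogram-driven per-block curve, the sketch below applies per-block histogram equalization. The patent requires only that the second mapping curve follow the distribution characteristics of each block's gray histogram, so the choice of equalization and the block size are assumptions.

```python
import numpy as np

def equalize_block(block):
    """Map one block through a curve derived from its own gray histogram
    (histogram equalization is used here as the concrete curve)."""
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    cdf = hist.cumsum() / block.size           # normalised cumulative histogram
    curve = np.round(255 * cdf).astype(np.uint8)
    return curve[block]

def map_blocks(image, block=32):
    """Apply a histogram-driven curve independently to each local block region."""
    out = image.copy()
    h, w = image.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y + block, x:x + block] = equalize_block(image[y:y + block, x:x + block])
    return out

rng = np.random.default_rng(2)
img = rng.integers(80, 120, (64, 64), dtype=np.uint8)  # low-contrast input
mapped = map_blocks(img)                                # per-block contrast spread
```

Mapping each block by its own histogram stretches local contrast, which is why the subsequent variance-gated gray adjustment is needed to smooth over-processed flat blocks.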
Further, the fifth obtaining module 405 includes:
a statistics submodule for counting variances of gray values of a set of local block regions, the set of local block regions including a second region of a plurality of second local block regions of the target mapped image and a region adjacent to the second region;
and the third obtaining submodule is used for adjusting the gray value of the second area in the target mapping image to obtain a shot image if the variance is smaller than a preset threshold value.
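The variance gate of this fifth module can be sketched as below. Whether the adjustment blends toward the joint mean, and the threshold value, are assumptions; the patent states only that the gray value of the second region is adjusted when the variance falls below a preset threshold.

```python
import numpy as np

def adjust_flat_block(region, neighbors, threshold=4.0):
    """Compute the variance of the gray values of a block together with its
    adjacent blocks; when the set is flat (variance below the preset
    threshold), pull the block toward the joint mean to suppress the
    block-boundary artifacts left by per-block curve mapping."""
    stacked = np.concatenate([region.ravel()] + [n.ravel() for n in neighbors])
    if stacked.var() < threshold:
        return 0.5 * region + 0.5 * stacked.mean()   # assumed blend adjustment
    return region.astype(float)                      # textured block: unchanged

flat = np.full((8, 8), 100.0)
flat_neighbors = [np.full((8, 8), 102.0), np.full((8, 8), 102.0)]
textured = np.arange(64, dtype=float).reshape(8, 8) * 4.0

adjusted = adjust_flat_block(flat, flat_neighbors)       # variance below threshold
untouched = adjust_flat_block(textured, flat_neighbors)  # variance above threshold
```

Gating on variance means textured regions, where per-block mapping is beneficial, are left intact, while flat regions that would show visible block seams are blended toward their neighborhood.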
Further, the electronic device 400 includes:
a sixth obtaining module, configured to obtain N pieces of first luminance information of N pieces of the fused images if N consecutive pieces of the fused images are obtained, where N is a positive integer greater than 1;
a seventh obtaining module, configured to filter the N pieces of first luminance information as a whole to obtain first filtered luminance information, where the first filtered luminance information includes second luminance information of the N pieces of fused images;
and the first adjusting module is used for respectively adjusting the brightness of the N fused images according to the second brightness information of the N fused images.
Further, a first image of the at least three images has a maximum exposure value of the at least three images;
the electronic device 400 further comprises:
an eighth obtaining module, configured to obtain an image set, where the image set includes M consecutive initial images with the same exposure value, the first image is one of the M initial images, and M is a positive integer greater than 1;
a ninth obtaining module, configured to obtain M third luminance information of the M initial images;
a tenth obtaining module, configured to filter the M pieces of third luminance information as a whole to obtain second filtered luminance information, where the second filtered luminance information includes fourth luminance information of the M pieces of initial images;
and the second adjusting module is used for respectively adjusting the brightness of the M initial images according to the fourth brightness information of the M initial images.
The electronic device 400 can implement each process implemented by the electronic device in the method embodiments of fig. 1 to fig. 2, and details are not repeated here to avoid repetition.
The electronic device 400 of the embodiment of the present invention obtains at least three images with different exposure values, where the at least three images include an image obtained by time-lapse photography; performing image fusion on the at least three images to obtain a fused image; performing first curve mapping on the fused image to obtain a first mapping image; performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer; and adjusting the gray value of the target mapping image to obtain a shot image. Therefore, the time-delay shooting effect of the shot image can be improved by performing curve mapping on the fused image of at least three images and performing gray value adjustment on the target mapping image.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device for implementing various embodiments of the present invention, and as shown in fig. 5, the electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 5 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 510 is configured to acquire at least three images with different exposure values, where the at least three images include an image obtained by time-lapse photography;
performing image fusion on the at least three images to obtain a fused image;
performing first curve mapping on the fused image to obtain a first mapping image;
performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer;
and adjusting the gray value of the target mapping image to obtain a shot image.
The electronic device 500 can implement the processes implemented by the electronic device in the foregoing embodiments, and in order to avoid repetition, the detailed description is omitted here.
The electronic device 500 of the embodiment of the present invention obtains at least three images with different exposure values, where the at least three images include an image obtained by time-lapse photography; performing image fusion on the at least three images to obtain a fused image; performing first curve mapping on the fused image to obtain a first mapping image; performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer; and adjusting the gray value of the target mapping image to obtain a shot image. Therefore, the time-delay shooting effect of the shot image can be improved by performing curve mapping on the fused image of at least three images and performing gray value adjustment on the target mapping image.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 510; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 502, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the electronic apparatus 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input Unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042, and the graphics processor 5041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and may be capable of processing such sounds into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 501.
The electronic device 500 also includes at least one sensor 505, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 5061 and/or a backlight when the electronic device 500 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The Display unit 506 may include a Display panel 5061, and the Display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 5071 using a finger, stylus, or any suitable object or attachment). The touch panel 5071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 5, the touch panel 5071 and the display panel 5061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 508 is an interface for connecting an external device to the electronic apparatus 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 510 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the electronic device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The electronic device 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the electronic device 500 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 510, a memory 509, and a computer program that is stored in the memory 509 and can be run on the processor 510, and when the computer program is executed by the processor 510, the processes of the shooting method embodiment are implemented, and the same technical effect can be achieved, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A shooting method is applied to electronic equipment, and is characterized by comprising the following steps:
acquiring at least three images with different exposure values, wherein the at least three images comprise images obtained by time-lapse photography;
performing image fusion on the at least three images to obtain a fused image;
performing first curve mapping on the fused image to obtain a first mapping image;
performing second curve mapping on I first local block areas of the first mapping image to obtain a target mapping image, wherein I is a positive integer;
adjusting the gray value of the target mapping image to obtain a shot image; wherein the first curve mapping is tone mapping and the second curve mapping is S-shaped curve mapping.
2. The shooting method according to claim 1, wherein the second curve mapping I first partial block areas of the first mapping image to obtain a target mapping image comprises:
acquiring I gray level histograms of the I first local block regions of the first mapping image;
and performing second curve mapping on the I first local block areas according to the distribution characteristics of the I gray level histograms to obtain a target mapping image.
3. The shooting method according to claim 1 or 2, wherein the adjusting the gray-scale value of the target mapping image to obtain a shot image comprises:
counting a variance of gray values of a set of local block regions, the set of local block regions comprising a second region of a plurality of second local block regions of the target mapped image and a region adjacent to the second region;
and if the variance is smaller than a preset threshold value, adjusting the gray value of the second area in the target mapping image to obtain a shot image.
4. The shooting method according to claim 1, wherein after the image fusion of the at least three images to obtain a fused image and before the curve mapping of the fused image to obtain a target mapping image, the method further comprises:
if N continuous fusion images are obtained, N pieces of first brightness information of the N fusion images are obtained, wherein N is a positive integer larger than 1;
filtering the N pieces of first brightness information as a whole to obtain first filtered brightness information, wherein the first filtered brightness information comprises second brightness information of the N pieces of fused images;
and respectively carrying out brightness adjustment on the N fused images according to the second brightness information of the N fused images.
5. The photographing method according to claim 1, wherein a first image of the at least three images has a maximum exposure value of the at least three images;
before the image fusion is performed on the at least three images to obtain a fused image, the method further includes:
acquiring an image set, wherein the image set comprises M continuous initial images with the same exposure value, the first image is one of the M initial images, and M is a positive integer greater than 1;
acquiring M pieces of third brightness information of M initial images;
filtering the M pieces of third brightness information as a whole to obtain second filtered brightness information, wherein the second filtered brightness information comprises fourth brightness information of the M pieces of initial images;
and respectively carrying out brightness adjustment on the M initial images according to fourth brightness information of the M initial images.
6. An electronic device, comprising:
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring at least three images with different exposure values, and the at least three images comprise images obtained by time-lapse photography;
the second acquisition module is used for carrying out image fusion on the at least three images to obtain a fused image;
the third acquisition module is used for carrying out first curve mapping on the fusion image to obtain a first mapping image;
a fourth obtaining module, configured to perform second curve mapping on I first local block regions of the first mapping image to obtain a target mapping image, where I is a positive integer;
the fifth acquisition module is used for adjusting the gray value of the target mapping image to obtain a shot image;
wherein the first curve mapping is tone mapping and the second curve mapping is S-shaped curve mapping.
7. The electronic device of claim 6, wherein the fourth obtaining module comprises:
a first obtaining sub-module, configured to obtain I grayscale histograms of the I first local block regions of the first mapping image;
and the second acquisition submodule is used for carrying out second curve mapping on the I first local block areas according to the distribution characteristics of the I gray level histograms to obtain a target mapping image.
8. The electronic device according to claim 6 or 7, wherein the fifth obtaining module comprises:
a statistics submodule for counting variances of gray values of a set of local block regions, the set of local block regions including a second region of a plurality of second local block regions of the target mapped image and a region adjacent to the second region;
and the third obtaining submodule is used for adjusting the gray value of the second area in the target mapping image to obtain a shot image if the variance is smaller than a preset threshold value.
9. The electronic device of claim 6, further comprising:
a sixth obtaining module, configured to obtain N pieces of first luminance information of N pieces of the fused images if N consecutive pieces of the fused images are obtained, where N is a positive integer greater than 1;
a seventh obtaining module, configured to filter the N pieces of first luminance information as a whole to obtain first filtered luminance information, where the first filtered luminance information includes second luminance information of the N pieces of fused images;
and the first adjusting module is used for respectively adjusting the brightness of the N fused images according to the second brightness information of the N fused images.
10. The electronic device of claim 6, wherein a first image of the at least three images has a maximum exposure value of the at least three images;
the electronic device further includes:
an eighth obtaining module, configured to obtain an image set, where the image set includes M consecutive initial images with the same exposure value, the first image is one of the M initial images, and M is a positive integer greater than 1;
a ninth obtaining module, configured to obtain M third luminance information of the M initial images;
a tenth obtaining module, configured to filter the M pieces of third luminance information as a whole to obtain second filtered luminance information, where the second filtered luminance information includes fourth luminance information of the M pieces of initial images;
and the second adjusting module is used for respectively adjusting the brightness of the M initial images according to the fourth brightness information of the M initial images.
CN201911165774.9A 2019-11-25 2019-11-25 Shooting method and electronic equipment Active CN110868544B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911165774.9A CN110868544B (en) 2019-11-25 2019-11-25 Shooting method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110868544A CN110868544A (en) 2020-03-06
CN110868544B true CN110868544B (en) 2021-04-30

Family

ID=69656486

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911165774.9A Active CN110868544B (en) 2019-11-25 2019-11-25 Shooting method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110868544B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113674181A (en) * 2020-05-13 2021-11-19 武汉Tcl集团工业研究院有限公司 Alignment fusion method and equipment for multi-exposure images
CN112150399B (en) * 2020-09-27 2023-03-07 安谋科技(中国)有限公司 Image enhancement method based on wide dynamic range and electronic equipment
WO2022109897A1 (en) * 2020-11-26 2022-06-02 深圳市大疆创新科技有限公司 Time-lapse photography method and device, and time-lapse video generation method and device
CN115086567B (en) * 2021-09-28 2023-05-19 荣耀终端有限公司 Time delay photographing method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102895001A (en) * 2012-09-21 2013-01-30 飞依诺科技(苏州)有限公司 Ultrasonic color blood flow imaging dynamic range compression processing method and system
CN105787891A (en) * 2016-01-31 2016-07-20 厦门美图之家科技有限公司 Image processing method, system and shooting terminal for optimizing edge aliasing
CN109819176A (en) * 2019-01-31 2019-05-28 深圳达闼科技控股有限公司 A kind of image pickup method, system, device, electronic equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104639920B (en) * 2013-11-13 2018-01-26 上海微锐智能科技有限公司 Wide dynamic fusion method based on double exposure modes of single frames
CN106691505B (en) * 2016-12-27 2020-07-28 深圳市德力凯医疗设备股份有限公司 Method and device for processing uniformity and contrast of ultrasonic image
CN107230192B (en) * 2017-05-31 2020-07-21 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and mobile terminal

Similar Documents

Publication Publication Date Title
CN107230192B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN110868544B (en) Shooting method and electronic equipment
CN107172364B (en) Image exposure compensation method and device and computer readable storage medium
CN110198412B (en) Video recording method and electronic equipment
CN107566739B (en) photographing method and mobile terminal
CN109688322B (en) Method and device for generating high dynamic range image and mobile terminal
CN108234882B (en) Image blurring method and mobile terminal
CN107566730B (en) A kind of panoramic picture image pickup method and mobile terminal
CN107707827A (en) A kind of high-dynamics image image pickup method and mobile terminal
CN109462745B (en) White balance processing method and mobile terminal
CN110930329B (en) Star image processing method and device
CN110213484B (en) Photographing method, terminal equipment and computer readable storage medium
CN108449541B (en) Panoramic image shooting method and mobile terminal
CN107623818B (en) Image exposure method and mobile terminal
CN107730460B (en) Image processing method and mobile terminal
CN111064895B (en) Virtual shooting method and electronic equipment
CN109104578B (en) Image processing method and mobile terminal
CN109727212B (en) Image processing method and mobile terminal
CN109151348B (en) Image processing method, electronic equipment and computer readable storage medium
CN111145151B (en) Motion area determining method and electronic equipment
CN107807488B (en) Camera assembly, aperture adjusting method and mobile terminal
CN109639981B (en) Image shooting method and mobile terminal
CN111131722A (en) Image processing method, electronic device, and medium
CN112950499B (en) Image processing method, device, electronic equipment and storage medium
CN107798662B (en) Image processing method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant