CN110213484B - Photographing method, terminal equipment and computer readable storage medium

Photographing method, terminal equipment and computer readable storage medium

Info

Publication number
CN110213484B
Authority
CN
China
Prior art keywords: exposure, dynamic range, target, range parameter, parameter value
Prior art date
Legal status: Active
Application number
CN201910472318.2A
Other languages
Chinese (zh)
Other versions
CN110213484A (en)
Inventor
费勇超
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201910472318.2A
Publication of CN110213484A
Application granted
Publication of CN110213484B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/62: Control of parameters via user interfaces
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Abstract

The invention relates to the field of computer technology and provides a photographing method, a terminal device and a computer readable storage medium, which are used to solve the problem of poor image quality in photographing. The method comprises the following steps: calculating a reference exposure amount based on a preview image acquired by a camera; acquiring a dynamic range parameter value; determining a target exposure amount according to the reference exposure amount and the dynamic range parameter value; and acquiring a target image shot by the camera based on the target exposure amount. In determining the target exposure amount, not only the reference exposure amount but also the dynamic range parameter value is considered. The dynamic range parameter value is determined based on at least two exposure amounts derived from the reference exposure amount and on the images acquired by the camera at those exposure amounts. Adjusting the exposure of the terminal device with the reference exposure amount and the dynamic range parameter value improves the accuracy of the target exposure amount, so that the quality of the target image obtained when shooting at the target exposure amount is improved.

Description

Photographing method, terminal equipment and computer readable storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a photographing method, terminal equipment and a computer readable storage medium.
Background
With the development of computer technology, terminal devices have become more and more powerful, and photographing is one of their common functions. Photographing technology changes with each passing day, the photographing capability of terminal devices keeps growing, and users' requirements for photographing are increasingly high.
The current automatic exposure control method is based on the gray world theory: a brightness target (Target) is set for the current scene, and the exposure amount is adjusted until the current brightness reaches the target value. In a high-luminance scene, the brightness target value can be adjusted dynamically by judging the proportion of highlight areas. That is, the exposure amount is decreased or increased by controlling the ratio of highlight areas (or dark areas) in the image, and the image is then captured. In addition, exposure control can be performed through an HDR (high dynamic range) algorithm: on the basis of a reference exposure amount, two further exposure amounts are obtained by equal-proportion adjustment, the current scene is exposed with the resulting 3 exposure amounts, images corresponding to the 3 different exposure amounts are obtained, and an output image with a high dynamic range is finally obtained by synthesis.
However, adjusting the exposure amount through the highlight area ratio, or performing exposure control through the HDR algorithm, is likely to result in poor image quality.
Disclosure of Invention
The embodiment of the invention provides a photographing method, a terminal device and a computer readable storage medium, which aim to solve the problem of poor quality of images photographed in the prior art.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a photographing method, which is applied to a terminal device including a camera, and the method includes:
calculating a reference exposure based on a preview image acquired by the camera;
acquiring a dynamic range parameter value, wherein the dynamic range parameter value is determined based on at least two exposure values determined according to the reference exposure value and images acquired by the camera based on the at least two exposure values;
determining a target exposure according to the reference exposure and the dynamic range parameter value;
and acquiring a target image shot by the camera based on the target exposure.
In a second aspect, an embodiment of the present invention further provides a terminal device, which has a camera, where the terminal device includes:
the calculation module is used for calculating the reference exposure amount based on the preview image acquired by the camera;
a dynamic range parameter value acquisition module for acquiring a dynamic range parameter value, wherein the dynamic range parameter value is determined based on at least two exposure values determined according to the reference exposure value and an image acquired by the camera based on the at least two exposure values;
the target exposure amount determining module is used for determining the target exposure amount according to the reference exposure amount and the dynamic range parameter value;
and the target image acquisition module is used for acquiring a target image shot by the camera based on the target exposure.
In a third aspect, an embodiment of the present invention further provides a terminal device, including a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the photographing method of the embodiment of the invention when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the photographing method provided in the embodiment of the present invention.
According to the photographing method, the reference exposure amount is calculated from the preview image acquired by the camera, the dynamic range parameter value is acquired, the target exposure amount is determined, and the target image shot by the camera based on the target exposure amount is acquired. In determining the target exposure amount, not only the reference exposure amount but also the dynamic range parameter value is considered. The dynamic range parameter value is determined based on at least two exposure amounts derived from the reference exposure amount and on the images acquired by the camera at those exposure amounts, that is, the actual situation of the images under the at least two exposure amounts is taken into account. Adjusting the exposure of the terminal device with the reference exposure amount and the dynamic range parameter value yields the target exposure amount with improved accuracy, so that when shooting is performed at the target exposure amount, the quality of the captured target image is improved.
Drawings
FIG. 1 is a flowchart of a photographing method according to an embodiment of the present invention;
FIG. 2 is a graph of the range of human eye visible luminance;
FIG. 3 is a diagram of an actual scene;
fig. 4 is a schematic pixel diagram of a first image and a second image obtained in a photographing method according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a terminal device according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, in an embodiment, a photographing method is provided and applied to a terminal device including a camera, and the method includes:
step 101: based on the preview image acquired by the camera, the reference exposure amount is calculated.
Photographing is started (for example, the user clicks a photographing icon on the terminal device and the terminal device starts photographing in response to the click operation; photographing may also be started by voice or the like). The camera then acquires images of the current scene, and the preview image acquired by the camera can be displayed on the display screen of the terminal device, so that the current scene can be previewed and the user can view the effect of the preview image in real time.
The exposure amount is controlled jointly by the aperture and the shutter and is an important parameter influencing image quality; the dynamic range parameter measures the capability of recording the bright-to-dark range of a scene. In the photographing process of this embodiment, the reference exposure amount of the current scene is first calculated based on the preview image acquired by the camera.
Step 102: and acquiring a dynamic range parameter value.
The dynamic range parameter is related to the ratio of the maximum pixel value to the minimum pixel value in an image and describes the difference in brightness within the image. The dynamic range parameter value is determined based on at least two exposure amounts derived from the reference exposure amount and on the images acquired by the camera at those exposure amounts; that is, it can be calculated from the at least two exposure amounts determined from the reference exposure amount and the images acquired by the camera under those exposure amounts.
Images acquired by the camera under different exposure amounts differ. The camera is controlled to expose with each of the at least two determined exposure amounts, and the dynamic range parameter value can be calculated from the at least two exposure amounts and the images acquired under them. Because the dynamic range parameter value is determined from at least two exposure amounts derived from the reference exposure amount and from the images acquired by the camera at those exposure amounts, the actual situation of the images under those exposure amounts is taken into account, and the dynamic range parameter value of the actual scene can be obtained accurately. This improves the subsequent determination of the target exposure amount and thus the quality of the captured target image.
Step 103: and determining the target exposure according to the reference exposure and the dynamic range parameter value.
The exposure of the terminal device is adjusted by using the reference exposure amount and the dynamic range parameter value to obtain the target exposure amount. In other words, in determining the target exposure amount, not only the reference exposure amount but also the dynamic range parameter value is considered, so the obtained target exposure amount is more accurate.
Step 104: and acquiring a target image shot by the camera based on the target exposure.
After the target exposure amount is obtained, the camera is controlled to capture an image at the target exposure amount, so that a target image shot by the camera based on the target exposure amount is obtained. Because the target exposure amount takes the dynamic range parameter value into account, controlling the camera to expose at the target exposure amount makes the exposure more accurate and improves the quality of the captured target image.
According to the photographing method, the reference exposure amount is calculated from the preview image acquired by the camera, the dynamic range parameter value is acquired, the target exposure amount is determined, and the target image shot by the camera based on the target exposure amount is acquired. In determining the target exposure amount, not only the reference exposure amount but also the dynamic range parameter value is considered. The dynamic range parameter value is determined based on at least two exposure amounts derived from the reference exposure amount and on the images acquired by the camera at those exposure amounts, that is, the actual situation of the images under the at least two exposure amounts is taken into account. Adjusting the exposure of the terminal device with the reference exposure amount and the dynamic range parameter value yields the target exposure amount with improved accuracy, so that when shooting is performed at the target exposure amount, the quality of the captured target image is improved.
In the embodiment of the present invention, the terminal device may include, but is not limited to, a mobile phone, a camera, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
In one example, after acquiring a target image captured by a camera based on a target exposure amount, the method further comprises: and synthesizing the target images to obtain a synthesized image.
If there is one target image, the synthesized image is that target image. If there are at least two target images, the target exposure amounts determined according to the dynamic range parameter value correspond one-to-one to the target images (i.e., their numbers are the same), and the synthesized image is an image obtained by synthesizing the at least two target images, thereby improving image quality.
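The patent does not specify a particular synthesis algorithm, so the following is only a minimal sketch, assuming the target images arrive as arrays together with their exposure amounts; it merges them by exposure-normalized, saturation-weighted averaging. All function and parameter names are illustrative.

```python
import numpy as np

def synthesize(frames, exposures, max_value=1023):
    """Merge frames captured at different exposure amounts into one image.

    frames    : list of 2-D arrays of raw pixel values (same shape)
    exposures : list of exposure amounts, same order as `frames`
    A single frame is returned unchanged; otherwise a radiance-like float
    array is returned.
    """
    if len(frames) == 1:
        return frames[0].astype(np.float64)

    acc = np.zeros(frames[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for frame, exposure in zip(frames, exposures):
        f = frame.astype(np.float64)
        # down-weight pixels near the clipping point or near black
        w = 1.0 - np.abs(f / max_value - 0.5) * 2.0
        acc += w * (f / exposure)   # scale each frame to an exposure-independent scale
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-6)
```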
In one example, the reference exposure amount may be calculated by an automatic exposure control (AEC) algorithm, which includes: calculating the brightness value of the preview image acquired by the camera; comparing the brightness value with a target brightness value; and, when the absolute value of the difference between the brightness value and the target brightness value is greater than or equal to a preset difference, adjusting the exposure of the camera so that the brightness value of the preview image reaches the target brightness value. The target brightness value is adjustable based on the highlight area proportion of the preview image. Specifically, when the absolute value of the difference is greater than or equal to the preset difference, the exposure amount used to acquire the preview image is decreased if the brightness value is greater than the target brightness value and increased otherwise, so that the brightness value of the preview image reaches the target brightness value. When the absolute value of the difference is smaller than the preset difference, the exposure has converged; as long as the current scene does not change, the exposure amount no longer changes, and the current exposure amount is the reference exposure amount of the current scene.
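As an illustration of this kind of automatic exposure convergence, the sketch below assumes an 8-bit grayscale preview, a brightness that scales roughly linearly with exposure, and an illustrative target value and tolerance; none of these constants are taken from the patent.

```python
import numpy as np

def step_auto_exposure(preview, exposure, target_luma=118.0, tol=4.0):
    """One iteration of simple AE: move the mean preview luminance toward the target.

    preview  : 2-D uint8 array (preview frame captured at `exposure`)
    exposure : current exposure amount
    Returns (new_exposure, converged). When converged, `exposure` is taken as
    the reference exposure amount of the current scene.
    """
    luma = float(np.mean(preview))
    if abs(luma - target_luma) < tol:
        return exposure, True          # exposure has converged
    # brightness is (approximately) proportional to exposure, so rescale
    new_exposure = exposure * target_luma / max(luma, 1.0)
    return new_exposure, False
```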
In one embodiment, the manner in which the dynamic range parameter values are determined includes: acquiring a target exposure ratio; determining at least two exposure amounts based on the reference exposure amount and the target exposure ratio; acquiring images acquired by a camera based on at least two exposure quantities; and determining a dynamic range parameter value according to the image acquired by the camera based on the at least two exposure quantities and the at least two exposure quantities.
As shown in fig. 2, in practical applications a typical sensor chip (i.e., a light-sensing chip, for example that of a single-lens reflex camera) can capture about 90 dB of detail, while the CMOS sensor chip of a typical mobile terminal (for example, a mobile phone) can capture only about 60 dB; the data output by the commonly used CMOS sensor chips is 10-bit, whereas the dynamic range perceived by human eyes is between 100 dB and 180 dB. According to DR = 20 × log10(2^10) ≈ 60 dB, current sensor chips can only cover scenes whose real dynamic range (DR) is about 60 dB, but the dynamic range parameter values of many real scenes are far larger than 60 dB; for the portrait scene shown in fig. 3, the DR is about 102 dB, which cannot be covered by current sensor chips.
Take as an example a common terminal device whose tolerance (latitude) is 60 dB. Since the tolerance of the human eye is at least 100 dB, to reach 100 dB according to the calculation formula DR = 20 × log10(MaxLumaValue / MinLumaValue), where MaxLumaValue is the maximum pixel value of the image and MinLumaValue is the minimum pixel value of the image, the output of the photosensitive chip needs 17 bits, which is 7 bits more than the current 10 bits. Therefore, to calculate common dynamic range parameter values within 100 dB accurately, the exposure ratio needs to be at least 2^7 = 128. For setting the target exposure ratio Ratio, there are two methods, fixed and dynamic:
fixing: i.e., it is desired to dynamically calculate a dynamic range parameter value of 100dB, the target exposure Ratio value needs to be set to 128, i.e., it needs to be set to a value for which it is desired to persist 100 dB. That is, the target exposure Ratio may be set to 128, and the determination of the above-described at least two exposure amounts is performed using 128 during the photographing.
Dynamic: in many scenes the dynamic range parameter value does not reach 100 dB, so the target exposure ratio Ratio does not need to be set to the 128 corresponding to 100 dB; a value suited to the current scene is better, and the current Ratio value can be determined dynamically on the premise that the pixel values of the image do not overflow (specifically, that the highlights of the short-exposure image do not overflow).
At least two exposure amounts may then be determined based on the reference exposure amount and the target exposure ratio. Taking two exposure amounts as an example, the reference exposure amount may be taken as one of them and the reference exposure amount divided by the target exposure ratio as the other; the at least two exposure amounts may also be determined in other ways. The camera is controlled to expose with each of the determined exposure amounts, images under the at least two exposure amounts are acquired, and the dynamic range parameter value is calculated from the at least two exposure amounts and the images acquired under them. Because the dynamic range parameter value is obtained from at least two exposure amounts determined by the reference exposure amount and the target exposure ratio and from the images acquired by the camera at those exposure amounts, the dynamic range parameter value of the actual scene can be obtained accurately, which improves the subsequent determination of the target exposure amount and the quality of the captured target image.
In one embodiment, the at least two exposures include a first exposure and a second exposure, the first exposure being greater than the second exposure;
determining at least two exposures based on the base exposure and the target exposure ratio, including: determining a reference exposure amount as a first exposure amount; the ratio between the reference exposure amount and the target exposure ratio is determined as the second exposure amount.
After the reference exposure amount and the Ratio are obtained, at least two exposure amounts need to be calculated; it can be understood that an exposure amount corresponding to a long exposure (the first exposure amount) and an exposure amount corresponding to a short exposure (the second exposure amount) are calculated. The values of two variables are already known, namely the reference exposure amount and the Ratio, so the first exposure amount and the second exposure amount can be derived from them. For example, if the reference exposure amount is taken as the first exposure amount (Long), the second exposure amount is Short = Long / Ratio; this calculation is simple, efficient and easy to implement. Long and Short may also be calculated in other corresponding ways according to the actual situation, for example by exponential adjustment.
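The sketch below illustrates both the bit-depth argument above (60 dB for 10 bits, a ratio of 2^7 = 128 for 100 dB) and the Long/Short split; the helper names are illustrative.

```python
import math

def required_exposure_ratio(target_dr_db=100.0, sensor_bits=10):
    """Smallest power-of-two exposure ratio so two frames can span target_dr_db.

    A single 10-bit frame covers 20*log10(2**10) ~= 60 dB; reaching 100 dB
    needs about 17 bits, i.e. 7 extra bits, i.e. a ratio of 2**7 = 128.
    """
    extra_bits = math.ceil(target_dr_db / (20 * math.log10(2))) - sensor_bits
    return 2 ** max(extra_bits, 0)

def split_exposures(base_exposure, ratio):
    """Long exposure = reference exposure; Short exposure = Long / Ratio."""
    long_exposure = base_exposure
    short_exposure = base_exposure / ratio
    return long_exposure, short_exposure

print(required_exposure_ratio())      # 128
print(split_exposures(1.0, 128))      # (1.0, 0.0078125)
```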
In one embodiment, the image corresponding to the first exposure amount is a first image, and the image corresponding to the second exposure amount is a second image;
determining a dynamic range parameter value according to an image acquired by a camera based on at least two exposure amounts and the at least two exposure amounts, including: acquiring a pixel value of a first target position in the first image, wherein the first target position is a position where the pixel value in the second image is minimum and the pixel value in the first image is minimum; acquiring a pixel value of a second target position in a second image, wherein the second target position is a position where the pixel value in the first image is smaller than a preset pixel threshold value and the pixel value in the second image is maximum; and calculating a dynamic range parameter value according to the pixel value of the first target position, the pixel value of the second target position, the first exposure amount and the second exposure amount.
That is, after the first exposure amount and the second exposure amount are determined, the camera may be controlled to expose at the first exposure amount and at the second exposure amount respectively, so that a first image captured by the camera based on the first exposure amount and a second image captured based on the second exposure amount are obtained. For example, the calculated first and second exposure amounts can be set into the CMOS sensor chip, which will then output image data corresponding to the different exposure amounts; the output format is determined by the design of the CMOS sensor chip. Some chips output the image data in the chip's own interleaved arrangement, and the first image and the second image need to be separated according to the arrangement rule after the image data is received; other CMOS sensor chips can output the separated images directly, which can then be processed in the next step.
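As one illustrative case only, the sketch below assumes a chip that interleaves the two frames row by row (even rows long exposure, odd rows short exposure); real chips may use other arrangements or output the two frames separately.

```python
import numpy as np

def split_row_interleaved(raw):
    """Separate a row-interleaved dual-exposure output into two frames.

    raw : 2-D array whose even rows belong to the long-exposure image and
          whose odd rows belong to the short-exposure image (an assumption
          made only for this example).
    """
    long_frame = raw[0::2, :]
    short_frame = raw[1::2, :]
    return long_frame, short_frame
```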
In addition, after the first image and the second image are obtained, one of them may be displayed. Since the preview display is generally small, either of the two images output by the sensor chip may be displayed; for example, the first image, exposed at the reference exposure amount, may be displayed.
In calculating the dynamic range parameter value, the pixel value of the first target position on the first image (LongMinPixelValue) is found first, i.e., the smallest valid pixel value. The validity conditions are a and b; a: the pixel value on the first image is the minimum, b: the pixel value of the second image at this position is equal to 0 or equal to the minimum value in the second image (ShortMinPixelValue). For example, as shown in fig. 4, the position in the first image (the Long Exposure Frame) satisfying both conditions a and b is (4, 3), and the pixel value of the first target position on the first image is 1.
The pixel value of the second target position on the second image (ShortMaxPixelValue), i.e., the largest valid pixel value, is then found. The validity conditions are c and d; c: the pixel value on the second image is the maximum, d: the pixel value of the first image at this position is less than 1023 (the maximum value for 10-bit data). For example, as shown in fig. 4, the position in the second image (the Short Exposure Frame) satisfying both conditions c and d is (2, 3), and the pixel value of the second target position on the second image is 1023.
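A minimal sketch of this search, assuming the two frames are available as arrays of raw pixel values; conditions a/b and c/d follow the description above, and the names are illustrative.

```python
import numpy as np

def find_target_pixels(long_img, short_img, max_value=1023):
    """Return (LongMinPixelValue, ShortMaxPixelValue, ShortMinPixelValue).

    LongMinPixelValue : smallest pixel of the long frame among positions where
                        the short frame equals its own minimum (conditions a, b)
    ShortMaxPixelValue: largest pixel of the short frame among positions where
                        the long frame is below the clipping value (conditions c, d)
    """
    short_min = short_img.min()

    mask_b = (short_img == short_min)            # condition b
    long_min_pixel = long_img[mask_b].min()      # condition a within that mask

    mask_d = (long_img < max_value)              # condition d
    short_max_pixel = short_img[mask_d].max()    # condition c within that mask

    return int(long_min_pixel), int(short_max_pixel), int(short_min)
```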
After the pixel value of the first target position and the pixel value of the second target position are obtained, a dynamic range parameter value can be obtained through calculation according to the pixel value of the first target position, the pixel value of the second target position, the first exposure amount and the second exposure amount.
In one embodiment, calculating a dynamic range parameter value based on a pixel value of a first target location, a pixel value of a second target location, a first exposure amount, and a second exposure amount includes: calculating to obtain an initial dynamic range parameter value based on the pixel value of the first target position, the pixel value of the second target position, the first exposure and the second exposure; and under the condition that the pixel value of the second target position is equal to the preset pixel threshold value, obtaining a dynamic range parameter value based on the initial dynamic range parameter value, wherein the dynamic range parameter value is greater than or equal to the initial dynamic range parameter value.
When the pixel value of the second target position is equal to the preset pixel threshold, it cannot be determined whether that pixel has overflowed or is just at the overflow edge. Taking a 10-bit sensor chip as an example, the preset pixel threshold is 1023: if the actual pixel value is greater than 1023, overflow occurs and the output pixel value is still 1023; if it is less than or equal to 1023, there is no overflow and the output is the actual pixel value. In this case the dynamic range parameter value is obtained based on the initial dynamic range parameter value.
In one example, when the pixel value of the second target position is equal to the preset pixel threshold and the minimum value of the second image is equal to zero, the initial dynamic range parameter value is calculated based on the pixel value of the first target position, the pixel value of the second target position, the first exposure amount and the second exposure amount, for example as initial DR = 20 × log10((ShortMaxPixelValue / LongMinPixelValue) × (longExposure / shortExposure)), where longExposure is the first exposure amount and shortExposure is the second exposure amount. Since the actual pixel value may be greater than the preset pixel threshold, in this embodiment the dynamic range parameter value obtained based on the initial dynamic range parameter value is greater than or equal to the initial dynamic range parameter value. For example, if the exposure ratio longExposure/shortExposure is 64, the initial dynamic range parameter value calculated according to the above formula is A, and the dynamic range parameter value is greater than or equal to A; the exposure ratio can then be gradually increased to estimate the dynamic range parameter value. Suppose the ratio is set to 128 and the value obtained is B; the initial dynamic range parameter value then ranges from 0 to B, and a value between A and B may be determined as the dynamic range parameter value.
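Assuming the cross-frame formula above (itself a reconstruction, since the original equation appears as an image in the patent publication), the gradual increase of the exposure ratio can be sketched as follows; the measurement callback, the 64/128 schedule and the final choice inside [A, B] are illustrative.

```python
import math

def estimate_dr_when_saturated(measure, base_exposure, max_value=1023):
    """Estimate the scene DR when the short frame's highlights are clipped.

    measure(long_exp, short_exp) is a hypothetical callback that captures one
    long/short pair and returns (long_min_pixel, short_max_pixel).
    """
    def dr_at(ratio):
        long_min, short_max = measure(base_exposure, base_exposure / ratio)
        dr = 20 * math.log10((max(short_max, 1) / max(long_min, 1)) * ratio)
        return dr, short_max

    a, short_max_a = dr_at(64)       # A: a lower bound while highlights are clipped
    if short_max_a < max_value:
        return a                     # not clipped after all: A is usable directly
    b, _ = dr_at(128)                # B: estimate again at a larger ratio
    return (a + b) / 2.0             # pick a value between A and B (illustrative)
```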
In one example, when the pixel value of the second target position is equal to the preset pixel threshold and the minimum value of the second image is greater than zero, the initial dynamic range parameter value is calculated based on the pixel value of the first target position, the pixel value of the second target position, the first exposure amount and the second exposure amount, for example as initial DR = 20 × log10(ShortMaxPixelValue / ShortMinPixelValue). When ShortMinPixelValue > 0, the actual dynamic range of the current scene is relatively small and can be covered within a single frame, so the calculated value should be relatively small.
In one embodiment, calculating the dynamic range parameter value based on the pixel value of the first target location, the pixel value of the second target location, the first exposure amount, and the second exposure amount further comprises: and determining the initial dynamic range parameter value as the dynamic range parameter value under the condition that the pixel value of the second target position is smaller than the preset pixel threshold value.
When the pixel value of the second target position is smaller than the preset pixel threshold, the highlights of the second image of the current scene have not overflowed, so the calculated initial dynamic range parameter value is the dynamic range parameter of the real scene and is determined as the dynamic range parameter value.
In one example, when the pixel value of the second target position is less than the preset pixel threshold and the minimum value of the second image is equal to zero, the initial dynamic range parameter value may be calculated, for example, as initial DR = 20 × log10((ShortMaxPixelValue / LongMinPixelValue) × (longExposure / shortExposure)), and the initial dynamic range parameter value is then determined as the dynamic range parameter value.
In one example, when the pixel value of the second target position is smaller than the preset pixel threshold and the minimum value of the second image is greater than zero, the initial dynamic range parameter value may be calculated, for example, as initial DR = 20 × log10(ShortMaxPixelValue / ShortMinPixelValue), and the initial dynamic range parameter value is then determined as the dynamic range parameter value.
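Putting the cases above together, the sketch below computes the initial dynamic range parameter value and reports whether it can be taken directly as the final value; the branch formulas are the reconstructions used above, not verbatim from the patent, and the example numbers follow fig. 4 with an exposure ratio of 128.

```python
import math

def dynamic_range_value(long_min_pixel, short_max_pixel, short_min_pixel,
                        long_exposure, short_exposure, max_value=1023):
    """Return (initial_dr_db, is_final).

    ShortMinPixelValue == 0 : cross-frame ratio scaled by the exposure ratio
    ShortMinPixelValue  > 0 : the short frame alone covers the scene
    is_final is True when the short frame's highlights are not clipped.
    """
    ratio = long_exposure / short_exposure
    if short_min_pixel == 0:
        initial = 20 * math.log10((short_max_pixel / max(long_min_pixel, 1)) * ratio)
    else:
        initial = 20 * math.log10(short_max_pixel / short_min_pixel)

    is_final = short_max_pixel < max_value
    return initial, is_final

# LongMinPixelValue = 1, ShortMaxPixelValue = 1023 (as in fig. 4), ratio = 128:
# about 102 dB, with the highlights still clipped (is_final == False)
print(dynamic_range_value(1, 1023, 0, 1.0, 1.0 / 128))
```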
In one example, after the dynamic range parameter value of the current scene is obtained, scene identification can be performed based on this value to distinguish scenes. For example, a large dynamic range parameter value indicates that the current scene is a high-dynamic scene containing both highlight and dark areas. The dynamic range parameter value therefore quantifies the dynamic range of the current scene well and can effectively distinguish a high-dynamic scene from a normal scene.
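As a small illustration of such scene differentiation (the threshold is an assumption, not a value given in the description):

```python
def classify_scene(dr_db, high_dynamic_threshold_db=80.0):
    """Label the scene from its measured dynamic range parameter value."""
    return "high_dynamic" if dr_db > high_dynamic_threshold_db else "normal"
```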
In one embodiment, determining the target exposure amount based on the base exposure amount and the dynamic range parameter value includes: determining N exposure adjustment amounts based on the dynamic range parameter value, wherein N is a positive integer; adding the reference exposure amount and the N exposure amount adjustment amounts respectively to obtain N first target exposure amounts; the target exposure amount is determined based on the N first target exposure amounts, and the target exposure amount includes the N first target exposure amounts.
After the dynamic range parameter value is obtained, the dynamic range of the current scene is effectively known, and different exposure algorithms can be invoked according to it, such as a single-frame HDR algorithm or a multi-frame HDR algorithm. Specifically, N exposure adjustment amounts are determined based on the dynamic range parameter value, where different values of N correspond to different exposure algorithms: for example, N = 1 corresponds to a single-frame HDR algorithm, and N greater than or equal to 2 corresponds to a multi-frame HDR algorithm. The reference exposure amount is then added to each of the N exposure adjustment amounts to obtain N first target exposure amounts, and the target exposure amounts are determined based on them. If N = 1, the first target exposure amount is determined as the target exposure amount; if N is greater than or equal to 2, the N first target exposure amounts and the reference exposure amount are together determined as the target exposure amounts, and the target exposure amounts then also include the reference exposure amount.
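A minimal sketch of assembling the target exposure amounts from the N adjustment amounts; treating the adjustments as EV stops (factors of 2^EV) is an interpretation, and the names are illustrative.

```python
def target_exposures(base_exposure, ev_adjustments):
    """Apply N EV adjustments to the reference exposure amount.

    N == 1 : single-frame HDR, only the single first target exposure amount.
    N >= 2 : multi-frame HDR, the N adjusted amounts plus the reference amount.
    """
    firsts = [base_exposure * (2.0 ** ev) for ev in ev_adjustments]
    if len(firsts) == 1:
        return firsts
    return firsts + [base_exposure]

print(target_exposures(1.0, [2, -2]))   # EV+2, EV-2 and the EV0 reference frame
```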
In one embodiment, determining N exposure adjustment amounts based on the dynamic range parameter value includes: determining a first exposure adjustment amount under the condition that the dynamic range parameter value is smaller than or equal to a preset value; and determining at least two second exposure adjustment amounts in the case that the dynamic range parameter value is larger than the preset value, wherein the absolute value of the second exposure adjustment amount is larger than that of the first exposure adjustment amount, and the target exposure amount further comprises the reference exposure amount.
When the dynamic range parameter value is less than or equal to the preset value, the dynamic range is small and a single-frame HDR algorithm may be used: one first exposure adjustment amount is determined, and the value obtained by adding it to the reference exposure amount (i.e., the first target exposure amount) is used as the target exposure amount. When the dynamic range parameter value is greater than the preset value, the dynamic range is large and a multi-frame HDR algorithm may be used: at least two second exposure adjustment amounts are determined, at least two first target exposure amounts are obtained by adding each of them to the reference exposure amount, and the at least two first target exposure amounts together with the reference exposure amount are determined as the target exposure amounts.
Taking the multi-frame HDR algorithm as an example, bracketed exposure generally synthesizes three target images with different exposure amounts, or six or more. In this method, after the dynamic range parameter value of the current scene is calculated, the target exposure amounts can be determined effectively and accurately. For example, with 3-frame bracketing, the exposure amount of the reference frame is EV0 (the reference exposure amount), the exposure amounts of the other two frames are EV+2 and EV-2 respectively, and the frames with the three different exposure amounts are then combined.
In an example, in determining the at least two second exposure adjustment amounts when the dynamic range parameter value is greater than the preset value, if the dynamic range parameter value is also greater than a preset parameter threshold, the dynamic range of the current scene is so large that faults (banding) may appear directly in a 3-frame synthesis, and more frames, for example 6, are needed to synthesize a better result. The number of input frames used for synthesis, i.e., the size of N, can therefore be selected dynamically according to the size of the dynamic range value.
In addition, the exposure adjustment amounts can be determined according to the dynamic range parameter value. The exposure amount of the reference frame is EV0; after the dynamic range parameter value of the current scene is accurately acquired, at least two second exposure adjustment amounts are determined within a preset range with EV0 as the reference. For example, when the dynamic range value is very large, the combination of EV+3 (an adjustment of 3 relative to the reference exposure amount) and EV-3 (an adjustment of -3) is likely to give a better result than the combination of EV+2 and EV-2; therefore, the bracketing setting can be adjusted dynamically according to the dynamic range value.
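An illustrative policy for choosing both the number of frames and the bracketing offsets from the dynamic range value; every threshold and offset table below is an assumption made for the sketch, not a value given in the description.

```python
def choose_bracketing(dr_db, preset_db=70.0, large_db=95.0):
    """Return a list of EV offsets relative to the EV0 reference frame."""
    if dr_db <= preset_db:
        return [1]                  # single-frame HDR: one adjustment amount
    if dr_db <= large_db:
        return [2, -2]              # 3-frame bracket: EV0, EV+2, EV-2
    return [3, 2, -1, -2, -3]       # very wide scene: 6 frames once EV0 is included

print(choose_bracketing(102.3))
```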
With the photographing method of this embodiment, a more accurate dynamic range value of the real scene is obtained through the multi-exposure technique, and the exposure is adjusted using this value, so that the exposure is more accurate, the dynamic range of the image can be improved more precisely, and the image quality is improved.
As shown in fig. 5, the present invention further provides a terminal device 500 of an embodiment, which has a camera, and the terminal device 500 includes:
a calculating module 501, configured to calculate a reference exposure amount based on a preview image acquired by a camera;
a dynamic range parameter value obtaining module 502, configured to obtain a dynamic range parameter value, where the dynamic range parameter value is determined based on at least two exposure values determined according to a reference exposure value and an image acquired by a camera based on the at least two exposure values;
a target exposure amount determination module 503 for determining a target exposure amount based on the reference exposure amount and the dynamic range parameter value;
and a target image acquisition module 504, configured to acquire a target image captured by the camera based on the target exposure.
In one embodiment, the terminal device 500 further includes:
the exposure ratio acquisition module is used for acquiring a target exposure ratio;
an exposure amount determination module for determining at least two exposure amounts based on the reference exposure amount and the target exposure ratio;
the image acquisition module is used for acquiring images acquired by the camera based on at least two exposure quantities;
and the parameter value determining module is used for determining a dynamic range parameter value according to the image acquired by the camera based on the at least two exposure quantities and the at least two exposure quantities.
In one embodiment, the at least two exposures include a first exposure and a second exposure, the first exposure being greater than the second exposure;
an exposure determination module comprising:
a first exposure amount determination module for determining a reference exposure amount as a first exposure amount;
and a second exposure amount determination module for determining a ratio between the reference exposure amount and the target exposure ratio as a second exposure amount.
In one embodiment, the image corresponding to the first exposure amount is a first image, and the image corresponding to the second exposure amount is a second image;
a parameter value determination module comprising:
the first pixel value acquisition module is used for acquiring the pixel value of a first target position in the first image, wherein the first target position is a position where the pixel value in the second image is minimum and the pixel value in the first image is minimum;
the second pixel value acquisition module is used for acquiring a pixel value of a second target position in a second image, wherein the second target position is a position where the pixel value in the first image is smaller than a preset pixel threshold value and the pixel value in the second image is maximum;
and the dynamic range parameter value calculating module is used for calculating a dynamic range parameter value according to the pixel value of the first target position, the pixel value of the second target position, the first exposure and the second exposure.
In one embodiment, a dynamic range parameter value calculation module includes:
the initial parameter value calculation module is used for calculating to obtain an initial dynamic range parameter value based on the pixel value of the first target position, the pixel value of the second target position, the first exposure amount and the second exposure amount;
and the dynamic range parameter value determining module is used for determining a dynamic range parameter value based on the initial dynamic range parameter value under the condition that the pixel value of the second target position is equal to the preset pixel threshold value, wherein the dynamic range parameter value is greater than or equal to the initial dynamic range parameter value.
In one embodiment, the dynamic range parameter value determining module is further configured to determine the initial dynamic range parameter value as the dynamic range parameter value if the pixel value of the second target position is smaller than the preset pixel threshold value.
In one embodiment, the target exposure amount determination module includes:
the adjustment quantity determining module is used for determining N exposure adjustment quantities based on the dynamic range parameter value, wherein N is a positive integer;
the exposure processing module is used for adding the reference exposure and the N exposure adjustment quantities respectively to obtain N first target exposures;
and the target exposure quantity acquisition module is used for determining the target exposure quantity based on the N first target exposure quantities, and the target exposure quantity comprises the N first target exposure quantities.
In one embodiment, the adjustment amount determination module is configured to,
determining a first exposure adjustment amount under the condition that the dynamic range parameter value is smaller than or equal to a preset value;
and determining at least two second exposure adjustment amounts in the case that the dynamic range parameter value is larger than the preset value, wherein the absolute value of the second exposure adjustment amount is larger than that of the first exposure adjustment amount, and the target exposure amount further comprises the reference exposure amount.
The technical features in the terminal device provided by the embodiment of the present invention correspond to the technical features in the above-mentioned photographing method, and each process of the above-mentioned photographing method is implemented by the terminal device, and the same effect can be obtained, and in order to avoid repetition, the details are not repeated herein.
Fig. 6 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention, where the terminal device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 6 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted mobile terminal, a wearable device, a pedometer, and the like.
The processor 610 is configured to calculate a reference exposure amount based on a preview image acquired by the camera; acquiring a dynamic range parameter value, wherein the dynamic range parameter value is determined based on at least two exposure values determined according to a reference exposure value and an image acquired by a camera based on the at least two exposure values; determining the target exposure according to the reference exposure and the dynamic range parameter value; and acquiring a target image shot by the camera based on the target exposure.
In determining the target exposure amount, not only the reference exposure amount but also the dynamic range parameter value is considered. The dynamic range parameter value is determined based on at least two exposure amounts derived from the reference exposure amount and on the images acquired by the camera at those exposure amounts. Adjusting the exposure of the terminal device with the reference exposure amount and the dynamic range parameter value yields the target exposure amount with improved accuracy, so that when shooting is performed at the target exposure amount, the quality of the captured target image is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 610 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband internet access through the network module 602, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 can also provide audio output related to a specific function performed by the terminal apparatus 600 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042. The graphics processor 6041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 601 and then output.
The terminal device 600 further comprises at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the luminance of the display panel 6061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 6061 and/or the backlight when the terminal apparatus 600 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used for identifying the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration-identification-related functions (such as a pedometer or tapping), and the like; the sensors 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 606 is used to display information input by the user or information provided to the user. The Display unit 606 may include a Display panel 6061, and the Display panel 6061 may be configured by a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 607 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, may collect touch operations by a user on or near it (for example, operations performed by the user on or near the touch panel 6071 using a finger, a stylus, or any suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 610, and receives and executes commands sent by the processor 610. In addition, the touch panel 6071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 607 may include other input devices 6072 in addition to the touch panel 6071. Specifically, the other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described here again.
Further, the touch panel 6071 can be overlaid on the display panel 6061; when the touch panel 6071 detects a touch operation on or near it, the operation is transmitted to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 6 the touch panel 6071 and the display panel 6061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 6071 and the display panel 6061 may be integrated to implement the input and output functions of the terminal device, which is not limited here.
The interface unit 608 is an interface for connecting an external terminal device to the terminal device 600. For example, the external terminal device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a terminal device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external terminal device and transmit the received input to one or more elements within the terminal device 600 or may be used to transmit data between the terminal device 600 and the external terminal device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 609 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 610 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby performing overall monitoring of the terminal device. Processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The terminal device 600 may further include a power supply 611 (such as a battery) for supplying power to various components, and preferably, the power supply 611 may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 600 includes some functional modules that are not shown, and are not described in detail herein.
The embodiment of the present invention further provides a terminal device, which includes a processor 610 and a memory 609, where the memory 609 stores a computer program that can run on the processor 610. When executed by the processor 610, the computer program implements each process of the foregoing photographing method embodiment and can achieve the same technical effect; to avoid repetition, the details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the foregoing photographing method embodiment and can achieve the same technical effect; to avoid repetition, the details are not described here again. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or terminal device. Without further limitation, an element preceded by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or terminal device that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and can certainly also be implemented by hardware, although in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present invention.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not every possible combination of these technical features is described, but any such combination should be considered within the scope of this specification as long as it contains no contradiction.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (14)

1. A photographing method is applied to a terminal device comprising a camera, and the method comprises the following steps:
calculating a reference exposure based on a preview image acquired by the camera;
acquiring a dynamic range parameter value, wherein the dynamic range parameter value is determined based on at least two exposures determined according to the reference exposure and on images acquired by the camera based on the at least two exposures;
determining a target exposure according to the reference exposure and the dynamic range parameter value;
acquiring a target image shot by the camera based on the target exposure;
the manner of determining the dynamic range parameter value includes:
acquiring a target exposure ratio;
determining the at least two exposures based on the reference exposure and the target exposure ratio;
acquiring images acquired by the camera based on the at least two exposures;
determining the dynamic range parameter value according to the images acquired by the camera based on the at least two exposures and the at least two exposures;
the at least two exposures include a first exposure and a second exposure, the first exposure being greater than the second exposure;
the determining the at least two exposures based on the reference exposure and the target exposure ratio comprises:
determining the reference exposure as the first exposure;
determining a ratio of the reference exposure to the target exposure ratio as the second exposure.
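For illustration only, the short Python sketch below shows how the two bracketing exposures described at the end of claim 1 could be computed; the function name, the millisecond units, and the example numbers are assumptions and are not taken from the patent.

```python
def derive_bracket_exposures(reference_exposure: float, target_exposure_ratio: float):
    """Per the last steps of claim 1: the first (long) exposure equals the
    reference exposure, and the second (short) exposure is the ratio between
    the reference exposure and the target exposure ratio."""
    first_exposure = reference_exposure
    second_exposure = reference_exposure / target_exposure_ratio
    return first_exposure, second_exposure


# Made-up example: a 33 ms reference exposure and a 4:1 target exposure ratio
# yield a 33 ms long exposure and an 8.25 ms short exposure, which the camera
# then uses to capture the two images from which the dynamic range parameter
# value is determined.
long_exp, short_exp = derive_bracket_exposures(33.0, 4.0)
print(long_exp, short_exp)  # 33.0 8.25
```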
2. The method according to claim 1, wherein the image corresponding to the first exposure is a first image, and the image corresponding to the second exposure is a second image;
the determining the dynamic range parameter value according to the images acquired by the camera based on the at least two exposures and the at least two exposures comprises:
acquiring a pixel value of a first target position in the first image, wherein the first target position is a position where the pixel value in the second image is minimum and the pixel value in the first image is minimum;
acquiring a pixel value of a second target position in the second image, wherein the second target position is a position where the pixel value in the first image is smaller than a preset pixel threshold and the pixel value in the second image is maximum;
and calculating the dynamic range parameter value according to the pixel value of the first target position, the pixel value of the second target position, the first exposure and the second exposure.
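As a non-authoritative sketch of claim 2, the following code locates the two target positions when the first (long-exposure) and second (short-exposure) captures are available as 2-D grayscale NumPy arrays; the threshold of 250, the grayscale assumption, and the simplification of the first-position condition to "darkest pixel of the second image" are illustrative choices, not the patent's definition.

```python
import numpy as np


def find_target_positions(first_image: np.ndarray, second_image: np.ndarray,
                          preset_pixel_threshold: int = 250):
    """Illustrative reading of claim 2.

    First target position: the location where the second (short-exposure)
    image is darkest.  Second target position: among locations whose value in
    the first (long-exposure) image stays below the preset pixel threshold,
    the one that is brightest in the second image.
    """
    first_pos = np.unravel_index(np.argmin(second_image), second_image.shape)

    unsaturated = first_image < preset_pixel_threshold
    # Work in int32 so the -1 fill value for saturated locations is preserved.
    candidates = np.where(unsaturated, second_image.astype(np.int32), -1)
    second_pos = np.unravel_index(np.argmax(candidates), second_image.shape)

    return first_pos, second_pos
```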
3. The method of claim 2, wherein the calculating the dynamic range parameter value according to the pixel value of the first target position, the pixel value of the second target position, the first exposure, and the second exposure comprises:
calculating an initial dynamic range parameter value based on the pixel value of the first target position, the pixel value of the second target position, the first exposure, and the second exposure;
determining the dynamic range parameter value based on the initial dynamic range parameter value if the pixel value of the second target position is equal to the preset pixel threshold, the dynamic range parameter value being greater than or equal to the initial dynamic range parameter value;
calculating the initial dynamic range parameter value by one of the following formulas:
[Formula image FDA0002826493000000021]
or
[Formula image FDA0002826493000000022]
wherein ShortMaxPixelValue is the pixel value of the second target position, ShortMinPixelValue is the minimum pixel value in the second image, LongExposure is the first exposure, and ShortExposure is the second exposure; if ShortMinPixelValue is equal to zero, the initial dynamic range parameter is calculated by the formula
[Formula image FDA0002826493000000023]
and if ShortMinPixelValue is greater than zero, the initial dynamic range parameter is calculated by the formula
[Formula image FDA0002826493000000024].
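The two formulas of claim 3 appear only as images in the filing, so their exact expressions cannot be reproduced here. The sketch below is a plausible reconstruction based solely on the variable names and the zero/non-zero branching described above, and should not be read as the patented formula: it scales the brightest useful short-image pixel by the exposure ratio and, when the darkest short-image pixel is non-zero, normalises by it.

```python
def initial_dynamic_range(short_max_pixel_value: float, short_min_pixel_value: float,
                          long_exposure: float, short_exposure: float) -> float:
    """Guessed reconstruction of the two cases of claim 3 (the real formulas
    are image-only in the filing)."""
    exposure_ratio = long_exposure / short_exposure
    if short_min_pixel_value == 0:
        return short_max_pixel_value * exposure_ratio
    return (short_max_pixel_value * exposure_ratio) / short_min_pixel_value
```

Claims 3 and 4 then treat this initial value as a lower bound when the pixel value of the second target position equals the preset pixel threshold, and use it directly when that pixel value is below the threshold.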
4. The method of claim 3, wherein the calculating the dynamic range parameter value according to the pixel value of the first target position, the pixel value of the second target position, the first exposure, and the second exposure further comprises:
determining the initial dynamic range parameter value as the dynamic range parameter value if the pixel value of the second target position is less than the preset pixel threshold value.
5. The method of claim 1, wherein the determining a target exposure amount according to the reference exposure amount and the dynamic range parameter value comprises:
determining N exposure adjustment amounts based on the dynamic range parameter value, wherein N is a positive integer;
adding the N exposure adjustment amounts to the reference exposure amount respectively to obtain N first target exposure amounts;
determining the target exposure amount based on the N first target exposure amounts, the target exposure amount comprising the N first target exposure amounts.
6. The method of claim 5, wherein determining N exposure adjustment amounts based on the dynamic range parameter value comprises:
determining a first exposure adjustment amount under the condition that the dynamic range parameter value is smaller than or equal to a preset value;
and determining at least two second exposure adjustment amounts in the case that the dynamic range parameter value is larger than the preset value, wherein the absolute value of each second exposure adjustment amount is larger than the absolute value of the first exposure adjustment amount, and the target exposure amount further comprises the reference exposure amount.
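The branching of claims 5 and 6 can be illustrated as follows; the preset value and the adjustment magnitudes are made-up numbers, and only the comparison against the preset value, the additive combination with the reference exposure, and the inclusion of the reference exposure in the second branch come from the claims.

```python
def choose_target_exposures(reference_exposure: float, dr_value: float,
                            preset_value: float = 2.0,
                            first_adjustment: float = 0.2,
                            second_adjustments=(-0.6, 0.6)):
    """Illustrative combination of claims 5 and 6 (constants are assumptions)."""
    if dr_value <= preset_value:
        # Low dynamic range: a single, small exposure adjustment.
        adjustments = [first_adjustment]
        targets = [reference_exposure + a for a in adjustments]
    else:
        # High dynamic range: at least two larger adjustments, and the
        # reference exposure itself is kept among the target exposures.
        adjustments = list(second_adjustments)
        targets = [reference_exposure + a for a in adjustments]
        targets.append(reference_exposure)
    return targets
```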
7. A terminal device having a camera, comprising:
the calculation module is used for calculating the reference exposure amount based on the preview image acquired by the camera;
a dynamic range parameter value acquisition module for acquiring a dynamic range parameter value, wherein the dynamic range parameter value is determined based on at least two exposures determined according to the reference exposure amount and on images acquired by the camera based on the at least two exposures;
the target exposure amount determining module is used for determining the target exposure amount according to the reference exposure amount and the dynamic range parameter value;
the target image acquisition module is used for acquiring a target image shot by the camera based on the target exposure;
the terminal device further includes:
the exposure ratio acquisition module is used for acquiring a target exposure ratio;
an exposure amount determination module for determining the at least two exposure amounts based on the reference exposure amount and the target exposure ratio;
the image acquisition module is used for acquiring images acquired by the camera based on the at least two exposure quantities;
the parameter value determining module is used for determining the dynamic range parameter value according to the images acquired by the camera based on the at least two exposures and the at least two exposures;
the at least two exposures include a first exposure and a second exposure, the first exposure being greater than the second exposure;
the exposure amount determination module includes:
a first exposure amount determination module for determining the reference exposure amount as the first exposure amount;
a second exposure amount determination module for determining a ratio of the reference exposure amount to the target exposure ratio as the second exposure amount.
8. The terminal device according to claim 7, wherein the image corresponding to the first exposure is a first image, and the image corresponding to the second exposure is a second image;
the parameter value determination module comprises:
a first pixel value obtaining module, configured to obtain a pixel value of a first target position in the first image, where the first target position is a position where a pixel value in the second image is minimum and a pixel value in the first image is minimum;
a second pixel value obtaining module, configured to obtain a pixel value of a second target position in the second image, where the second target position is a position where the pixel value in the first image is smaller than a preset pixel threshold and the pixel value in the second image is the largest;
and the dynamic range parameter value calculating module is used for calculating the dynamic range parameter value according to the pixel value of the first target position, the pixel value of the second target position, the first exposure and the second exposure.
9. The terminal device of claim 8, wherein the dynamic range parameter value calculating module comprises:
an initial parameter value calculation module, configured to calculate an initial dynamic range parameter value based on the pixel value of the first target location, the pixel value of the second target location, the first exposure amount, and the second exposure amount;
a dynamic range parameter value determining module, configured to determine the dynamic range parameter value based on the initial dynamic range parameter value when the pixel value of the second target position is equal to the preset pixel threshold, where the dynamic range parameter value is greater than or equal to the initial dynamic range parameter value;
calculating the initial dynamic range parameter value by one of the following formulas:
[Formula image FDA0002826493000000041]
or
[Formula image FDA0002826493000000042]
wherein ShortMaxPixelValue is the pixel value of the second target position, ShortMinPixelValue is the minimum pixel value in the second image, LongExposure is the first exposure, and ShortExposure is the second exposure; if ShortMinPixelValue is equal to zero, the initial dynamic range parameter is calculated by the formula
[Formula image FDA0002826493000000043]
and if ShortMinPixelValue is greater than zero, the initial dynamic range parameter is calculated by the formula
[Formula image FDA0002826493000000051].
10. The terminal device according to claim 9, wherein the dynamic range parameter value determining module is further configured to determine the initial dynamic range parameter value as the dynamic range parameter value if the pixel value of the second target position is smaller than the preset pixel threshold.
11. The terminal device according to claim 7, wherein the target exposure amount determination module includes:
an adjustment amount determining module, configured to determine N exposure adjustment amounts based on the dynamic range parameter value, where N is a positive integer;
the exposure processing module is used for adding the N exposure adjustment amounts to the reference exposure amount respectively to obtain N first target exposure amounts;
a target exposure amount acquisition module for determining the target exposure amount based on the N first target exposure amounts, the target exposure amount including the N first target exposure amounts.
12. The terminal device of claim 11, wherein the adjustment amount determining module is configured to,
determining a first exposure adjustment amount under the condition that the dynamic range parameter value is smaller than or equal to a preset value;
and determine at least two second exposure adjustment amounts in the case that the dynamic range parameter value is larger than the preset value, wherein the absolute value of each second exposure adjustment amount is larger than the absolute value of the first exposure adjustment amount, and the target exposure amount further comprises the reference exposure amount.
13. A mobile terminal, comprising: a memory storing a computer program, and a processor which, when executing the computer program, implements the steps of the photographing method according to any one of claims 1 to 6.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps in the photographing method according to any one of claims 1 to 6.
CN201910472318.2A 2019-05-31 2019-05-31 Photographing method, terminal equipment and computer readable storage medium Active CN110213484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910472318.2A CN110213484B (en) 2019-05-31 2019-05-31 Photographing method, terminal equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910472318.2A CN110213484B (en) 2019-05-31 2019-05-31 Photographing method, terminal equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110213484A CN110213484A (en) 2019-09-06
CN110213484B true CN110213484B (en) 2021-03-12

Family

ID=67790210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910472318.2A Active CN110213484B (en) 2019-05-31 2019-05-31 Photographing method, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110213484B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111131693B (en) * 2019-11-07 2021-07-30 深圳市艾为智能有限公司 Face image enhancement method based on multi-exposure face detection
CN113132644B (en) * 2019-12-31 2024-03-19 深圳市海思半导体有限公司 Method and equipment for generating high dynamic range image
CN113824892B (en) * 2020-06-19 2023-11-07 浙江宇视科技有限公司 Image acquisition method, device, equipment and storage medium
CN114785966B (en) * 2022-06-21 2022-09-06 深圳比特微电子科技有限公司 Exposure control method, shooting processing method, device and medium
CN116723419B (en) * 2023-07-03 2024-03-22 北京拙河科技有限公司 Acquisition speed optimization method and device for billion-level high-precision camera


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101604068B1 (en) * 2009-09-22 2016-03-17 삼성전자주식회사 High dynamic range imaging apparatus and method
JP6445831B2 (en) * 2014-10-16 2018-12-26 キヤノン株式会社 Imaging apparatus, control method thereof, and program
CN105100637A (en) * 2015-08-31 2015-11-25 联想(北京)有限公司 Image processing method and electronic equipment
CN108833802B (en) * 2018-09-18 2020-07-31 Oppo广东移动通信有限公司 Exposure control method and device and electronic equipment
CN108900785A (en) * 2018-09-18 2018-11-27 Oppo广东移动通信有限公司 Exposal control method, device and electronic equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108833804A (en) * 2018-09-20 2018-11-16 Oppo广东移动通信有限公司 Imaging method, device and electronic equipment

Also Published As

Publication number Publication date
CN110213484A (en) 2019-09-06

Similar Documents

Publication Publication Date Title
CN110213484B (en) Photographing method, terminal equipment and computer readable storage medium
CN107613191B (en) Photographing method, photographing equipment and computer readable storage medium
CN108307109B (en) High dynamic range image preview method and terminal equipment
CN107592466B (en) Photographing method and mobile terminal
CN109688322B (en) Method and device for generating high dynamic range image and mobile terminal
CN107566730B (en) A kind of panoramic picture image pickup method and mobile terminal
CN107623818B (en) Image exposure method and mobile terminal
CN109218626B (en) Photographing method and terminal
CN108234894B (en) Exposure adjusting method and terminal equipment
CN108449541B (en) Panoramic image shooting method and mobile terminal
CN107948505B (en) Panoramic shooting method and mobile terminal
CN109462745B (en) White balance processing method and mobile terminal
CN108924414B (en) Shooting method and terminal equipment
CN107730460B (en) Image processing method and mobile terminal
CN108280817B (en) Image processing method and mobile terminal
CN111601032A (en) Shooting method and device and electronic equipment
CN109474784B (en) Preview image processing method and terminal equipment
CN111083386B (en) Image processing method and electronic device
CN108307123B (en) Exposure adjusting method and mobile terminal
CN110708475B (en) Exposure parameter determination method, electronic equipment and storage medium
CN110363729B (en) Image processing method, terminal equipment and computer readable storage medium
CN111131722A (en) Image processing method, electronic device, and medium
CN109167917B (en) Image processing method and terminal equipment
CN110913133B (en) Shooting method and electronic equipment
CN111416948A (en) Image processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant