CN108337445B - Photographing method, related device and computer storage medium


Info

Publication number
CN108337445B
CN108337445B
Authority
CN
China
Prior art keywords
image
camera
exposure
preview image
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810257073.7A
Other languages
Chinese (zh)
Other versions
CN108337445A (en)
Inventor
戴俊
李健
张熙
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201810257073.7A
Publication of CN108337445A
Application granted
Publication of CN108337445B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the invention disclose a photographing method, a related device, and a computer storage medium. The method is applied to a terminal device provided with at least two cameras and comprises the following steps: acquiring a preview image through a first camera; when the preview image has overexposure, determining a first exposure parameter through a second camera, such that the image acquired by the second camera under the first exposure parameter has no overexposure; acquiring at least two frames of images through the first camera, where the exposure parameters of the first camera corresponding to the at least two frames of images are different from one another and the corresponding exposure amounts are greater than or equal to the exposure amount corresponding to the first exposure parameter; and synthesizing the at least two acquired frames of images into a high-dynamic-range (HDR) image. By adopting the embodiments of the invention, the problems of conventional HDR photographing methods, such as poor image quality, can be solved, and the quality of the photographed image thereby improved.

Description

Photographing method, related device and computer storage medium
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a photographing method, a related device, and a computer storage medium.
Background
With the development of the terminal industry, photographing has become an indispensable function of the terminal. Since the dynamic range that an image sensor in the terminal can support and detect is far smaller than the dynamic range of a real scene, how to shoot a high-dynamic-range (HDR) image has always been a difficult problem in photographing technology.
The prior art provides two HDR photographing methods. One is to continuously acquire multiple frames of images with a single camera and then synthesize them into a final photographed image by an algorithm. Specifically, the photographing parameters (such as exposure parameters) used for the multiple frames are obtained by empirically analyzing the preview image after it is acquired by the single camera. In practice, it is found that this HDR photographing method relies entirely on the subjective experience of the user, and it is difficult to obtain a high-quality HDR image.
The other method is to simultaneously acquire images under different exposure parameters through two cameras and then synthesize the acquired images to obtain a final shot image. There are specifically two implementations:
in the first mode, the two cameras respectively meter the foreground region and the background region of the scene, and their respective exposure amounts are adjusted so that one camera acquires an image suitable for the foreground exposure and the other an image suitable for the background exposure; the images are finally synthesized into a photographed image by an algorithm. However, if an image is acquired with an exposure suitable for the foreground, the background region of that image is overexposed (i.e., obviously too bright); if an image is acquired with an exposure suitable for the background, the foreground region of that image is underexposed to dead black (i.e., obviously too dark). The quality of the images acquired by the two cameras is therefore poor, and they are not suitable for use as preview images, which degrades the user experience.
In the second mode, the exposure parameters (i.e., exposure amounts) of the two cameras are adjusted only at the moment of photographing; images under different exposure parameters are acquired by the two cameras and synthesized into a final photographed image by an algorithm. This scheme does not change the exposure of the preview image, i.e., the exposure parameters used for the preview image remain fixed, and the final photographed image is obtained with a single adjustment of the exposure parameters at photographing time. Because the preview images acquired by the two cameras may contain overexposed or dead-black regions, metering at photographing time is inaccurate; with only one, possibly inaccurate, adjustment of the exposure parameters, the overexposed or underexposed regions cannot be fully recovered in the finally synthesized image, and the quality of the photographed image is poor.
Disclosure of Invention
The embodiment of the invention discloses a photographing method, related equipment and a computer storage medium, which can solve the problems of poor image quality and the like in the conventional HDR photographing method.
In a first aspect, an embodiment of the present invention discloses a photographing method, where the method includes:
acquiring a preview image through a first camera;
determining a first exposure parameter through the second camera under the condition that the preview image has overexposure; the image acquired by the second camera corresponding to the first exposure parameter has no overexposure;
acquiring at least two frames of images through the first camera; the exposure parameters of the first camera corresponding to the at least two frames of images are different, and the exposure amounts corresponding to those exposure parameters are greater than or equal to the exposure amount corresponding to the first exposure parameter;
and synthesizing the at least two acquired frames of images into a High Dynamic Range (HDR) image.
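The four steps above can be illustrated with a small self-contained simulation. The sketch below is not the patent's implementation: `capture` stands in for a real camera by scaling a linear scene radiance map and clipping to an 8-bit range, the exposure-halving search and the three-frame geometric bracket are illustrative assumptions, and the merge simply averages each pixel over the frames in which it is not saturated.

```python
import numpy as np

def capture(scene, exposure):
    """Simulated camera: scale linear radiance by the exposure, clip to 8 bits."""
    return np.clip(scene * exposure, 0.0, 255.0)

def is_overexposed(img, ratio=0.05):
    """Overexposed if too large a fraction of pixels is saturated (assumed test)."""
    return np.mean(img >= 255.0) >= ratio

def capture_hdr(scene, normal_exposure):
    preview = capture(scene, normal_exposure)            # step 1: preview image
    if not is_overexposed(preview):
        return preview
    e = normal_exposure                                  # step 2: "second camera"
    while is_overexposed(capture(scene, e)):             # halve until no overexposure
        e *= 0.5
    first_exposure = e
    # step 3: "first camera" brackets from the minimum (first) exposure up to normal
    exposures = np.geomspace(first_exposure, normal_exposure, num=3)
    frames = [capture(scene, x) for x in exposures]
    # step 4: merge; each pixel trusts only the frames where it is unsaturated
    num = np.zeros_like(preview)
    den = np.zeros_like(preview)
    for f, x in zip(frames, exposures):
        w = (f < 255.0).astype(float)
        num += w * (f / x)                               # back to radiance units
        den += w
    return num / np.maximum(den, 1.0)
```

On a scene whose bright half saturates the simulated sensor, the sketch recovers the full radiance range instead of returning the clipped preview.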
By implementing the embodiment of the invention, the problems of existing HDR photographing methods, such as poor image quality and camera images that are unsuitable for use as preview images, can be solved.
In some possible embodiments, the preview image having overexposure includes:
the proportion of the first highlight area in the preview image is larger than or equal to a first preset proportion, or the average value of the pixel brightness of the second highlight area in the preview image is larger than or equal to a first brightness threshold value;
the preview image being absent of overexposure comprises:
the proportion of a first highlight area in the preview image is smaller than the first preset proportion, or the average value of the pixel brightness of a second highlight area in the preview image is smaller than the first brightness threshold, or the exposure amount of the first camera reaches a preset lower limit value;
the first highlight area is the pixel area of the highest N gray scales in the preview image, where N is a positive integer; the second highlight area is the area of the M pixels with the highest brightness in the preview image, where M is a positive integer. Optionally, the second highlight area may also be the area corresponding to the top m% of pixels with the highest brightness in the preview image, where m is a user-defined constant, for example, 1.5.
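The two overexposure tests above can be sketched as follows. All threshold values (`n_levels`, `ratio_thresh`, `top_percent`, `lum_thresh`) are illustrative assumptions, not values taken from the patent; `gray` is an 8-bit luminance image.

```python
import numpy as np

def has_overexposure(gray, n_levels=3, ratio_thresh=0.10,
                     top_percent=1.5, lum_thresh=250.0):
    flat = gray.ravel()
    # criterion 1: the "first highlight area" (pixels in the highest N gray
    # scales) occupies at least a preset proportion of the image
    crit1 = np.mean(flat >= 256 - n_levels) >= ratio_thresh
    # criterion 2: the "second highlight area" (the top m% brightest pixels)
    # has an average brightness at or above a brightness threshold
    k = max(1, int(flat.size * top_percent / 100.0))
    crit2 = np.sort(flat)[-k:].mean() >= lum_thresh
    return bool(crit1 or crit2)
```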
In some possible embodiments, the first exposure parameter is determined by the second camera only if overexposure is present in a first image acquired by the second camera. Understandably, to ensure the accuracy and reliability of the photographing processing, the second camera first acquires the first image, and the first exposure parameter of the second camera is determined only when overexposure still exists in that first image.
In some possible embodiments, the exposure parameters corresponding to the at least two frames of images are between the first exposure parameter and a normal exposure parameter; and the average value of the pixel brightness in the image corresponding to the normal exposure parameter is equal to a preset brightness threshold value.
Specifically, in a photographing mode, after the exposure parameters are determined by the second camera, the first camera can be used to acquire a series of frame images (i.e., at least two frame images) from the first exposure parameters to the normal exposure parameters, so that the final HDR image can be synthesized by using a setting algorithm.
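As a sketch, the series of exposures running from the first (minimum, non-overexposing) exposure up to the normal exposure could be generated as below; the geometric spacing and the default frame count are assumptions, since the text only requires values between the two endpoints.

```python
import numpy as np

def bracket_exposures(first_exposure, normal_exposure, num_frames=3):
    """Exposure values for the at-least-two bracketed frames (assumed spacing)."""
    assert 0 < first_exposure <= normal_exposure
    return list(np.geomspace(first_exposure, normal_exposure, num=num_frames))
```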
In some possible embodiments, the method further comprises:
determining a second exposure parameter through the second camera under the condition that the preview image has underexposure; and the image acquired by the second camera corresponding to the second exposure parameter has no underexposure.
In some possible embodiments, the preview image having underexposure includes:
the ratio of the first dim light area in the preview image is greater than or equal to a second preset ratio, or the average value of the pixel brightness of the second dim light area in the preview image is smaller than a second brightness threshold value;
the absence of underexposure of the preview image comprises:
the ratio of the first dim light area in the preview image is smaller than the second preset ratio, or the average value of the pixel brightness of the second dim light area in the preview image is larger than or equal to the second brightness threshold, or the exposure amount of the first camera reaches a preset upper limit value;
the first dim light area is a pixel area of the lowest Q gray scales in the preview image, and Q is a positive integer; the second dim light area is an area of P pixels with the lowest brightness in the preview image, and P is a positive integer. Optionally, the second dim area may also be an area corresponding to a pixel with the lowest brightness of p% in the preview image, where p is a custom constant.
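The symmetric underexposure tests can be sketched with a gray-level histogram; again the thresholds are illustrative assumptions and `gray` is an 8-bit luminance image.

```python
import numpy as np

def has_underexposure(gray, q_levels=3, ratio_thresh=0.10,
                      bottom_percent=1.5, lum_thresh=5.0):
    flat = gray.ravel()
    hist = np.bincount(flat, minlength=256)
    # criterion 1: the "first dim light area" (pixels in the lowest Q gray
    # scales) occupies at least a preset proportion of the image
    crit1 = hist[:q_levels].sum() / flat.size >= ratio_thresh
    # criterion 2: the "second dim light area" (the darkest p% of pixels)
    # has an average brightness below a brightness threshold
    k = max(1, int(flat.size * bottom_percent / 100.0))
    crit2 = np.sort(flat)[:k].mean() < lum_thresh
    return bool(crit1 or crit2)
```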
In some possible embodiments, the second exposure parameter is determined by the second camera only if underexposure is present in the first image acquired by the second camera. Understandably, to ensure the accuracy and reliability of the photographing processing, the second camera first acquires the first image, and the second exposure parameter is determined only when underexposure still exists in that first image.
In some possible embodiments, the exposure parameter corresponding to the at least two frames of images is between the first exposure parameter and the second exposure parameter.
Specifically, in another photographing mode, after the exposure parameters are determined by the second camera, the first camera may be used to capture a series of frame images (i.e., at least two frame images) from the first exposure parameters to the second exposure parameters, so that the final HDR image is synthesized by using a setting algorithm.
In some possible embodiments, the first image is an image obtained by the second camera under normal exposure parameters, and the preview image is an image obtained by the first camera under the normal exposure parameters, and the normal exposure parameters are such that an average value of pixel brightness in the preview image (or the first image) is equal to a preset brightness threshold.
In some possible embodiments, the first image is an image obtained by image transformation, and the spatial characteristics of the image content of the first image and the image content of the preview image are consistent. Image transformations herein include, but are not limited to, image translation, rotation, and projection correction, among others. The spatial features include, but are not limited to, any one or combination of more of the following: color features, texture features, shape features, and spatial relationship features, among others.
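One minimal way to estimate or enforce the spatial consistency described above is phase correlation, which recovers the translation between the two camera views. This is an illustrative stand-in only: it assumes a pure cyclic translation, whereas the text also allows rotation and projection correction.

```python
import numpy as np

def estimate_shift(ref, moving):
    """Phase correlation: the (dy, dx) cyclic shift that aligns `moving`
    to `ref` (pure-translation assumption)."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:                          # map peak to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Applying `np.roll(moving, (dy, dx), axis=(0, 1))` with the returned shift reproduces `ref` when the assumption holds.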
In a second aspect, the present invention discloses a terminal device, including a first camera, a second camera, and a processing unit, wherein:
the processing unit is used for acquiring a preview image through the first camera;
the processing unit is further configured to determine a first exposure parameter through the second camera under the condition that the preview image has overexposure; the image acquired by the second camera corresponding to the first exposure parameter has no overexposure;
the processing unit is further used for acquiring at least two frames of images through the first camera; the exposure parameters of the first camera corresponding to the at least two frames of images are different, and the exposure amounts corresponding to those exposure parameters are greater than or equal to the exposure amount corresponding to the first exposure parameter;
the processing unit is further configured to synthesize the at least two acquired frames of images into a high dynamic range HDR image.
In some possible embodiments, the preview image having overexposure includes:
the proportion of the first highlight area in the preview image is larger than or equal to a first preset proportion, or the average value of the pixel brightness of the second highlight area in the preview image is larger than or equal to a first brightness threshold value;
the preview image being absent of overexposure comprises:
the proportion of a first highlight area in the preview image is smaller than the first preset proportion, or the average value of the pixel brightness of a second highlight area in the preview image is smaller than the first brightness threshold, or the exposure amount of the first camera reaches a preset lower limit value;
the first highlight area is a pixel area of the highest N gray scales in the preview image, and N is a positive integer; the second highlight area is an area of the first M pixels with the highest brightness in the preview image, and M is a positive integer.
In some possible embodiments, the first exposure parameter is determined by the second camera only if overexposure is present in the first image acquired by the second camera. Understandably, to ensure the reliability and accuracy of the photographing processing, the first exposure parameter is determined by the second camera only when overexposure still exists in the first image obtained by the second camera.
In some possible embodiments, the exposure parameters corresponding to the at least two frames of images are between the first exposure parameter and a normal exposure parameter; and the average value of the pixel brightness in the image corresponding to the normal exposure parameter is equal to a preset brightness threshold value.
In some possible embodiments,
the processing unit is further configured to determine a second exposure parameter through the second camera if the preview image has underexposure; and the image acquired by the second camera corresponding to the second exposure parameter has no underexposure.
In some possible embodiments, the preview image having underexposure includes:
the ratio of the first dim light area in the preview image is greater than or equal to a second preset ratio, or the average value of the pixel brightness of the second dim light area in the preview image is smaller than a second brightness threshold value;
the absence of underexposure of the preview image comprises:
the ratio of the first dim light area in the preview image is smaller than the second preset ratio, or the average value of the pixel brightness of the second dim light area in the preview image is larger than or equal to the second brightness threshold, or the exposure amount of the first camera reaches a preset upper limit value;
the first dim light area is a pixel area of the lowest Q gray scales in the preview image, and Q is a positive integer; the second dim light area is an area of P pixels with the lowest brightness in the preview image, and P is a positive integer. Optionally, the second dim area may also be an area corresponding to a pixel with the lowest brightness of p% in the preview image, where p is a custom constant.
In some possible embodiments, the determining of the second exposure parameter by the second camera is performed only if there is an underexposure in the first image acquired by the second camera.
In some possible embodiments, the exposure parameter corresponding to the at least two frames of images is between the first exposure parameter and the second exposure parameter.
In some possible embodiments, the first image is an image obtained by the second camera under normal exposure parameters, and the preview image is an image obtained by the first camera under the normal exposure parameters, and the normal exposure parameters are such that an average value of pixel brightness in the preview image (or the first image) is equal to a preset brightness threshold.
In some possible embodiments, the first image is an image obtained by image transformation, and the spatial characteristics of the image content of the first image and the image content of the preview image are consistent.
For the embodiments of the present invention, reference may be made to the related explanations in the embodiments of the first aspect, and details are not described here.
In a third aspect, an embodiment of the present invention provides another terminal device, including at least two cameras including a first camera and a second camera, further including a memory and a processor coupled to the memory and the at least two cameras; the memory is used for storing instructions, and the processor is used for executing the instructions and communicating with the first camera and the second camera; wherein the processor executes the instructions to perform the method described in the first aspect.
In some possible embodiments, the terminal device further includes a display coupled to the processor, and the display is configured to display, under control of the processor, an image acquired by the first camera or the second camera, or an HDR image obtained by the processor when executing the method.
In some possible embodiments, the terminal device further includes a communication interface, which is in communication with the processor, and the communication interface is used for communicating with other devices (such as the first camera, the second camera, the display screen, the network device, and the like) under the control of the processor.
In a fourth aspect, a computer-readable storage medium is provided, in which program code for photographing processing is stored. The program code comprises instructions for performing the method described in the first aspect above.
By implementing the embodiment of the invention, the problems of poor image quality and the like in the conventional HDR photographing method can be solved, so that the quality of the photographed image can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below.
Fig. 1 is a schematic diagram of an image according to an embodiment of the present invention.
Fig. 2 is a schematic flowchart of a photographing method according to an embodiment of the present invention.
Fig. 3 is a schematic flowchart of another photographing method according to an embodiment of the present invention.
Fig. 4A-4B are schematic diagrams of two preview images provided by embodiments of the present invention.
Fig. 5 is a schematic diagram of a distribution of pixel luminance according to an embodiment of the present invention.
Fig. 6 is a schematic view of a photographing scene according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Fig. 8 is a schematic structural diagram of another terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the accompanying drawings of the present invention.
The applicant finds in the course of the present application that there are three types of existing HDR image photographing methods:
firstly, multiple frames of images are continuously collected by a single camera and then synthesized into a final photographed image. The exposure parameters used for the respective frames are obtained by the user's empirical analysis, specifically by empirically analyzing the preview image while the single camera is framing. It can be seen that such an HDR photographing method depends entirely on human experience, and it is difficult to obtain a high-quality HDR image.
And secondly, images with different exposure parameters are acquired by two cameras and then synthesized into a final photographed image. In this scheme, metering is performed on the preview images acquired by the two cameras, and the exposure amounts (exposure parameters) of the two cameras are adjusted so that the two cameras respectively acquire an image suitable for the foreground exposure and an image suitable for the background exposure, after which the images are synthesized. In practice, however, the images acquired by the two cameras suffer from an obviously too-bright background region or an obviously too-dark foreground region; their quality is poor, so they are unsuitable for display to the user as preview images, which degrades the user experience.
And thirdly, images with different exposure parameters are acquired by two cameras and then synthesized into a final photographed image. In this scheme the exposure parameters of the preview image are not changed; the exposure parameters of the two cameras are adjusted only at the moment of photographing. Because the preview images acquired by the two cameras may contain overexposed or dead-black regions, metering during subsequent photographing is inaccurate. Since the final image is synthesized from frames acquired after a single adjustment of the exposure parameters, the photographed image easily retains overexposed or underexposed regions that cannot be fully recovered, and its quality is still poor.
In order to solve the above problems, the present application provides a photographing method and a terminal device to which the method is applied. The following first presents several concepts related to the present application.
The foreground region refers to a person or an object located at or near the front of the image, and may also be called the foreground. Correspondingly, the background region refers to the environment that accompanies the subject in the captured image, and may be called the background. Specifically, fig. 1 shows a portrait image in which the black area is the portrait portion, which may serve as the foreground region of the image; correspondingly, the remaining white area is the background region.
High dynamic range (HDR) means that the ratio between the highest and lowest values of a signal is greater than or equal to a preset threshold. For example, in a real scene the ratio between the brightness of the brightest object and the brightness of the darkest object can reach about 10^8, far beyond the range of about 10^5 recognizable by the human eye. For an HDR image this can be understood as follows: the image contains a brightest area and/or a darkest area whose pixel brightness reaches a level that the human eye cannot distinguish. Alternatively, when the HDR image includes both a brightest area and a darkest area, it may mean that the brightness ratio between the brightest and darkest areas of the image exceeds a preset threshold. Compared with an ordinary image, an HDR image provides a larger dynamic range and more image detail: from low-dynamic-range (LDR) images taken at different exposure times, the best LDR image for each exposure time is used to synthesize the final HDR image, which better reflects the visual effect of the real environment.
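The synthesis of differently exposed LDR frames mentioned above can be sketched as a simple exposure-fusion style merge, in which each frame contributes to each pixel according to its well-exposedness (closeness to mid-gray). This is an illustrative simplification in the spirit of exposure fusion, not the patent's synthesis algorithm; the Gaussian weighting and `sigma` are assumptions, and inputs are float images in [0, 1].

```python
import numpy as np

def fuse_ldr(frames, sigma=0.2):
    frames = [np.asarray(f, dtype=float) for f in frames]
    # weight each pixel of each frame by its closeness to mid-gray (0.5)
    weights = [np.exp(-((f - 0.5) ** 2) / (2.0 * sigma ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-12
    return sum(w * f for w, f in zip(weights, frames)) / total
```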
The highlight region may be a region corresponding to M pixels having the highest brightness in the image, or may be a pixel region having the highest N gradations in the image. And M and N can be positive integers which are self-defined and set by a user side or a system side. It is understood that each image is composed of k pixels, each pixel corresponds to a respective luminance value, hereinafter referred to as pixel luminance, and k is a positive integer. The gray scale herein may refer to the brightness level (i.e. brightness value level) of the pixels in the image, and may also refer to the gray scale level (i.e. gray scale value level) of the pixels in the image.
The dark light region may be a region corresponding to P pixels having the lowest brightness in the image, or may be a pixel region having the lowest Q gradations in the image. And P and Q are positive integers which are self-defined and set by a user side or a system side. The details of how to define the highlight region or the dim region by using the gray scale will be described in detail below, and will not be described herein.
Taking a typical HDR scene (shooting a person against the light) as an example, if metering is performed according to the brightness of the person and the camera's exposure parameters are set accordingly, the background region, which is a highlight region far exceeding the maximum response range of the sensor, will be overexposed (i.e., washed out, with all pixels white). Conversely, the backlit foreground region may not reach the minimum sensing intensity of the sensor and cannot produce an effective signal value, so underexposure occurs there (i.e., the region is crushed to black).
Based on the foregoing embodiments, the photographing method according to the present application will be described below. The method is applied to terminal equipment, and the terminal equipment is at least provided with a plurality of cameras including a first camera and a second camera. Fig. 2 is a schematic flow chart of a photographing method according to an embodiment of the present invention. The method as shown in fig. 2 comprises the following implementation steps:
and step S101, acquiring a preview image through a first camera.
Specifically, the preview image may be acquired directly by the first camera after the device is turned on, or may be an image acquired after the exposure parameters of the first camera are adjusted. For example, after the first camera acquires an initial image, the exposure parameters of the first camera may be adjusted according to the initial image, so that the average pixel brightness of a subsequent image (e.g., the preview image) acquired by the first camera reaches (i.e., equals) a preset brightness threshold. Details of how to adjust the exposure parameters of the first camera are described below and are not repeated here.
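The exposure adjustment described above, driving the average pixel brightness of subsequent images toward a preset brightness threshold, can be sketched as a simple feedback loop. The multiplicative update rule and the simulated capture by clipping are assumptions, not details from the patent.

```python
import numpy as np

def auto_expose(scene, target=128.0, exposure=1.0, iters=20):
    """Adjust exposure until the mean brightness of the captured image
    reaches the preset threshold (simulated camera, assumed update rule)."""
    img = np.clip(scene * exposure, 0.0, 255.0)
    for _ in range(iters):
        mean = img.mean()
        if abs(mean - target) < 1.0:
            break
        exposure *= target / max(mean, 1e-6)   # brighten if too dark, dim if too bright
        img = np.clip(scene * exposure, 0.0, 255.0)
    return exposure, img
```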
Step S102, determining a first exposure parameter through a second camera under the condition that the preview image has overexposure; and the image acquired by the second camera corresponding to the first exposure parameter has no overexposure.
Step S103, acquiring at least two frames of images through the first camera; the exposure parameters of the first camera corresponding to the at least two frames of images are different, and the exposure amounts corresponding to those exposure parameters are greater than or equal to the exposure amount corresponding to the first exposure parameter.
And step S104, synthesizing the at least two collected frames of images into a high dynamic range HDR image.
In the present application, it is recognized that the quality of the preview image acquired by the first camera in an HDR scene (such as backlit shooting or super night shooting) is poor, and overexposure or underexposure of the image is likely to occur. To obtain a high-dynamic-range HDR image, the present application proposes to use the second camera to adjust and determine appropriate exposure parameters, and then to acquire the corresponding frame images with the first camera according to the determined exposure parameters, so as to synthesize the final HDR image.
Specifically, in step S102, when the preview image has an overexposure, a first exposure parameter is determined by a second camera, so that the image acquired by the second camera under the first exposure parameter has no overexposure. Understandably, due to overexposure of the image, such as taking a picture in the backlight, the exposure corresponding to the determined first exposure parameter should be the minimum exposure. In the subsequent photographing process, the exposure amount adopted by the camera is greater than or equal to the minimum exposure amount (i.e. the exposure amount corresponding to the first exposure parameter).
In an optional embodiment, when the preview image has underexposure, a second exposure parameter is determined by the second camera, and the image acquired by the second camera corresponding to the second exposure parameter has no underexposure. Understandably, since the image is underexposed, for example when photographing a super night scene, the exposure corresponding to the determined second exposure parameter should be the maximum exposure. In the subsequent photographing process, the exposure amount adopted by the camera is less than or equal to this maximum exposure (i.e., the exposure amount corresponding to the second exposure parameter).
Correspondingly, the exposure amount of the first camera for each of the at least two frames of images in step S103 is less than or equal to the exposure amount corresponding to the second exposure parameter. That is, the exposure parameters corresponding to the at least two frames of images lie between the first exposure parameter and the second exposure parameter.
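Under the constraints above, the frame capture can be sketched as picking exposure amounts between the two limits determined by the second camera. This is only an illustrative sketch under assumptions (the function name and the evenly spaced bracket are not from the patent):

```python
def exposure_bracket(min_exposure, max_exposure, frames):
    """Pick `frames` exposure amounts evenly spaced between the
    no-overexposure minimum (first exposure parameter) and the
    no-underexposure maximum (second exposure parameter)."""
    if frames < 2:
        raise ValueError("an HDR bracket needs at least two frames")
    step = (max_exposure - min_exposure) / (frames - 1)
    return [min_exposure + i * step for i in range(frames)]

# three frames between the limits found by the second camera
print(exposure_bracket(50.0, 200.0, 3))  # [50.0, 125.0, 200.0]
```

Every exposure in the returned list satisfies the stated constraint: it is at least the first (minimum) exposure and at most the second (maximum) exposure.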
Reference will now be made in detail to the embodiments of the present application.
By implementing the embodiments of the present invention, the problems of the existing HDR photographing method, in which image quality is poor and the image acquired by the camera is unsuitable for use as a preview image, can be solved, thereby improving the quality of the photographed image.
Fig. 3 is a schematic flow chart of another photographing method according to an embodiment of the present invention. The method as shown in fig. 3 comprises the following implementation steps:
step S201, acquiring an initial image through a first camera.
Step S202, according to the initial image, adjusting exposure parameters of the first camera so that the first camera can be exposed normally for taking pictures.
Step S203, acquiring a preview image in a target scene through the first camera, and determining whether the target scene corresponding to the preview image is an HDR scene.
The target scene refers to a photographing scene where the terminal device (specifically, the first camera and the second camera) is currently located, and may include, but is not limited to, an HDR scene (such as a backlight photograph, a super night photograph, and the like), a non-HDR scene (i.e., a common scene, such as a photograph in a natural environment), and the like.
Step S204, in the case that the target scene is an HDR scene, adjusting the exposure parameters of the second camera so that no overexposure or underexposure exists in the image acquired by the second camera after adjustment.
Step S205, resetting the exposure parameters of the first camera, and acquiring at least two frames of images with the first camera, where the exposure parameters of the first camera are associated with the exposure parameters of the second camera.
Step S206, synthesizing the at least two frames of images into an HDR image.
The following sets forth some specific and alternative embodiments to which the present application relates.
In step S201, the initial image may be an image acquired by the first camera in any photographing scene. Specifically, after the user aims the terminal device at a scene to be photographed, the first camera can be started to acquire a corresponding initial image.
In step S202, the terminal device may further adjust/set an exposure parameter of the first camera according to the initial image acquired by the first camera, so that the first camera can normally expose and acquire an image.
Specifically, after the first camera collects the initial image, photometry can be performed on the initial image to set corresponding exposure weights, and the exposure amount is then calculated. Further, the terminal device may reset the exposure parameters of the first camera according to the calculated exposure amount, so that the brightness of subsequent images acquired by the first camera (i.e., the sum of the brightness of the pixels in the image, or the average brightness of the pixels in the image) reaches a corresponding preset brightness threshold; that is, the first camera is ensured to be normally exposed when photographing. Accordingly, subsequent images captured with the first camera will have no obviously over-bright or over-dark areas and can serve as preview images.
The setting of the exposure weights corresponds one-to-one to the metering mode used. For example, if the metering mode is average metering, the exposure weight also adopts an average weight, which may be custom-set by the user side or the system side, for example 0.5 or 1. Fig. 4A shows a schematic diagram of an initial image composed of 12 pixels. If the terminal device meters this initial image with average metering, the exposure weight of each pixel in the initial image may be set to an average weight, for example 1. Correspondingly, the exposure amount of the initial image captured by the first camera is the average of the brightness of all pixels in the entire initial image. It should be noted that the pixel count of 12 is merely an example and is not limiting.
For another example, if the terminal device meters the initial image with center-weighted metering, the exposure weight of the initial image may also be set in a center-weighted manner: pixels in the central area of the image receive a larger exposure weight, and pixels in the other areas receive a smaller one. Specifically, fig. 4B shows an exposure weight diagram of a preview image, again assumed to be composed of 12 pixels. If the central area of the initial image contains a box of milk as shown in fig. 4B, and the exposure weights are set in the center-weighted manner, the exposure weight of the pixels corresponding to the milk may be set to a first weight (e.g., 2), and the exposure weights of the remaining pixels to a second weight (e.g., 1), the first weight being greater than the second weight. Accordingly, the exposure amount of the initial image may be obtained by weighting the brightness of the pixels in each area: if the brightness of each pixel in the central area is 200 as shown in fig. 4B, and the brightness of each pixel in the remaining areas is 50, the exposure amount of the initial image shown in fig. 4B is (2 × 200 × 2 + 1 × 50 × 10)/12 ≈ 108.
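The average and center-weighted calculations above can be sketched as follows. This is a minimal illustration of the worked 12-pixel example; the function name is hypothetical:

```python
def metered_exposure(pixel_luma, weights):
    """Weighted luminance sum divided by the pixel count, matching the
    worked example above (center pixels weighted 2, the rest 1)."""
    assert len(pixel_luma) == len(weights)
    total = sum(l * w for l, w in zip(pixel_luma, weights))
    return total / len(pixel_luma)

# center-weighted metering: 2 center pixels at brightness 200 (weight 2),
# 10 surrounding pixels at brightness 50 (weight 1)
luma = [200, 200] + [50] * 10
weights = [2, 2] + [1] * 10
print(round(metered_exposure(luma, weights)))  # 108

# average metering: every pixel weighted 1, i.e. a plain average
print(metered_exposure([50] * 12, [1] * 12))  # 50.0
```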
The present application does not limit the metering method or the exposure weights. The metering method may include, but is not limited to, matrix metering, center-weighted average metering, center-partial metering, and the like, which will not be described in detail herein.
Correspondingly, the terminal equipment can correspondingly adjust the exposure parameters of the first camera according to the calculated exposure, so that the first camera can be normally exposed and photographed. Optionally, at this time, the adjusted exposure parameter of the first camera may be recorded as a normal exposure parameter.
The exposure parameters refer to parameters for influencing the exposure amount of the camera, and include but are not limited to aperture, shutter, exposure time, exposure value, sensitivity and the like.
The terminal device may include, but is not limited to, a User Equipment (UE), a mobile phone, a tablet computer (tablet personal computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a wearable device (wearable device), and other devices supporting network communication, and the like.
In step S203, after adjusting the exposure parameter of the first camera, the first camera may be further used to acquire a preview image in a target scene, and analyze the preview image to determine whether the target scene is an HDR scene.
Specifically, the user can align the terminal device to a target scene to be photographed, and start the first camera to acquire a preview image in the target scene. Furthermore, statistics and analysis of pixel brightness can be performed on the preview image, so that a distribution graph of the pixel brightness in the preview image is obtained. The present application is not limited to the specific representation of the distribution diagram, and for example, the distribution diagram may be a bar chart, a line chart, or the like. Specifically, fig. 5 shows a distribution histogram (bar chart) of pixel luminance in a preview image, in which the abscissa represents the pixel luminance and the ordinate represents the number of pixels.
Optionally, the terminal device may further analyze whether overexposure and underexposure exist in the preview image acquired by the first camera. The determination condition that overexposure exists in the preview image is any one or a combination of the following: 1) the proportion of the first highlight area in the preview image is greater than or equal to a first preset proportion; 2) the average pixel brightness of the second highlight area in the preview image is greater than or equal to a first brightness threshold.
Accordingly, the determination condition that no overexposure exists in the preview image is any one or a combination of the following: 1) the proportion of the first highlight area in the preview image is smaller than the first preset proportion; 2) the average pixel brightness of the second highlight area in the preview image is smaller than the first brightness threshold; 3) the exposure amount of the first camera has reached a preset lower limit.
The first highlight area may refer to a pixel area of the highest N gray scales in the preview image, where N is a positive integer. The second highlight region may be a region of the first M pixels with the highest brightness in the preview image, where M is a positive integer. Optionally, the second highlight region may also be a region corresponding to the top m% of pixels with the highest brightness in the preview image, where m is a custom constant. The first preset ratio, the first brightness threshold and the preset lower limit may be specifically threshold values set by a user side or a system side in a user-defined manner, and the application is not limited.
The determination condition that underexposure exists in the preview image is any one or a combination of the following: 1) the proportion of the first dim-light area in the preview image is greater than or equal to a second preset proportion; 2) the average pixel brightness of the second dim-light area in the preview image is less than a second brightness threshold.
Accordingly, the determination condition that no underexposure exists in the preview image is any one or a combination of the following: 1) the proportion of the first dim-light area in the preview image is smaller than the second preset proportion; 2) the average pixel brightness of the second dim-light area in the preview image is greater than or equal to the second brightness threshold; 3) the exposure amount of the first camera has reached a preset upper limit.
The first dim-light area is the pixel area of the lowest Q gray scales in the preview image, where Q is a positive integer. The second dim-light area is the area of the P pixels with the lowest brightness in the preview image, where P is a positive integer. Optionally, the second dim-light area may also be the area of the first p% of pixels with the lowest brightness in the preview image, where p is a custom constant. The second preset proportion, the second brightness threshold, and the preset upper limit may be thresholds custom-set by the user side or the system side; the application is not limited in this respect.
In the present application, P, Q, M, and N may be positive integers custom-set by the user side or the system side, and they may be the same as or different from one another. For example, referring to fig. 5 and taking the gray scales as pixel brightness levels, the first highlight area may refer to the pixel area in the preview image corresponding to the brightest two gray scales in fig. 5, and the first dim-light area may refer to the pixel area in the preview image corresponding to the darkest two gray scales in fig. 5, which will not be described in detail herein.
Accordingly, the terminal device may determine whether the target scene corresponding to the preview image is an HDR scene according to the statistical result. Specifically, when overexposure and/or underexposure exist in the preview image, it may be determined that the target scene corresponding to the preview image is an HDR scene, and step S204 is continuously performed. Otherwise, the flow may end.
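A minimal sketch of these checks follows. All thresholds (the saturation cutoffs, the proportions, and M and P) are arbitrary illustrative choices, not values from the patent:

```python
def is_overexposed(luma, first_ratio=0.2, first_thresh=180, m=3):
    """Over-exposure per conditions 1) and 2) above: the share of
    near-saturated pixels, or the mean of the m brightest pixels."""
    highlight = [l for l in luma if l >= 250]   # first highlight area
    brightest = sorted(luma, reverse=True)[:m]  # second highlight area
    return (len(highlight) / len(luma) >= first_ratio
            or sum(brightest) / m >= first_thresh)

def is_underexposed(luma, second_ratio=0.2, second_thresh=20, p=3):
    """Under-exposure per the mirrored conditions for dim-light areas."""
    dark = [l for l in luma if l <= 5]  # first dim-light area
    darkest = sorted(luma)[:p]          # second dim-light area
    return (len(dark) / len(luma) >= second_ratio
            or sum(darkest) / p < second_thresh)

def is_hdr_scene(luma):
    """HDR scene when the preview has overexposure and/or underexposure."""
    return is_overexposed(luma) or is_underexposed(luma)

backlit = [255] * 4 + [10] * 8  # bright window, dark subject
uniform = [120] * 12            # evenly lit scene
print(is_hdr_scene(backlit), is_hdr_scene(uniform))  # True False
```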
It should be noted that the respective photographing scenes of the preview image and the initial image may be the same or different, and the application is not limited thereto.
Accordingly, several embodiments of the step S204 exist specifically.
In a first implementation manner, when overexposure exists in the preview image (that is, a target scene corresponding to the preview image is an HDR scene), a second camera is enabled, and an exposure parameter of the second camera is directly adjusted, so that no overexposure exists in subsequent images acquired by the second camera.
In a second embodiment, when there is overexposure in the preview image, the second camera is enabled, and the first image is acquired by the second camera. And under the condition that the first image still has overexposure, adjusting the exposure parameters of the second camera so that the subsequent images acquired by the second camera do not have overexposure.
In a third embodiment, when there is underexposure in the preview image (that is, a target scene corresponding to the preview image is an HDR scene), a second camera is enabled, and an exposure parameter of the second camera is directly adjusted, so that there is no underexposure in a subsequent image acquired by the second camera.
In a fourth embodiment, when underexposure exists in the preview image, the second camera is started, and the first image is acquired through the second camera. And under the condition that the first image still has under exposure, adjusting the exposure parameters of the second camera so that the images acquired by the second camera do not have under exposure.
In an optional embodiment, after the second camera is enabled, the exposure parameters of the second camera may be synchronized with the exposure parameters set for the first camera in step S202, that is, synchronized to the normal exposure parameters. Alternatively, if the image quality (i.e., accuracy) of the synthesized HDR image need not be considered, the exposure parameters need not be synchronized to the normal exposure parameters.
In an optional embodiment, to ensure the high quality of the synthesized HDR image, the second camera needs to be used again to obtain the corresponding first image, and the exposure parameters of the second camera are adjusted when the first image is overexposed or underexposed. The details are described below.
Correspondingly, when there is no overexposure or underexposure in the first image, the determined exposure parameters of the second camera are the normal exposure parameters (the exposure parameters of the first camera adjusted in step S202). It will be appreciated that the first image may be a preview image captured by the second camera, or may be an image captured by the second camera but not displayed (i.e., not used as a preview). The specific implementation processes of the above four embodiments are explained below.
Specifically, in the case that the target scene is an HDR scene, a second camera may be started to perform photometry on the target scene, so as to acquire a first image in the same HDR scene.
In an alternative embodiment, in practical applications, the preview images acquired by the first camera and the second camera may have a certain parallax, that is, the preview images are different. Accordingly, the terminal device may transform the initial image collected by the second camera before photometry, so that the first image collected by the second camera is the same as the preview image collected by the first camera; that is, the image contents obtained by the two cameras are the same, which may be understood as the spatial characteristics of the image contents being the same. The spatial characteristics include, but are not limited to, the color characteristics, texture characteristics, shape characteristics, and spatial relationship characteristics of the image, which are not described in detail herein.
Specifically, due to the existence of a certain distance between the first camera and the second camera, there is a certain difference between the preview images acquired by the first camera and the second camera in the same HDR scene. For example, due to different shooting angles, the preview images acquired by the first camera and the second camera have a certain parallax, and the like. Specifically, as shown in fig. 6, a scene schematic diagram of photographing is shown, when two cameras are used to photograph in the same scene, a certain parallax exists, so that images acquired by the two cameras are also different.
Correspondingly, to ensure that the preview images acquired by the two cameras are consistent, image rectification or image transformation may be performed on the initial image acquired by the second camera. The present application does not limit the specific implementation of the rectification/transformation, which may include translation, rotation, scaling, and projective rectification.
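As one illustrative fragment of such a transformation, a pure-translation shift compensating a known horizontal parallax might look as follows. Real rectification would also involve rotation, scaling, or projective correction; the function name and the zero-padding choice are assumptions:

```python
def shift_rows(image, dx):
    """Crude parallax compensation: shift every row of `image` (a list
    of pixel rows) right by `dx` pixels (left if `dx` is negative),
    padding the vacated edge with 0."""
    out = []
    for row in image:
        if dx >= 0:
            out.append([0] * dx + row[:len(row) - dx])
        else:
            out.append(row[-dx:] + [0] * (-dx))
    return out

# a 2-pixel horizontal parallax between the two cameras
second_cam = [[1, 2, 3, 4],
              [5, 6, 7, 8]]
print(shift_rows(second_cam, 2))  # [[0, 0, 1, 2], [0, 0, 5, 6]]
```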
In an alternative embodiment, before the second camera is started to capture the first image in the HDR scene, the exposure parameters of the second camera may also be synchronously set. Specifically, the exposure parameters of the second camera are synchronously set as the exposure parameters (i.e., normal exposure parameters) of the first camera in S202, so that a high-quality target image (i.e., HDR image) is obtained later.
Further, after the second camera obtains the first image, the terminal device may perform statistics and analysis on the pixel brightness of the first image, so as to obtain a distribution map of the pixel brightness in the first image. Optionally, the terminal device may further count/analyze whether overexposure and underexposure exist in the first image acquired by the second camera. For the specific determination conditions of whether the first image has overexposure or underexposure, reference may be made to the related explanations in the foregoing embodiments, and details are not repeated here.
Further, the terminal device may adjust an exposure parameter of the second camera according to the statistical result. The concrete implementation is as follows:
when the first image (or the preview image) is overexposed, the exposure amount can be reduced by decreasing the exposure parameters of the second camera, so that overexposure no longer exists in images subsequently acquired by the second camera. Optionally, at this time, the adjusted exposure parameters of the second camera may be recorded as the first exposure parameter. For the determination condition that an image has no overexposure, reference may be made to the relevant explanations in the foregoing embodiments, and details are not described here.
In an optional embodiment, the exposure amount of the second camera may also be directly reduced to a preset exposure lower limit value, so that no overexposure exists in images acquired by the subsequent second camera. Specific embodiments of reducing the exposure include, but are not limited to, reducing the exposure time (i.e., increasing the shutter speed), reducing the aperture, and reducing the sensitivity, among others.
The number of times of adjusting the exposure parameter of the second camera is not limited in the present application. No matter how many times the adjustment is carried out, it is only required to ensure that over exposure and/or under exposure do not exist in the preview image acquired by the second camera subsequently.
For example, suppose the second camera first acquires a preview image with an exposure time of 100 ms, and the pixel brightness of all areas in the preview image is 255, i.e., the preview image is overexposed. After the first exposure-parameter adjustment, suppose the exposure time of the second camera is reduced to 50 ms with the other exposure parameters unchanged; the pixel brightness in the second acquired preview image then becomes 200, which still exceeds the preset brightness threshold (e.g., 180), i.e., the second preview image is still overexposed. Accordingly, the exposure time may be reduced again until the average pixel brightness of the highlight area in the image acquired by the second camera falls below the preset brightness threshold, i.e., until there is no overexposure.
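The repeated adjustment can be sketched as a loop that keeps cutting the exposure time until the highlight-area mean drops below the threshold. The halving step, the function names, and the mock sensor readings are illustrative assumptions, not taken from the patent:

```python
def settle_exposure_time(capture, start_ms, luma_thresh=180, floor_ms=1):
    """Reduce the exposure time until the captured frame's highlight-area
    mean brightness falls below `luma_thresh`. `capture(t_ms)` stands in
    for grabbing a frame at exposure time t_ms and metering its highlights."""
    t = start_ms
    while capture(t) >= luma_thresh and t > floor_ms:
        t = max(floor_ms, t // 2)
    return t

# mock readings mirroring the example above: 100 ms -> 255, 50 ms -> 200,
# 25 ms -> 120, with the preset brightness threshold at 180
readings = {100: 255, 50: 200, 25: 120}
print(settle_exposure_time(readings.get, 100))  # 25
```

Increasing the exposure for the underexposed case is symmetric: the loop would grow the exposure time toward a preset upper limit instead.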
Accordingly, when the first image (or the preview image) has underexposure, the exposure amount can be increased by increasing the exposure parameter of the second camera, so that the underexposure does not exist in the image acquired by the second camera any more. Optionally, at this time, the adjusted exposure parameter of the second camera may be recorded as the second exposure parameter. For the determination condition that the image does not have under-exposure, reference may be specifically made to the relevant explanations in the foregoing embodiments, and details are not described here.
In an optional embodiment, the exposure amount of the second camera may also be directly increased to a preset exposure upper limit value, so that the preview image acquired by the second camera is no longer under-exposed. Specific embodiments of increasing the exposure include, but are not limited to, increasing the exposure time (i.e., decreasing the shutter speed), increasing the aperture, increasing the sensitivity, and the like.
Correspondingly, in steps S205 and S206, after the terminal device receives the photographing instruction triggered by the user, the first camera may be started to photograph in the corresponding photographing mode. The photographing mode corresponds to the exposure parameter of the first camera, namely one photographing mode corresponds to one exposure parameter of the first camera. That is, the first camera is started to collect the multi-frame image according to the exposure parameter corresponding to the photographing mode.
The photographing mode may be obtained by parsing the photographing instruction, or determined by the terminal device by analyzing a previously acquired image. For example, under normal circumstances, the terminal device selects the exposure parameters corresponding to the first photographing mode for photographing. When the terminal device determines that under-exposed dead-black areas exist in the image acquired by the second camera, it may automatically select the exposure parameters corresponding to the second photographing mode for photographing, and so on. For the HDR scenario, the present application proposes two photographing modes, specifically a first photographing mode and a second photographing mode, described in detail below.
Specifically, in the first photographing mode, the terminal device may arbitrarily select multiple sets of target exposure parameters from the first exposure parameters to the second exposure parameters, and set the target exposure parameters as the exposure parameters of the first camera, that is, the exposure parameters (target exposure parameters) of the first camera are between the first exposure parameters and the second exposure parameters. And then controlling the first camera to acquire a series of frame images (namely at least two frame images) under different target exposure parameters. The exposure parameters used for two adjacent frames of images may be the same or different.
It should be understood that the first exposure parameter is the minimum exposure parameter obtained when the image is overexposed, and the second exposure parameter is the maximum exposure parameter obtained when the image is underexposed. The target exposure parameters are therefore any one or more sets of exposure parameters between the minimum and the maximum exposure parameter, which naturally includes the exposure parameters of normal exposure (i.e., the normal exposure parameters mentioned above). That is, in the first photographing mode, the first camera acquires a series of corresponding frame images from the first exposure parameter to the second exposure parameter (including the normal exposure parameters).
Accordingly, in step S206, the terminal device may adopt a setting algorithm to combine the acquired series of frame images into a final target image. The target image is the HDR image finally shot and output by the terminal equipment. The setting algorithm can be set by a user side or a system side in a self-defined mode, such as a high dynamic image synthesis algorithm, an image superposition algorithm and the like.
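One common family of such setting algorithms is per-pixel exposure fusion, sketched below with an arbitrary triangular weighting that favors well-exposed (mid-gray) pixels. The weighting function and the frame values are assumptions for illustration only, not the patent's algorithm:

```python
def merge_hdr(frames):
    """Merge aligned frames pixel-by-pixel, weighting each frame's pixel
    by its closeness to mid-gray (128) so clipped pixels contribute least."""
    def weight(l):
        return max(1, 255 - 2 * abs(l - 128))  # peaks at 128, floor of 1
    merged = []
    for pixels in zip(*frames):  # same pixel position across all frames
        ws = [weight(l) for l in pixels]
        merged.append(round(sum(l * w for l, w in zip(pixels, ws)) / sum(ws)))
    return merged

# a short, a normal, and a long exposure of the same two pixels
dark, normal, bright = [10, 20], [120, 130], [250, 255]
print(merge_hdr([dark, normal, bright]))  # [118, 116]
```

The merged result stays close to the well-exposed middle frame while the clipped dark and bright frames contribute only marginally.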
In the second photographing mode, the terminal device may arbitrarily select multiple sets of target exposure parameters from the first exposure parameter to the normal exposure parameters and set them as the exposure parameters of the first camera; that is, the exposure parameters (target exposure parameters) of the first camera lie between the first exposure parameter and the normal exposure parameters. The first camera is then controlled to acquire a series of frame images (i.e., at least two frames) under the different target exposure parameters. The exposure parameters used for two adjacent frames may be the same or different. In the second photographing mode, the series of frame images corresponding to the first exposure parameter through the normal exposure parameters needs to be acquired, and the setting algorithm is then adopted in step S206 to synthesize these frames into the final target image (i.e., the HDR image).
It should be noted that the second photographing mode is suitable for a scene in which an under-exposed black image exists, and an image with an expanded dynamic range is obtained by multi-frame image synthesis.
By implementing the embodiment of the invention, the problems of poor image quality and the like in the conventional HDR photographing method can be solved, so that the dynamic range and the image quality of the photographed image are improved.
Based on the foregoing embodiments, an implementation manner of a terminal device to which the present application is applicable is described below. The terminal device is provided with a control module and at least two camera modules. A camera module may specifically be a camera, a sensor, a photosensitive chip, or another component used for photographing. In the above embodiments of the present application, the camera module is described by taking a camera as an example. The control module may specifically be a processor, configured to implement the relevant steps in the embodiment of fig. 2 or fig. 3, and/or to implement other technical content described herein, such as the statistical analysis of images described above, which is not limited here.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention. The terminal device 100 as shown in fig. 7 may include: baseband chip 110, memory 115 (one or more computer-readable storage media), Radio Frequency (RF) module 116, and peripheral system 117. These components may communicate over one or more communication buses 114.
The peripheral system 117 is mainly used to implement interaction between the terminal device 100 and the user/external environment, and mainly includes the input/output devices of the terminal device 100. In a specific implementation, the peripheral system 117 may include: a display screen controller 118, a camera controller 119, an audio controller 120, and a sensor management module 121, where each controller may be coupled to a respective peripheral device (e.g., display screen 123, camera 124, audio circuitry 125, and sensor 126). In some embodiments, the number of cameras 124 is not limited and may be two or more; the camera 124 may specifically be a 3D camera or the like. In some embodiments, the display screen 123 may be a conventional display screen, a touch screen configured with a self-capacitive floating touch panel, a touch screen configured with an infrared floating touch panel, or the like. It should be noted that the peripheral system 117 may also include other I/O peripherals.
The baseband chip 110 may integrally include: one or more processors 111, a clock module 112, and a power management module 113. The clock module 112 integrated in the baseband chip 110 is mainly used for generating clocks required for data transmission and timing control for the processor 111. The power management module 113 integrated in the baseband chip 110 is mainly used for providing stable and high-precision voltage for the processor 111, the rf module 116 and peripheral systems.
A Radio Frequency (RF) module 116 for receiving and transmitting RF signals mainly integrates a receiver and a transmitter of the terminal 100. The Radio Frequency (RF) module 116 communicates with a communication network and other communication devices through radio frequency signals. In particular implementations, the Radio Frequency (RF) module 116 may include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, a storage medium, and the like. In some embodiments, the Radio Frequency (RF) module 116 may be implemented on a separate chip.
The memory 115 is coupled to the processor 111 for storing various software programs and/or sets of instructions. In particular implementations, memory 115 may include high speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 115 may store an operating system (hereinafter referred to simply as a system), such as an embedded operating system like ANDROID, IOS, WINDOWS, or LINUX. Memory 115 may also store network communication programs that may be used to communicate with one or more additional devices, one or more terminal devices, one or more network devices. The memory 115 may also store a photographing program that the processor 111 may call to complete the photographing of the HDR image.
It should be understood that terminal device 100 is only one example provided by embodiments of the present invention, and that terminal device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration implementation of components.
Referring to fig. 8, a functional block diagram of a terminal device according to an embodiment of the present invention is shown. The functional blocks of the terminal device may implement the inventive arrangements in hardware, software or a combination of hardware and software. Those skilled in the art will appreciate that the functional blocks described in FIG. 8 may be combined or separated into sub-blocks to implement the present scheme. Thus, the above description of the invention may support any possible combination or separation or further definition of the functional blocks described below.
Specifically, with reference to the foregoing embodiment of fig. 7, when the functional unit corresponding to the memory 115 is a storage unit 601, the functional unit corresponding to the baseband chip 110 (specifically, the processor 111) is a processing unit 602, and the functional unit corresponding to the radio frequency module 116 is a communication unit 603, as shown in fig. 8, the terminal device 110 may include: a processing unit 602 and a communication unit 603. The processing unit 602 is configured to control and manage the operation of the terminal device 110; in particular, it is configured to enable terminal device 110 to perform steps S101 to S104 in fig. 2 and steps S201 to S206 in fig. 3, and/or to perform other steps of the techniques described herein. The communication unit 603 is used to support communication of terminal device 110 with other devices; for example, it supports terminal device 110 in acquiring the images captured by the first camera or the second camera, and/or performs other steps of the techniques described herein. Optionally, terminal device 110 may further include a storage unit 601 for storing program codes and data of terminal device 110.
The processing unit 602 may be a processor or a controller, for example a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of devices implementing a computing function, e.g., a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication unit 603 may be a communication interface, a transceiver circuit, or the like, where "communication interface" is a generic term and may include one or more interfaces, such as interfaces between network devices and other devices. The storage unit 601 may be a memory.
Optionally, the terminal device shown in fig. 8 may further include functional units corresponding to the peripheral system 117, which are not shown in the figure. For example, the display screen 123 corresponds to a display unit for displaying an image acquired by a camera, such as a preview image or an HDR image. Likewise, the number of camera units referred to in this application is two or more, and so on.
The specific implementation of the terminal device shown in fig. 7 or fig. 8 may also refer to the corresponding description of the foregoing method embodiment, and is not described herein again.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, or in software executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable hard disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. Additionally, the ASIC may reside in a network device. Of course, the processor and the storage medium may also reside as discrete components in a terminal device.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed, may include the processes of the method embodiments described above. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Claims (16)

1. A photographing method is applied to a terminal device provided with a first camera and a second camera, and comprises the following steps:
acquiring a preview image through a first camera;
determining a first exposure parameter through the second camera under the condition that the preview image has overexposure; the image acquired by the second camera corresponding to the first exposure parameter has no overexposure;
acquiring at least two frames of images through the first camera; the exposure parameters of the first camera corresponding to the at least two frames of images are different, and the exposure amount corresponding to the exposure parameters of the first camera corresponding to the at least two frames of images is greater than or equal to the exposure amount corresponding to the first exposure parameter;
and synthesizing the at least two acquired frames of images into a High Dynamic Range (HDR) image.
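For illustration only, the overall flow of claim 1 (detect overexposure in the preview, derive a non-clipping first exposure parameter via the second camera, bracket at least two frames with the first camera, and fuse them into an HDR image) might be sketched as follows; all function names, thresholds, and the toy sensor model are assumptions, not taken from the patent:

```python
# Illustrative sketch only: thresholds and the sensor model are assumptions.

def is_overexposed(pixels, ratio=0.05, sat_level=250):
    # Overexposure test: too large a share of pixels near saturation.
    return sum(1 for p in pixels if p >= sat_level) / len(pixels) >= ratio

def find_first_exposure(capture, exposure, lower_limit=0.125):
    # Step the second camera's exposure down until its image is no
    # longer overexposed: this yields the "first exposure parameter".
    while exposure > lower_limit and is_overexposed(capture(exposure)):
        exposure /= 2
    return exposure

def merge_hdr(frames):
    # Toy fusion: per-pixel mean of the bracketed frames.
    return [sum(px) / len(frames) for px in zip(*frames)]

# Toy second-camera model: brightness scales with exposure, clips at 255.
scene = [40, 80, 160, 300, 600]                 # scene radiance (arbitrary units)
capture = lambda ev: [min(255, int(r * ev)) for r in scene]

normal_ev = 1.0                                  # preview exposure (overexposed here)
first_ev = find_first_exposure(capture, normal_ev)
bracket = [capture(ev) for ev in (first_ev, (first_ev + normal_ev) / 2, normal_ev)]
hdr = merge_hdr(bracket)
```

In this toy model the frame at normal exposure clips two pixels, the search settles on a quarter of the normal exposure, and three bracketed frames are averaged per pixel; a real implementation would use a proper exposure-fusion algorithm rather than a mean.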
2. The method of claim 1, wherein the preview image having overexposure comprises:
the proportion of the first highlight area in the preview image is larger than or equal to a first preset proportion, or the average value of the pixel brightness of the second highlight area in the preview image is larger than or equal to a first brightness threshold value;
the preview image being absent of overexposure comprises:
the proportion of a first highlight area in the preview image is smaller than the first preset proportion, or the average value of the pixel brightness of a second highlight area in the preview image is smaller than the first brightness threshold, or the exposure amount of the first camera reaches a preset lower limit value;
the first highlight area is a pixel area of the highest N gray scales in the preview image, and N is a positive integer; the second highlight area is an area of the first M pixels with the highest brightness in the preview image, and M is a positive integer.
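As an illustration of the two alternative overexposure tests in claim 2, one possible reading (with assumed example values for N, M, and the preset thresholds) is:

```python
# Illustrative sketch of claim 2's two overexposure tests; N, M and the
# preset thresholds below are assumed example values, not from the patent.

def first_highlight_test(pixels, n=6, preset_ratio=0.1, max_gray=255):
    # First test: proportion of pixels in the highest N gray scales.
    highlight = [p for p in pixels if p > max_gray - n]
    return len(highlight) / len(pixels) >= preset_ratio

def second_highlight_test(pixels, m=3, luma_threshold=240):
    # Second test: mean brightness of the M brightest pixels.
    brightest = sorted(pixels, reverse=True)[:m]
    return sum(brightest) / m >= luma_threshold

preview = [255, 255, 254, 120, 90, 60, 30, 10, 5, 0]
overexposed = first_highlight_test(preview) or second_highlight_test(preview)
```

Here three of ten pixels fall within the top six gray scales, so either test flags the preview as overexposed.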
3. The method of claim 1, wherein determining the first exposure parameter by the second camera is performed only if there is overexposure in the first image acquired by the second camera.
4. The method of claim 2, wherein determining the first exposure parameter by the second camera is performed only if there is overexposure in the first image acquired by the second camera.
5. The method according to any one of claims 1-4, wherein the exposure parameters corresponding to the at least two frames of images are between the first exposure parameter and a normal exposure parameter; and the average value of the pixel brightness in the image corresponding to the normal exposure parameter is equal to a preset brightness threshold value.
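Claim 5 only constrains the bracketed exposure parameters to lie between the first exposure parameter and the normal one; a minimal sketch, assuming an evenly spaced bracket (the spacing itself is not specified by the claim), could be:

```python
# Hedged sketch: even spacing is an assumption; claim 5 only requires the
# values to fall between the first and the normal exposure parameters.

def bracket_exposures(first_ev, normal_ev, frames=3):
    # Evenly spaced exposure values from first_ev up to normal_ev.
    step = (normal_ev - first_ev) / (frames - 1)
    return [first_ev + i * step for i in range(frames)]

evs = bracket_exposures(0.25, 1.0, frames=4)
```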
6. The method according to any one of claims 1-4, further comprising:
determining a second exposure parameter through the second camera under the condition that the preview image has underexposure; and the image acquired by the second camera corresponding to the second exposure parameter has no underexposure.
7. The method of claim 6, wherein the preview image having underexposure comprises:
the ratio of the first dim light area in the preview image is greater than or equal to a second preset ratio, or the average value of the pixel brightness of the second dim light area in the preview image is smaller than a second brightness threshold value;
the absence of underexposure of the preview image comprises:
the ratio of the first dim light area in the preview image is smaller than the second preset ratio, or the average value of the pixel brightness of the second dim light area in the preview image is larger than or equal to the second brightness threshold value, or the exposure amount of the first camera reaches a preset upper limit value;
the first dim light area is a pixel area of the lowest Q gray scales in the preview image, and Q is a positive integer; the second dim light area is an area of P pixels with the lowest brightness in the preview image, and P is a positive integer.
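Mirroring the overexposure case, claim 7's two underexposure tests might be read as follows; Q, P, and the thresholds are again assumed example values:

```python
# Illustrative mirror of claim 7's underexposure tests; Q, P and the
# thresholds are assumed example values, not from the patent.

def first_darklight_test(pixels, q=6, preset_ratio=0.1):
    # First test: proportion of pixels in the lowest Q gray scales.
    dark = [p for p in pixels if p < q]
    return len(dark) / len(pixels) >= preset_ratio

def second_darklight_test(pixels, p_count=3, luma_threshold=16):
    # Second test: mean brightness of the P darkest pixels.
    darkest = sorted(pixels)[:p_count]
    return sum(darkest) / p_count < luma_threshold

preview = [0, 2, 4, 120, 200]
underexposed = first_darklight_test(preview) or second_darklight_test(preview)
```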
8. The method of claim 6, wherein determining second exposure parameters by the second camera is performed if there is underexposure in the first image acquired by the second camera.
9. The method of claim 7, wherein determining second exposure parameters by the second camera is performed if there is underexposure in the first image acquired by the second camera.
10. The method of claim 6, wherein the exposure parameters corresponding to the at least two frames of images are between the first exposure parameter and the second exposure parameter.
11. The method according to any one of claims 7-9, wherein the at least two frames of images correspond to exposure parameters between the first exposure parameter and the second exposure parameter.
12. The method of claim 3, 4, 8 or 9, wherein the first image is an image obtained by the second camera under normal exposure parameters, and the preview image is an image obtained by the first camera under the normal exposure parameters, and the normal exposure parameters are such that an average value of pixel brightness in the preview image is equal to a preset brightness threshold.
13. The method according to claim 12, wherein the first image is an image obtained through image transformation, and the spatial characteristics of the image content of the first image are consistent with those of the preview image.
14. A terminal device, comprising at least two cameras including a first camera and a second camera, a memory, and a processor coupled to the memory and the at least two cameras; the memory is configured to store instructions, and the processor is configured to execute the instructions and to communicate with the first camera and the second camera; wherein the processor, when executing the instructions, performs the method of any one of claims 1-13.
15. The terminal device of claim 14, further comprising a display coupled to the processor, the display configured to display an image captured by the first camera or the second camera under control of the processor, or an HDR image corresponding to the method executed by the processor.
16. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 13.
CN201810257073.7A 2018-03-26 2018-03-26 Photographing method, related device and computer storage medium Expired - Fee Related CN108337445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810257073.7A CN108337445B (en) 2018-03-26 2018-03-26 Photographing method, related device and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810257073.7A CN108337445B (en) 2018-03-26 2018-03-26 Photographing method, related device and computer storage medium

Publications (2)

Publication Number Publication Date
CN108337445A CN108337445A (en) 2018-07-27
CN108337445B true CN108337445B (en) 2020-06-26

Family

ID=62931571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810257073.7A Expired - Fee Related CN108337445B (en) 2018-03-26 2018-03-26 Photographing method, related device and computer storage medium

Country Status (1)

Country Link
CN (1) CN108337445B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108900785A (en) * 2018-09-18 2018-11-27 Oppo广东移动通信有限公司 Exposal control method, device and electronic equipment
CN108881701B (en) * 2018-09-30 2021-04-02 华勤技术股份有限公司 Shooting method, camera, terminal device and computer readable storage medium
CN109862269B (en) * 2019-02-18 2020-07-31 Oppo广东移动通信有限公司 Image acquisition method and device, electronic equipment and computer readable storage medium
CN110049240B (en) * 2019-04-03 2021-01-26 Oppo广东移动通信有限公司 Camera control method and device, electronic equipment and computer readable storage medium
CN110225248B (en) * 2019-05-29 2021-11-16 Oppo广东移动通信有限公司 Image acquisition method and device, electronic equipment and computer readable storage medium
CN110166705B (en) * 2019-06-06 2021-04-23 Oppo广东移动通信有限公司 High dynamic range HDR image generation method and device, electronic equipment and computer readable storage medium
CN110958400B (en) * 2019-12-13 2021-11-23 上海海鸥数码照相机有限公司 System, method and device for keeping exposure of continuously shot pictures consistent
CN113364964B (en) * 2020-03-02 2023-04-07 RealMe重庆移动通信有限公司 Image processing method, image processing apparatus, storage medium, and terminal device
CN113497880A (en) * 2020-03-20 2021-10-12 华为技术有限公司 Method for shooting image and electronic equipment
CN112492208B (en) * 2020-11-30 2022-03-22 维沃移动通信有限公司 Shooting method and electronic equipment
CN116007744A (en) * 2021-10-21 2023-04-25 华为技术有限公司 Ultraviolet detection method and electronic equipment
CN114785963B (en) * 2022-06-22 2022-09-30 武汉市聚芯微电子有限责任公司 Exposure control method, terminal and storage medium
CN115767262B (en) * 2022-10-31 2024-01-16 华为技术有限公司 Photographing method and electronic equipment
CN117714835A (en) * 2023-08-02 2024-03-15 荣耀终端有限公司 Image processing method, electronic equipment and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014138695A1 (en) * 2013-03-08 2014-09-12 Pelican Imaging Corporation Systems and methods for measuring scene information while capturing images using array cameras
CN104937921A (en) * 2013-12-06 2015-09-23 华为终端有限公司 Terminal, image processing method, and image acquisition method
CN107395998A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image capturing method and mobile terminal
CN107454343A (en) * 2014-11-28 2017-12-08 广东欧珀移动通信有限公司 Photographic method, camera arrangement and terminal
CN107465882A (en) * 2017-09-22 2017-12-12 维沃移动通信有限公司 A kind of image capturing method and mobile terminal
CN107613218A (en) * 2017-09-15 2018-01-19 维沃移动通信有限公司 The image pickup method and mobile terminal of a kind of high dynamic range images

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI581632B (en) * 2016-06-23 2017-05-01 國立交通大學 Image generating method and image capturing device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014138695A1 (en) * 2013-03-08 2014-09-12 Pelican Imaging Corporation Systems and methods for measuring scene information while capturing images using array cameras
CN104937921A (en) * 2013-12-06 2015-09-23 华为终端有限公司 Terminal, image processing method, and image acquisition method
CN107454343A (en) * 2014-11-28 2017-12-08 广东欧珀移动通信有限公司 Photographic method, camera arrangement and terminal
CN107395998A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image capturing method and mobile terminal
CN107613218A (en) * 2017-09-15 2018-01-19 维沃移动通信有限公司 The image pickup method and mobile terminal of a kind of high dynamic range images
CN107465882A (en) * 2017-09-22 2017-12-12 维沃移动通信有限公司 A kind of image capturing method and mobile terminal

Also Published As

Publication number Publication date
CN108337445A (en) 2018-07-27

Similar Documents

Publication Publication Date Title
CN108337445B (en) Photographing method, related device and computer storage medium
US11563897B2 (en) Image processing method and apparatus which determines an image processing mode based on status information of the terminal device and photographing scene information
CN108683862B (en) Imaging control method, imaging control device, electronic equipment and computer-readable storage medium
CN107635102B (en) Method and device for acquiring exposure compensation value of high-dynamic-range image
US11228720B2 (en) Method for imaging controlling, electronic device, and non-transitory computer-readable storage medium
CN109218628B (en) Image processing method, image processing device, electronic equipment and storage medium
JP6911202B2 (en) Imaging control method and imaging device
CN110445988B (en) Image processing method, image processing device, storage medium and electronic equipment
CN110225248B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
US11431915B2 (en) Image acquisition method, electronic device, and non-transitory computer readable storage medium
CN109218627B (en) Image processing method, image processing device, electronic equipment and storage medium
WO2019148978A1 (en) Image processing method and apparatus, storage medium and electronic device
JP5719418B2 (en) High dynamic range image exposure time control method
CN108337446B (en) High dynamic range image acquisition method, device and equipment based on double cameras
CN108616689B (en) Portrait-based high dynamic range image acquisition method, device and equipment
CN107888839B (en) high dynamic range image acquisition method, device and equipment
US20210160416A1 (en) Method for imaging controlling, electronic device, and non-transitory computer-readable storage medium
CN107846556B (en) Imaging method, imaging device, mobile terminal and storage medium
CN110166705B (en) High dynamic range HDR image generation method and device, electronic equipment and computer readable storage medium
CN109040607B (en) Imaging control method, imaging control device, electronic device and computer-readable storage medium
CN110349163B (en) Image processing method and device, electronic equipment and computer readable storage medium
US11601600B2 (en) Control method and electronic device
CN110881108B (en) Image processing method and image processing apparatus
CN107682611B (en) Focusing method and device, computer readable storage medium and electronic equipment
CN109005363B (en) Imaging control method, imaging control device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200626
