CN113891008B - Exposure intensity adjusting method and related equipment

Info

Publication number: CN113891008B
Authority: CN (China)
Prior art keywords: image, exposure intensity, electronic device, weight, exposure
Legal status: Active
Application number: CN202110651305.9A
Other languages: Chinese (zh)
Other versions: CN113891008A
Inventor: 冯寒予
Current Assignee: Honor Device Co Ltd
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202110651305.9A
Publication of CN113891008A
Application granted
Publication of CN113891008B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time

Abstract

The application discloses an exposure intensity adjusting method and related equipment. The method comprises the following steps: the electronic device judges the similarity between the stored image from the previous shot and the last preview frame acquired before the current shot. When the two images are similar and no user operation has changed the automatic exposure mode, the electronic device adjusts the exposure intensity for the current shot based on the exposure intensity of the stored image, the time between the two shots, and the ambient brightness; it then shoots again and stores the captured image. The method can improve the consistency of image brightness during shooting and improve the user experience.

Description

Exposure intensity adjusting method and related equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an exposure intensity adjusting method and a related device.
Background
At present, when a user uses an electronic device such as a mobile phone to obtain a set of highly consistent images, for example during time-lapse photography or continuous shooting, a brief but drastic change in the environment can cause the ambient light brightness measured by the electronic device to change greatly (for example, becoming too high or too low), and the automatic exposure system of the electronic device then automatically adjusts the exposure parameters. When the environment returns to normal, the electronic device may not have had time to readjust the exposure parameters and still shoots with the parameters set during the environmental change, so the resulting image is too bright or too dark, and the consistency of the multiple images obtained by the user is poor.
Therefore, how to acquire images with high consistency in the processes of time-lapse photography, continuous shooting and the like is a problem to be solved at present.
Disclosure of Invention
The application provides an exposure intensity adjusting method and related equipment, which can adjust the exposure intensity based on previously obtained exposure parameters when the automatic exposure system has not had time to adjust them, thereby improving the consistency of the images obtained by the user and making the image brightness change smoothly.
In a first aspect, the present application provides a method for adjusting exposure intensity. The method comprises the following steps: the electronic device acquires a first image and a second image shot by a camera, and determines whether the similarity between the content of the first image and the content of the second image is higher than a first threshold; the shooting time of the second image is later than that of the first image. If the similarity between the content of the first image and the content of the second image is higher than the first threshold, the electronic device acquires a first exposure intensity and a second exposure intensity; the first exposure intensity is the exposure intensity used when the first image was shot, and the second exposure intensity is the exposure intensity used when the second image was shot; the first image has a first brightness and the second image has a second brightness. The electronic device determines a third exposure intensity according to the first exposure intensity and the second exposure intensity, and a third image is obtained by shooting with the camera at the third exposure intensity; the third image has a third brightness, and the difference between the third brightness and the first brightness is less than a second threshold.
In the solution provided in the present application, if the first image and the second image are similar, the electronic device may obtain the first exposure intensity and the second exposure intensity, determine a third exposure intensity from them, and then obtain a third image at the third exposure intensity, where the difference in brightness between the first image and the third image is small. That is to say, the brightness of the images obtained by the electronic device changes smoothly; from the user's perspective, images that are too dark or too bright because of a sudden environmental change are avoided, so the user does not need to spend time re-shooting, which improves the user experience.
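As a concrete illustration of this flow, the following minimal Python sketch blends the two exposure intensities only when the images are similar and no state difference exists. The histogram-based similarity measure, the threshold value, and the fixed 50/50 blend are assumptions made for the example; the patent does not prescribe a specific similarity algorithm or blending rule.

```python
import numpy as np

def histogram_similarity(img_a: np.ndarray, img_b: np.ndarray,
                         bins: int = 32) -> float:
    """Crude content-similarity score in [0, 1] via histogram intersection.

    A stand-in only: the patent does not specify the similarity measure.
    """
    h_a, _ = np.histogram(img_a, bins=bins, range=(0, 255))
    h_b, _ = np.histogram(img_b, bins=bins, range=(0, 255))
    h_a = h_a / h_a.sum()
    h_b = h_b / h_b.sum()
    return float(np.minimum(h_a, h_b).sum())

def choose_exposure(first_img: np.ndarray, second_img: np.ndarray,
                    ev_first: float, ev_second: float,
                    state_difference: bool,
                    sim_threshold: float = 0.8) -> float:
    """Pick the exposure intensity for the next (third) shot."""
    similar = histogram_similarity(first_img, second_img) > sim_threshold
    if similar and not state_difference:
        # Scenes match and no user operation changed the AE mode:
        # blend toward the stored exposure so brightness stays consistent.
        return 0.5 * ev_first + 0.5 * ev_second
    # Otherwise keep the current automatic-exposure result unchanged.
    return ev_second
```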
With reference to the first aspect, in a possible implementation manner of the first aspect, the determining, by the electronic device, a third exposure intensity according to the first exposure intensity and the second exposure intensity specifically includes: the electronic device determines a first weight and a second weight; the first weight is the weight of the first exposure intensity; the second weight is the weight of the second exposure intensity; determining a third exposure intensity according to the first exposure intensity, the first weight, the second exposure intensity and the second weight; if the brightness difference between the first image and the second image is larger, the first weight is smaller, and the second weight is larger; if the time difference between the first image and the second image is larger, the first weight is smaller, and the second weight is larger.
In the solution provided in the present application, the electronic device may determine the third exposure intensity from the first weight, the second weight, the first exposure intensity, and the second exposure intensity. The first weight and the second weight depend on the time difference and the brightness difference between the first image and the second image, as sketched below. If the time difference and the brightness difference between the first image and the second image are large, the second exposure intensity is given greater weight when determining the third exposure intensity; otherwise, the first exposure intensity is given greater weight. That is, if the first and second images were acquired far apart in time and differ greatly in brightness, the user probably does not require high brightness consistency between the first image and the third image, so the electronic device does not lean too heavily on the exposure intensity used when acquiring the first image; the user's intent is thus fully considered.
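A minimal sketch of one such weighting rule follows. The exponential decay form and the constants tau and beta are assumptions chosen for illustration; the patent only requires that the first weight shrink (and the second grow) as the time gap and brightness gap between the two images increase.

```python
import math

def blend_weights(dt_seconds: float, brightness_gap: float,
                  tau: float = 30.0, beta: float = 0.05) -> tuple[float, float]:
    """Weights for the first/second exposure intensities.

    The first weight decays as the time gap and brightness gap grow,
    so the second (current) exposure dominates for dissimilar shots.
    """
    w_first = math.exp(-dt_seconds / tau) * math.exp(-beta * abs(brightness_gap))
    return w_first, 1.0 - w_first

def third_exposure(ev_first: float, ev_second: float,
                   dt_seconds: float, brightness_gap: float) -> float:
    w1, w2 = blend_weights(dt_seconds, brightness_gap)
    return w1 * ev_first + w2 * ev_second
```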
With reference to the first aspect, in a possible implementation manner of the first aspect, the determining, by the electronic device, a third exposure intensity according to the first exposure intensity and the second exposure intensity specifically includes: the electronic device judges whether a state difference exists between a first moment and a second moment; the first moment is the moment when the electronic device acquires the first image, and the second moment is the moment when the electronic device acquires the second image. A state difference means that, at the first moment and/or the second moment, the automatic exposure mode of the electronic device was changed by a user operation. If no state difference exists between the first moment and the second moment, the electronic device determines the third exposure intensity according to the first exposure intensity and the second exposure intensity.
In the solution provided by the application, before determining the third exposure intensity, the electronic device may determine whether a change in the automatic exposure mode due to a user operation occurred at the first moment or the second moment. For example, when the second image is acquired, if the user touches the display screen and triggers the touch AE mode, the automatic exposure system may increase the picture brightness in the area the user touched, which affects the adjustment of subsequent exposure parameters. Since the user in this case presumably wishes to change the picture brightness, the electronic device does not determine the third exposure intensity from the first exposure intensity. Such a determination gives more consideration to the user's intent.
With reference to the first aspect, in a possible implementation manner of the first aspect, the method further includes: if the electronic equipment has state difference between the first moment and the second moment, the electronic equipment obtains a fourth image by shooting under a second exposure intensity through a camera; the fourth image and the second image are two continuous frames of images shot by the electronic equipment through the camera.
In the scheme provided by the application, if the electronic device has a state difference between the first moment and the second moment, the user probably wants to change the brightness of the picture. Therefore, in this case, out of consideration for the user's intent, the electronic device captures the image directly at the second exposure intensity without referring to the first exposure intensity.
With reference to the first aspect, in a possible implementation manner of the first aspect, the second image and the third image are two consecutive frames of images captured by the electronic device through the camera.
In the solution provided in the present application, the second image and the third image may be two consecutive frames of images acquired by the electronic device. That is to say, adjacent images acquired by the user through the electronic device are similar in brightness, which improves the brightness consistency of the images the user obtains.
With reference to the first aspect, in a possible implementation manner of the first aspect, the method further includes: if the similarity between the content of the first image and the content of the second image is not higher than the first threshold value, the electronic equipment obtains a fifth image by shooting through the camera under the second exposure intensity; the fifth image and the second image are two continuous frames of images shot by the electronic equipment through the camera.
In the solution provided in the present application, if the similarity between the content of the first image and the content of the second image is not higher than the first threshold, the scene has likely changed, and the user probably does not require the new image's brightness to be consistent with the first image's. Therefore, in this case, out of consideration for the user's intent, the electronic device captures the image directly at the second exposure intensity without referring to the first exposure intensity.
In a second aspect, the present application provides an electronic device comprising a camera, one or more memories, and one or more processors coupled with the camera, the one or more memories being used to store computer program code, the computer program code comprising computer instructions. The camera is used for shooting at a first exposure intensity to obtain a first image and shooting at a second exposure intensity to obtain a second image, wherein the first image has a first brightness and the second image has a second brightness; the shooting time of the second image is later than that of the first image. The processor is used for determining whether the similarity between the content of the first image and the content of the second image is higher than a first threshold. The processor is further used for acquiring the first exposure intensity and the second exposure intensity when the similarity between the content of the first image and the content of the second image is higher than the first threshold, and determining a third exposure intensity according to the first exposure intensity and the second exposure intensity. The camera is further used for shooting at the third exposure intensity to obtain a third image; the third image has a third brightness, and the brightness difference between the third brightness and the first brightness is smaller than a second threshold.
With reference to the second aspect, in a possible implementation manner of the second aspect, the processor, when configured to determine the third exposure intensity according to the first exposure intensity and the second exposure intensity, is specifically configured to: determine a first weight and a second weight, the first weight being the weight of the first exposure intensity and the second weight being the weight of the second exposure intensity; and determine the third exposure intensity according to the first exposure intensity, the first weight, the second exposure intensity, and the second weight. If the brightness difference between the first image and the second image is larger, the first weight is smaller and the second weight is larger; if the time difference between the first image and the second image is larger, the first weight is smaller and the second weight is larger.
With reference to the second aspect, in a possible implementation manner of the second aspect, the processor, when configured to determine the third exposure intensity according to the first exposure intensity and the second exposure intensity, is specifically configured to: judge whether the electronic device has a state difference between a first moment and a second moment, the first moment being the moment when the electronic device acquires the first image and the second moment being the moment when the electronic device acquires the second image, where a state difference means that the automatic exposure mode of the electronic device was changed by a user operation at the first moment and/or the second moment; and if the electronic device has no state difference between the first moment and the second moment, determine the third exposure intensity according to the first exposure intensity and the second exposure intensity.
With reference to the second aspect, in a possible implementation manner of the second aspect, the camera is further configured to: if the electronic equipment has state difference between the first moment and the second moment, shooting under the second exposure intensity to obtain a fourth image; the fourth image and the second image are two continuous frames of images shot by the electronic equipment through the camera.
With reference to the second aspect, in a possible implementation manner of the second aspect, the second image and the third image are two consecutive frames of images captured by the electronic device through the camera.
With reference to the second aspect, in a possible implementation manner of the second aspect, the camera is further configured to: if the similarity between the content of the first image and the content of the second image is not higher than the first threshold value, shooting under a second exposure intensity to obtain a fifth image; the fifth image and the second image are two continuous frames of images shot by the electronic equipment through the camera.
In a third aspect, the present application provides a computer storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to perform any one of the possible implementations of the first aspect.
In a fourth aspect, an embodiment of the present application provides a chip applied to an electronic device, where the chip includes one or more processors, and the processor is configured to invoke computer instructions to cause the electronic device to execute any one of the possible implementation manners of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product including instructions, which, when run on a device, cause the electronic device to perform any one of the possible implementations of the first aspect.
It is understood that the electronic device provided by the second aspect, the computer storage medium provided by the third aspect, the chip provided by the fourth aspect, and the computer program product provided by the fifth aspect are all used to execute the method provided by the embodiments of the present application. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method, which are not described herein again.
Drawings
Fig. 1 is a schematic diagram of the hardware structure of an electronic device 100 according to an embodiment of the present application;
Fig. 2 is a schematic diagram of the software structure of an electronic device 100 according to an embodiment of the present application;
Figs. 3A-3G are a set of schematic user interfaces provided by embodiments of the present application;
Fig. 4 is a schematic diagram of time-lapse photography image acquisition provided by an embodiment of the present application;
Fig. 5 is a schematic diagram of a dynamic video obtained by time-lapse photography according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a dynamic video obtained by delayed shooting according to an embodiment of the present application;
Fig. 7 is a flowchart of an exposure intensity adjusting method according to an embodiment of the present application;
Figs. 8A-8C are schematic diagrams of another set of user interfaces provided by embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
It should be understood that the terms "first," "second," and the like in the description and claims of this application and in the drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
First, some terms and related technologies referred to in the present application are explained to facilitate understanding by those skilled in the art.
Exposure is the fundamental process and key feature of all photochemical imaging methods. A visible image can be obtained through exposure. Exposure modes, i.e., the ways in which a camera makes use of the light source, are generally classified into manual exposure, automatic exposure, and other modes. The quality of the picture is related to the exposure, i.e., to how much light should be let through so that the photosensitive element can obtain a clear image. The exposure amount is determined by the light-passing time (shutter speed) and the light-passing area (aperture size).
Time-lapse photography, also called time-lapse video recording, is a time-compressed shooting technique. It takes a group of photos and later links them in series into a video, compressing a process of minutes, hours, or even days into a short playable clip. In a time-lapse video, the slowly changing process of objects or scenery is compressed into a short time, presenting a striking spectacle that is normally imperceptible to the naked eye.
A RAW image is the raw data obtained when a CMOS or CCD image sensor converts the captured light signal into a digital signal; it is also referred to as a RAW image file.
Sensitivity, also known as the ISO value, is a measure of a film's sensitivity to light, determined by sensitometry and measured on several scales; it has been standardized by the International Organization for Standardization (ISO). A less sensitive film requires a longer exposure to achieve the same image as a more sensitive one, and is therefore commonly called a slow film; a highly sensitive film is accordingly called a fast film. In both digital and film photography, using a relatively high sensitivity to reduce the exposure time generally comes at the cost of reduced image quality (due to coarser film grain, higher image noise, or other factors).
Exposure is a very important step in photography, as it determines how much light the photosensitive element captures when an image is taken; that is, exposure determines the brightness (lightness or darkness) of the image. The exposure amount reflects this and is the parameter that expresses how much light the photosensitive element acquires. From the definition of the exposure amount, it follows that the exposure amount is determined by the exposure time, the light-passing area, and the ambient light intensity. The shutter speed determines the exposure time, and the aperture size determines the light-passing area. In general, when photographing under natural light, the ambient light can hardly be changed, but the ISO affects the sensitivity of the photosensitive element to light, so the ISO can be considered to affect the intensity of the ambient light acquired by the photosensitive element.
As described above, the exposure amount is affected by three factors, i.e., the exposure time (shutter speed), the aperture size, and ISO, and these three factors affecting the exposure can be referred to as exposure parameters.
It is understood that the aperture size in electronic devices such as mobile phones is fixed. To change the exposure, the electronic device can adjust the exposure time and the ISO. For ease of understanding and calculation, this embodiment uses the exposure intensity to characterize the actual exposure. For example, the exposure intensity in an electronic device such as a mobile phone can be expressed as: exposure intensity = exposure time × ISO.
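As a quick numeric check of this definition (a sketch, not production camera code): halving the exposure time while doubling the ISO leaves the exposure intensity unchanged.

```python
def exposure_intensity(exposure_time_s: float, iso: float) -> float:
    """Exposure intensity as defined in this embodiment: time x ISO."""
    return exposure_time_s * iso

# 1/100 s at ISO 400 and 1/50 s at ISO 200 give the same intensity (4.0):
assert exposure_intensity(1 / 100, 400) == exposure_intensity(1 / 50, 200)
```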
At present, many electronic devices such as cameras and mobile phones have Automatic Exposure (AE) systems, which can automatically adjust the exposure parameters without requiring manual adjustment by the user.
The process of adjusting the exposure parameters by the AE system is described below.
When the average luminance of the shooting environment is L_s, and the exposure parameters satisfy

F² / t = (L_s × S) / K,

the electronic device may determine that the brightness of the image is appropriate. Here t represents the exposure time, F represents the aperture size (f-number), L_s represents the average brightness of the ambient light, S represents the ISO, and K is a constant, typically determined by the camera manufacturer; for example, K = 12.5.

Transforming the above formula gives A_v = log₂(F²), T_v = log₂(1/t), S_v = log₂(S × N), and B_v = log₂(L_s / (N × K)), where N is a calibration constant. Substituting the transformed quantities yields the exposure equation: A_v + T_v = S_v + B_v. This exposure equation is an empirical formula originally used to calculate exposure parameters for film cameras.
Here, A_v represents the aperture value; each one-unit increase in A_v corresponds to halving the amount of incident light. T_v represents the shutter speed value; each one-unit increase in T_v corresponds to halving the amount of incident light. S_v represents the camera's sensitivity value; each one-unit increase in S_v doubles the camera's sensitivity to the same amount of incident light, which appears in the image as a doubling of brightness. B_v represents the average brightness of the ambient light (ambient brightness); each one-unit increase in B_v corresponds to a doubling of the average ambient brightness.
In the AE system, the common value of the two sides of the exposure equation is called E_v, i.e., E_v = A_v + T_v = S_v + B_v. E_v reflects the camera's exposure; equivalently, E_v reflects the camera's exposure intensity. The larger E_v is, the greater the exposure intensity and the brighter the image acquired by the camera; the smaller E_v is, the smaller the exposure intensity and the darker the image acquired by the camera. Understandably, once photometry is performed to obtain B_v, exposure parameters satisfying the exposure equation can be derived from this formula, so an image with appropriate brightness can be obtained.
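The equation can be checked numerically. In the sketch below, the APEX-style values are computed from concrete exposure parameters; K = 12.5 follows the text, while N = 0.3 (the conventional APEX speed constant) is an assumption here. Note that N cancels out of the two sides, so its exact value does not affect the check.

```python
import math

def apex_values(f_number: float, t: float, iso: float, l_s: float,
                n: float = 0.3, k: float = 12.5):
    """Compute (A_v, T_v, S_v, B_v) for the exposure equation."""
    a_v = math.log2(f_number ** 2)      # aperture value
    t_v = math.log2(1.0 / t)            # shutter (time) value
    s_v = math.log2(n * iso)            # sensitivity value
    b_v = math.log2(l_s / (n * k))      # ambient-brightness value
    return a_v, t_v, s_v, b_v

# Parameters chosen so that F^2 / t = (L_s x S) / K holds
# (4 x 125 = 62.5 x 100 / 12.5 = 500), hence the two sides agree:
a_v, t_v, s_v, b_v = apex_values(f_number=2.0, t=1 / 125, iso=100, l_s=62.5)
assert abs((a_v + t_v) - (s_v + b_v)) < 1e-9  # E_v = A_v + T_v = S_v + B_v
```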
However, after B_v is measured, multiple sets of exposure parameters (A_v, T_v, S_v) may satisfy the exposure equation. At this point, the AE system can select appropriate exposure parameters from an exposure table (exposure line map). The exposure table referred to here is a table of correspondences between exposure parameters and E_v; that is, by looking up the exposure table, the AE system can obtain appropriate exposure parameters.
Understandably, because the B_v obtained by photometry is not necessarily accurate, multiple iterations may be required to determine the final exposure parameters.
In addition, since the aperture size of electronic devices such as mobile phones is fixed, the AE system in these electronic devices determines the shutter speed and ISO from the measured ambient light brightness.
According to the above, when the AE system in the electronic device detects that the current ambient brightness has changed, it adjusts the exposure parameters so that the exposure equation remains satisfied. For example, when the AE system detects that the current ambient brightness is increasing, it may decrease E_v and adjust the exposure parameters according to the exposure table, ensuring that the image acquired by the electronic device is not too bright; when it detects that the current ambient brightness is decreasing, it may increase E_v and adjust the exposure parameters according to the exposure table, ensuring that the image acquired by the electronic device is not too dark.
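A toy illustration of such table-driven adjustment is sketched below. The table entries are invented for the example (real exposure tables are vendor-calibrated), and larger E_v corresponds to greater exposure intensity, matching the usage above.

```python
# (E_v, exposure_time_s, ISO): larger E_v means a longer exposure and/or
# higher ISO, i.e. a brighter image, as E_v is used in this text.
EXPOSURE_TABLE = [
    (6.0, 1 / 250, 100),
    (8.0, 1 / 125, 200),
    (10.0, 1 / 60, 400),
    (12.0, 1 / 30, 800),
]

def lookup_exposure(target_ev: float) -> tuple:
    """Return the table row whose E_v is closest to the target E_v."""
    return min(EXPOSURE_TABLE, key=lambda row: abs(row[0] - target_ev))

# When the scene brightens, the AE system lowers the target E_v and
# looks up gentler parameters so the image does not become too bright:
print(lookup_exposure(7.2))  # -> (8.0, 0.008, 200)
```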
It is understood that some user operations may cause the AE mode in the AE system to change. For example, a user touching the screen may trigger the touch AE mode, which applies high-weight metering to the area touched by the user, thereby affecting the measured B_v and causing the finally adjusted exposure parameters to no longer match the actual scene.
In this application, a change in the AE mode caused by a user operation is said to produce a state difference. For example, when the electronic device captures image X, there is no change in the AE mode due to a user operation. When the electronic device captures image Y, it detects a touch operation on the preview area and triggers the touch AE mode; that is, when image Y is acquired, there is a change in the AE mode caused by a user operation. Therefore, the electronic device has a state difference between acquiring image X and acquiring image Y.
When a user needs to capture a group of images with high consistency (including brightness consistency and content consistency) with the electronic device, a brief but drastic change in the environment during shooting can cause the ambient light brightness measured by the electronic device to change greatly (for example, becoming too high or too low), and the AE system of the electronic device then automatically adjusts the exposure parameters. Because the environmental change is brief and drastic, the electronic device may not have had time to readjust the exposure parameters when the environment returns to normal, and it still shoots with the parameters set during the change, so the finally stored image is too bright or too dark, and the consistency of the images stored by the user is poor.
It is understood that brief and drastic changes in the environment may include, but are not limited to, the camera being briefly occluded, someone walking in front of the lens, etc.
It should be noted that delayed shooting and continuous shooting (simply referred to as continuous shooting) are two typical ways for a user to shoot a group of images with high consistency.
The process of shooting time-lapse photography is similar to making stop-motion animation (Stop Motion): a number of pictures taken at the same shooting interval are connected in series into a dynamic video, and the slowly changing process of the scene is shown through visibly changing images. It can be understood that when a user obtains a group of highly consistent images through time-lapse photography, a brief and drastic environmental change during shooting makes the brightness of the captured images change unevenly, which may cause flicker in the finally generated dynamic video.
Similar to time-lapse photography, the user can also capture a group of highly consistent images by continuous shooting. Specifically, after the user triggers the continuous shooting mode, the electronic device automatically releases the shutter at fixed time intervals, producing multiple images shot at the same interval. It can be understood that when a user captures a group of highly consistent images by continuous shooting, a brief and drastic environmental change during shooting causes large brightness differences among the captured images. That is, the image brightness does not change smoothly and the consistency is poor.
The application provides an exposure intensity adjusting method and related equipment, which can analyze whether the current shooting scene is similar to the previous shooting scene and adjust the exposure intensity in combination with the state difference between the two shots. This improves the brightness consistency of the images the user acquires, spares the user the time of shooting again, and improves the user experience.
The following describes an apparatus according to an embodiment of the present application.
Fig. 1 is a schematic hardware structure diagram of an electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management Module 140, a power management Module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication Module 150, a wireless communication Module 160, an audio Module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor Module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the Processor 110 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
It is understood that an AE system may also be included in the processor 110. The AE system may be specifically provided in the ISP. The AE system can be used to implement automatic adjustment of exposure parameters. Alternatively, the AE system may also be integrated in other processor chips. The embodiments of the present application do not limit this.
In the embodiments provided herein, the electronic device 100 may execute the exposure intensity adjustment method through the processor 110.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices 100, such as AR devices and the like.
The charging management module 140 is configured to receive a charging input from a charger. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power Amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194.
The Wireless Communication module 160 may provide solutions for Wireless Communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The Display panel may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), an Active Matrix Organic Light-Emitting Diode (Active-Matrix Organic Light-Emitting Diode, AMOLED), a flexible Light-Emitting Diode (FLED), a Mini LED, a Micro-OLED, a Quantum Dot Light-Emitting Diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement the acquisition function via the ISP, camera 193, video codec, GPU, display screen 194, application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image or video visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image or video signal. And the ISP outputs the digital image or video signal to the DSP for processing. The DSP converts the digital image or video signal into image or video signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1. For example, in some embodiments, the electronic device 100 may acquire images of multiple exposure coefficients using the N cameras 193, and further, in video post-processing, the electronic device 100 may synthesize an HDR image by an HDR technique from the images of multiple exposure coefficients.
The digital signal processor is used for processing digital signals, and can process digital images or video signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image and video playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. The electronic device 100 may be provided with at least one microphone 170C.
The headphone interface 170D is used to connect a wired headphone.
The sensor module 180 may include 1 or more sensors, which may be of the same type or different types. It is understood that the sensor module 180 shown in fig. 1 is only an exemplary division, and other division is possible, which is not limited in this application.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for identifying the posture of the electronic equipment 100, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there are no objects near the electronic device 100.
The ambient light sensor 180L is used to sense the ambient light level.
The fingerprint sensor 180H is used to acquire a fingerprint.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
In one embodiment of the present application, a user may need to acquire a series of images using the electronic device 100 for time-lapse photography or continuous shooting. In a time-lapse or continuous shooting scenario, the electronic device 100 may be in the AE mode; that is, the electronic device 100 automatically adjusts the exposure. If the user touches the display screen 194 while previewing the series of images, the touch AE mode may be triggered. In the touch AE mode, the electronic device 100 adjusts the brightness at the position where the user touched the display screen and applies high-weight metering there. When the average brightness of the picture is calculated, the weight of the area touched by the user is significantly higher than that of other areas, so the calculated average brightness of the picture is closer to the average brightness of the touched area, as sketched below.
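A sketch of this high-weight metering follows; the weight of 8 for the touched region is an illustrative assumption, not a value from the patent.

```python
import numpy as np

def metered_brightness(gray: np.ndarray, touch_mask: np.ndarray,
                       touch_weight: float = 8.0) -> float:
    """Weighted average luminance of a grayscale frame.

    Pixels inside `touch_mask` count `touch_weight` times as much as the
    rest, pulling the average toward the touched region's brightness.
    """
    weights = np.where(touch_mask, touch_weight, 1.0)
    return float((gray * weights).sum() / weights.sum())
```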
The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
Fig. 2 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present disclosure.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the system is divided into four layers: from top to bottom, an application layer, an application framework layer, a runtime and system library layer, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications (also referred to as applications) such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The Application framework layer provides an Application Programming Interface (API) and a Programming framework for the Application program of the Application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a brief dwell and require no user interaction, such as notifying that a download is complete or providing a message alert. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen as a dialog interface. For example, text information may be prompted in the status bar, a prompt tone may be sounded, the electronic device may vibrate, or an indicator light may flash.
The Runtime (Runtime) includes a core library and a virtual machine. Runtime is responsible for scheduling and management of the system.
The core library comprises two parts: one part is a function that a programming language (for example, java language) needs to call, and the other part is a core library of the system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface Manager (Surface Manager), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), two-dimensional graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provides a fusion of two-Dimensional (2-Dimensional, 2D) and three-Dimensional (3-Dimensional, 3D) layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and a virtual card driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including touch coordinates, a time stamp of the touch operation, and other information). The raw input events are stored at the kernel layer. And the application program framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation, and taking a control corresponding to the click operation as a control of a camera application icon as an example, the camera application calls an interface of the application framework layer, starts the camera application, further starts the camera drive by calling the kernel layer, and captures a still image or a video through the camera 193.
The following describes a time-lapse photography scenario provided by the present application.
It is understood that the term "user interface" in the description and claims and drawings of the present application is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
1. Starting time-delay photography (fig. 3A-3D)
FIG. 3A illustrates an exemplary user interface 300 on the electronic device 100 for displaying the applications installed on the electronic device 100.
The user interface 300 displays a page with application icons placed therein, which may include a plurality of application icons (e.g., a weather application icon, a calendar application icon, an album application icon, a notes application icon, an email application icon, an application store application icon, a settings application icon, etc.). And page indicators can be displayed below the application icons to indicate the position relationship between the currently displayed page and other pages. Below the page indicator are a plurality of tray icons (e.g., camera application icon 310, browser application icon, information application icon, dialing application icon). The tray icon remains displayed upon page switching. The content displayed on the user interface 300 is not limited in the embodiment of the present application.
It is understood that the electronic apparatus 100 may detect a user operation (such as a touch/click operation) by the user on the camera application icon 310, and in response to the operation, the electronic apparatus 100 may display the photographing interface 400 illustrated in fig. 3B. The capture interface 400 may be a user interface of a default capture mode of the camera application on which a user may complete a capture. The camera application is an application for image shooting on electronic equipment such as a smart phone and a tablet computer, and the name of the application is not limited in the application. That is, the user may open the photographing interface 400 of the camera application by clicking the camera application icon 310. It can be understood that the default camera in the default photographing mode is not limited to be the rear camera, and the electronic device 100 may also set the front camera as the default camera, that is, after the camera application is started, the electronic device 100 may display the image captured by the front camera in the preview area 420, and may be used for the user to photograph through the default front camera.
Fig. 3B illustrates one user interface 400 for a camera application on an electronic device such as a smartphone. As shown in fig. 3B, the capture interface 400 may include a parameter adjustment area 410, a preview area 420, a camera mode option 430, an album shortcut control 441, a shutter control 442, a camera flip control 443.
Preview area 420 may be used to display a preview image. The preview image is an image acquired by the electronic device 100 in real time through a camera. The electronic device may refresh the display content in preview area 420 in real-time to facilitate the user to preview the image currently captured by the camera.
One or more shooting mode options may be displayed in the camera mode options 430. The one or more shooting mode options may include: a night mode option 431, an intelligent portrait mode option 432, a photograph mode option 433, a video recording mode option 434, and a more option 435. It is understood that the one or more shooting mode options may be presented on the interface as text information, such as "night scene", "smart portrait", "take picture", "record video", "more", and may also be presented as icons or other forms of Interactive Elements (IE), which is not limited in this application.
When a user operation acting on the photographing mode option is detected, the electronic apparatus 100 may turn on the photographing mode selected by the user. In particular, when user operations on the more option 435 are detected, the electronic device 100 may further display more other shooting mode options, such as a delayed shooting mode option, a burst mode option, and the like, which may present a richer camera function to the user. It is understood that more or fewer shooting mode options may be included in the camera mode options 430, and the camera mode options shown in fig. 3B are only one implementation manner of the present application and should not be considered as a limitation of the present application.
The album shortcut control 441 may be used to open an album application. In response to a user operation, such as a touch operation, acting on the album shortcut control 441, the electronic device 100 may open an album application.
The shutter control 442 may be used to listen for user actions that trigger a picture taking. Electronic device 100 may detect a user operation on shutter control 442, in response to which electronic device 100 may save the preview image in preview area 420 as a picture in an album application. In addition, the electronic apparatus 100 may also display thumbnails of the saved images in the album shortcut control 441. That is, the user can trigger a photograph by operating the shutter control 442. It is understood that the shutter control 442 may be a button or other form of control, as the present application is not limited thereto.
The camera flip control 443 can be used to monitor user operations that trigger flipping the camera. The electronic apparatus 100 may detect a user operation, such as a touch operation, acting on the camera flip control 443, and in response to the operation, the electronic apparatus 100 may flip the camera for shooting, such as switching the rear camera to the front camera or switching the front camera to the rear camera.
More or fewer controls may also be included in the shooting interface 400, which is not limited in this embodiment of the application.
Electronic device 100 may detect a user operation (such as a touch/click operation) on further options 435 by the user, in response to which electronic device 100 may display mode selection interface 500 shown in fig. 3C, mode selection interface 500 being a user interface displaying further shooting mode options.
Fig. 3C illustrates a mode selection interface 500. As shown in fig. 3C, mode-selection interface 500 may include a mode-selection area 510 and a camera mode option 430.
The mode selection area 510 includes a delayed photography mode option 511 and a continuous shooting mode option 512. It is understood that the mode selection area 510 may further include more shooting mode options, and the mode selection area 510 shown in fig. 3C is only one implementation manner of the present application and should not be considered as a limitation of the present application.
It is understood that the electronic device 100 may detect a user operation (such as a touch/click operation) by the user on the camera mode option 430. In response to the operation, the electronic device may display other user interfaces. For example, in response to a user action operation at the photograph mode option 433, the electronic device displays the photograph interface 400.
As shown in fig. 3C, the electronic device 100 may detect a user operation (such as a touch/click operation) applied to the delayed photography mode option 511 by the user. In response to this operation, the electronic device 100 may display a time-lapse camera interface 600 (shown in fig. 3D).
The electronic apparatus 100 may detect a user operation on the start delayed shooting control 642, in response to which the electronic apparatus 100 starts performing delayed shooting.
In an embodiment of the present application, the electronic device 100 automatically triggers the shutter at preset time intervals, and stores one frame of image after triggering the shutter, that is, one frame of image for forming the delayed video. It is understood that during the period between two consecutive shutter activations, the electronic device 100 will also capture an image, which is displayed in the preview area 420. That is, the image captured by the electronic device 100 between two consecutive triggers of the shutter is a preview image and is not stored in the album application.
It is understood that the preset time can be set according to actual needs, and the present application is not limited thereto. For example, the preset time may be 2 seconds(s).
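The interval-triggered capture described above can be illustrated with a short sketch. This is only an illustration under assumed helper names (capture_frame, save_frame and stop are not defined by this application):

    # Sketch of interval-triggered time-lapse capture; PRESET = 2 s as in the
    # example above. capture_frame/save_frame/stop are hypothetical callables.
    import time

    PRESET = 2.0  # preset interval in seconds (example value)

    def time_lapse(capture_frame, save_frame, stop):
        while not stop():
            save_frame(capture_frame())  # saved frames compose the delayed video
            time.sleep(PRESET)           # images captured meanwhile serve only as preview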
When the electronic apparatus 100 detects a user operation on the end-delay photographing control 742 (as shown in fig. 3E), the electronic apparatus 100 stops photographing in response to the operation. The electronic device 100 may combine the plurality of frames of images saved in the delayed photographing process into a dynamic video and save the dynamic video into the album application. In response to a user operation (such as a touch/click operation) acting on the album shortcut control 441, the electronic device 100 may open an album application. Therefore, the user can conveniently view the dynamic video obtained by adopting the time-delay shooting mode.
2. Adjusting the exposure intensity during the delayed photography to avoid the flicker phenomenon in the captured delayed photography video (FIGS. 3E-3G)
In response to a user operation acting on the time-lapse photography control 642 illustrated in fig. 3D, the electronic device 100 may display a time-lapse photography interface illustrated in fig. 3E.
As shown in fig. 3E, at 4s of the time-lapse photography, the electronic apparatus 100 triggers the shutter to save the image 1. It is understood that the image 1 is one frame image constituting the time-lapse video for photography obtained by photographing. For convenience of description, the luminance of the image 1 is denoted as luminance 1.
After electronic device 100 has triggered the shutter and saved image 1, it will continue to capture and display images in preview area 420. As shown in fig. 3F, image 2 is a preview image of one frame displayed in preview area 420 after electronic device 100 saves image 1, and specifically, image 2 is a preview image of the last frame before electronic device 100 takes the next shot. For convenience of description, the luminance of the image 2 is denoted as luminance 2.
When the electronic device 100 collects image 2, the camera is partially shielded by a falling leaf, so the ambient brightness becomes dark. After detecting that the ambient brightness has become dark, the AE system in the electronic device 100 increases Ev, so the exposure intensity also increases. Since the AE system needs multiple iterations to determine the final exposure parameters, it may still be increasing Ev after the leaf has fallen past and no longer blocks the camera. That is, when the leaf no longer blocks the camera, the AE system may still increase the exposure intensity; in other words, it cannot adjust the exposure parameters in time according to the current ambient brightness.
The electronic device 100 may perform similarity detection on image 2 and image 1, and determine that the contents of image 1 and image 2 are similar. The electronic apparatus 100 may also compare the difference in state between the moments when image 1 and image 2 were acquired, that is, determine whether there is a change in the automatic exposure mode caused by a user operation. (For example, if the user touched the display screen while the electronic device 100 was capturing image 2, the touchAE mode would be triggered.) By comparison, the electronic device 100 determines that there is no state difference between the acquisition of image 1 and image 2. The electronic device 100 then determines the fusion weight according to the weight measurement parameters (e.g., the shooting time and the brightness) and re-determines the exposure parameters according to the fusion weight, thereby avoiding the captured image being too bright because the AE system cannot adjust the exposure parameters in time.
It is understood that the similarity detection method, the state difference determination method, the fusion weight determination method, and the exposure intensity re-determination method will be described in the following embodiments, and will not be described herein.
As shown in fig. 3G, at the 6th s of the delayed shooting, the electronic apparatus 100 triggers the shutter to save the image 3. It can be understood that the exposure parameters employed by the electronic device 100 in acquiring the image 3 are the re-determined exposure parameters described above. For convenience of description, the luminance of the image 3 is denoted as luminance 3.
It can be understood that, in the above-mentioned time-lapse photographing process, the preset time is 2 s. That is, the electronic apparatus 100 automatically triggers the shutter every 2s, and stores one frame of image, that is, one frame of image for composing the delayed photographic video. Image 1 and image 3 are two frames of images that the electronic device 100 has twice triggered the shutter to save in succession.
As shown in fig. 4, fig. 4 is a schematic diagram of an image acquired during time-lapse photography. It is understood that the 2 nd frame image is image 1 and the 3 rd frame image is image 3. Between the 2 nd frame image and the 3 rd frame image, the electronic device 100 also acquires a preview image 3 and a preview image 4. The preview image 4 is the preview image of the last frame before the 3 rd frame image is captured. That is, the preview image 4 is the image 2. When the electronic device 100 captures the preview image 4, a part of the camera is blocked by fallen leaves, so that the preview image 4 is darker than other images (such as a 2 nd frame image) captured by the electronic device 100 before. That is, luminance 2 is smaller than luminance 1. Since the electronic device 100 does not adopt the exposure parameters adjusted by the AE system before acquiring the 3 rd frame image, but re-determines the exposure parameters according to the fusion weight, the brightness difference between the 3 rd frame image and the 2 nd frame image is not large. That is, the difference between the luminance 1 and the luminance 3 is not large.
When the electronic apparatus 100 detects a user operation on the end delayed photography control 742, the electronic apparatus 100 ends the delayed photography in response to the operation. The images saved during the delayed shooting form a video, namely the delayed photography video. As shown in fig. 5, the luminance difference between the 2nd frame image and the 3rd frame image of the time-lapse video is small, that is, the luminance of the entire video changes smoothly, and the flicker phenomenon (as shown in fig. 6) caused by a large luminance difference between the 2nd frame image and the 3rd frame image does not occur. The 3rd frame image in the time-lapse video shown in fig. 6 is significantly brighter than the other images because the exposure parameters adopted by the electronic apparatus 100 when acquiring the 3rd frame image had changed: the AE system had adjusted the exposure parameters for a relatively dark picture. According to the adjustment principle of the AE system, the adjusted exposure intensity increases, and the corresponding exposure parameters also increase. However, when the 3rd frame image was shot, the brightness of the picture had already been restored to the state before the leaf blocked the camera, and an image with suitable brightness could have been obtained without increasing the exposure parameters; hence the flicker phenomenon shown in fig. 6.
As can be seen from the above, time-lapse photography and continuous shooting are two typical ways for a user to acquire a group of images with high consistency. It can be understood that the continuous shooting scene is similar to the time-lapse photography scene; the above description takes only the time-lapse scene as an example, and the continuous shooting scene can be understood by reference to it, so it is not described again here.
It should be noted that the application scenario of the present application is not limited to the delayed shooting scenario and the continuous shooting scenario, and the exposure intensity adjustment method provided in the embodiment of the present application may also be applicable to other shooting scenarios in which multiple frames of images are continuously shot within a period of time. For example, a video recording scene, a scene in which a panoramic image is captured, and the like.
The following describes a specific implementation of the embodiment of the present application with reference to a flowchart shown in fig. 7.
S701: the electronic device 100 performs similarity detection on an image a and an image B, namely, determines whether the image a and the image B are similar, wherein the image a is an image saved in the last shooting, and the image B is a last frame preview image on the display screen before the current shooting.
Specifically, the electronic device 100 obtains the last frame of preview image in the display screen before the shooting, and in order to simplify the subsequent description, may record the image that is finally saved after the shooting at the last time as an image a, record the last frame of preview image in the display screen before the shooting at this time as an image B, and detect the similarity between the image a and the image B by using an average hash (aHash) algorithm. It is understood that the similarity referred to herein is the similarity of image contents.
It will be appreciated that one frame of image may be saved for each shot. In the delayed photography scenario, the saved images constitute the finally obtained delayed photography video. In addition, one frame of image can be saved in the last shooting and the current shooting respectively, and two frames of images saved by the two shooting can be two continuous frames of images in the delayed shooting video.
The preview image acquired by the electronic device 100 may be a preview image of one frame within a period of time (for example, 1ms) before the current shooting, and is not necessarily a preview image of the last frame before the current shooting.
In one embodiment of the present application, the electronic device 100 may perform similarity detection on an image D and an image B, that is, determine whether the image D and the image B are similar, where the image D is a last frame preview image captured last time.
The specific steps for detecting the content similarity of the image A and the image B by utilizing the aHash algorithm are as follows:
1. the image size is reduced.
The sizes of image A and image B are reduced, and the reduced images are referred to as image a and image b, respectively. It is understood that the reduced size can be set according to the actual situation. For example, image A and image B may each be reduced to 16 × 16 px. Reducing image A and image B to the same size removes image details and keeps only basic information such as structure and brightness; it also ensures that the subsequent image hashes have the same length, which makes the distance calculation convenient.
2. And (5) graying processing.
It is understood that the images a and B acquired are generally color images, so the images a and B (images a and B) after size reduction are also generally color images. The second step of the aHash algorithm is to perform graying processing on the image a and the image b to obtain a grayscale image a and a grayscale image b, i.e. to convert the image a and the image b into grayscale images.
First, a color image and a grayscale image (grayscale image) will be described:
a digital image is composed of pixels, which are two-dimensional elements, each having a specific position and magnitude. The digital image may include color images and grayscale images. The pixels of a color image are typically represented by three components, Red (R), Green (G) and Blue (B) (RGB color space), i.e. a two-dimensional matrix of triads of Red, Green and Blue, each value of the triad being between 0-255. Pixels of a grayscale image are represented by a value (grayscale value) ranging from 0 to 255, with 0 representing black and 255 representing white and the other values representing a grayscale between black and white.
Thus, converting a color image to a grayscale image requires R, G, B three components to be converted to one component (denoted by Gray), including but not limited to the following:
① Floating-point algorithm: Gray = R*0.3 + G*0.59 + B*0.11;
② Integer method: Gray = (R*30 + G*59 + B*11)/100;
③ Shifting method: Gray = (R*76 + G*151 + B*28) >> 8;
④ Average value method: Gray = (R + G + B)/3;
⑤ Taking green only: Gray = G.
3. The pixel mean is calculated.
The average of the gray values (Gray) of all pixels of grayscale image a and grayscale image b is calculated respectively; the average gray value of the pixels of grayscale image a is denoted m, and the average gray value of the pixels of grayscale image b is denoted n. For example, if image A and image B were reduced to 16 × 16 px in step 1, then after the graying process two 16 × 16 integer matrices are obtained, whose elements are the gray values; the average of all elements of each of the two matrices is calculated.
4. The pixel gray values are compared.
Traversing the gray value of each pixel of the gray image a, comparing the gray value of each pixel in the gray image a with m, if the gray value of the pixel is greater than m, recording as 1, otherwise, recording as 0. Similarly, the gray value of each pixel of the gray image b is compared with n by traversing the gray value of each pixel of the gray image b. If the gray scale value of the pixel is larger than n, the value is marked as 1, otherwise, the value is marked as 0.
5. A hash value is calculated.
The comparison results of the previous step (step 4) are combined together to form a character string; this character string is the "fingerprint" of the image and may also be called its hash value. For example, if image A and image B were reduced to 16 × 16 px in step 1, the comparison results of the previous step combine into a 256-bit character string. That is, the fingerprints of image a and image b are each 256-bit character strings.
It should be noted that when the comparison results of the previous step are combined together, the combination order is not important, but the grayscale images a and b need to adopt the same combination order.
6. The hamming distance is calculated.
In the information theory, the hamming distance between two character strings with equal length is the number of different characters at the corresponding positions of the two character strings. And comparing the number of different characters at the corresponding positions of the 2 character strings obtained in the previous step, namely obtaining the Hamming distance between the image a and the image b. It can be appreciated that a larger hamming distance indicates a more inconsistent image from image to image. Conversely, a smaller hamming distance indicates that the two images are more similar.
In some embodiments of the application, when the ratio of the Hamming distance to the fingerprint length does not exceed a preset threshold, image A and image B are similar, that is, the scenes of the two shots (this shot and the last shot) are similar, and the subsequent steps continue to be executed. Otherwise, image A and image B are dissimilar, that is, the scenes of the two shots are dissimilar; the exposure parameters are not adjusted, the current exposure parameters are used directly for this shot, and the captured image is saved.

In other embodiments of the present application, the Hamming distance itself is compared with a preset threshold: when the Hamming distance does not exceed the threshold, image A and image B are similar and the subsequent steps continue to be executed. Otherwise, image A and image B are dissimilar, the exposure parameters are not adjusted, the current exposure parameters are used directly for this shot, and the captured image is saved.
It will be appreciated that the fingerprints may also be divided and compared, for example, 64-bit fingerprints may be divided into groups of 4, and then compared in hexadecimal.
It should be noted that the preset threshold may be set according to actual requirements, and the present application is not limited to this. In one embodiment of the present application, the preset threshold is 0.25. In another embodiment of the present application, the preset threshold is 8.
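Putting steps 1-6 together, a minimal sketch of the aHash similarity check might look as follows. It assumes the Pillow library, a 16 × 16 reduction, and the ratio threshold of 0.25 from the example above; the file names are placeholders:

    # Minimal aHash similarity sketch following steps 1-6 above.
    from PIL import Image

    def ahash(path, size=16):
        img = Image.open(path).resize((size, size)).convert("L")  # steps 1-2
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)                          # step 3
        return "".join("1" if p > mean else "0" for p in pixels)  # steps 4-5

    def hamming(fp_a, fp_b):
        # step 6: count positions where the two fingerprints differ
        return sum(c1 != c2 for c1, c2 in zip(fp_a, fp_b))

    fp_a, fp_b = ahash("image_A.png"), ahash("image_B.png")
    similar = hamming(fp_a, fp_b) / len(fp_a) <= 0.25  # ratio-based threshold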
Of course, other methods can be used to detect the similarity between image a and image B. For example, perceptual hash (pHash) algorithm, difference hash (dHash) algorithm, histogram method, and similarity determination using peak signal to noise ratio (PSNR) or Structural Similarity (SSIM). It is understood that the method adopted by the application for detecting the similarity of the image A and the image B is not limited.
S702: the electronic apparatus 100 determines a state difference between a time a at which the image a is acquired and a time B at which the image B is acquired, the state difference being a change in the automatic exposure mode caused by a user operation. Specifically, the electronic apparatus 100 detects whether there is a state difference between the two times (time a and time B) at which the image a and the image B are acquired, that is, whether there is a change in the AE mode caused by a user operation. For example, a user touching the display screen may trigger the touchAE mode. At this time, the electronic apparatus 100 may increase the brightness of the corresponding screen area touched by the user, and may perform high-weight photometry on the user-touched area. When the average brightness of the picture is calculated subsequently, the weight of the touch area is obviously higher than that of other areas, so that the calculated average brightness of the picture is similar to the brightness of the touch area. It should be noted that the touchAE mode refers to a mode for clicking a screen in the preview process, and the electronic device 100 adjusts the brightness of the corresponding area clicked by the user in the screen. It is understood that the user operation may include touch, click, sound, gesture, etc., which is not limited in this application.
It can be understood that if the state difference exists between the time A and the time B, namely the state difference exists between the current shooting and the last shooting, the subsequent steps are not required to be executed; and if the state difference does not exist between the time A and the time B, namely the state difference does not exist between the current shooting and the last shooting, continuing to execute the subsequent steps.
Electronic device 100 may determine from table 1 whether there is a difference in state between time a and time B.
TABLE 1

    Change in AE mode caused by a user operation
    At time A        At time B        State difference between time A and time B
    Yes              Yes              Yes
    Yes              No               Yes
    No               Yes              Yes
    No               No               No
1. At both time A and time B, there is a change in the AE mode caused by a user operation
It can be understood that if there is a change in the AE mode caused by the user operation at both time a and time B, it indicates that the user may wish to change the brightness of the screen at both time a and time B. For example, a user may wish to highlight a certain area in the picture. In this case, it is determined that there is a state difference between the time a and the time B, and the photographing is performed directly according to the exposure parameters obtained from the changed AE mode without performing the subsequent steps.
2. There is a change in the AE mode caused by a user operation at time a, and there is no change in the AE mode caused by a user operation at time B
It can be understood that if there is a change in the AE mode caused by the user operation at time a and there is no change in the AE mode caused by the user operation at time B, it indicates that the user may wish to change the brightness of the screen at time a. For example, a user may wish to highlight a certain area in a picture. In this case, it is determined that there is a state difference between the time a and the time B, and the photographing is performed directly according to the exposure parameters obtained from the changed AE mode without performing the subsequent steps.
3. There is no change in the AE mode by the user operation at time a, and there is a change in the AE mode by the user operation at time B
It can be understood that if there is no change in the AE mode caused by the user operation at time a, and there is a change in the AE mode caused by the user operation at time B, it indicates that the user may wish to change the brightness of the screen at time B. For example, a user may wish to highlight a certain area in the picture. In this case, it is determined that there is a state difference between the time a and the time B, and shooting is performed directly according to the current exposure parameter without referring to the exposure parameter at the time of the last shooting, and subsequent steps are not performed.
4. There is no change of the AE mode caused by the user operation at the time a and the time B
It can be understood that if there is no change in the AE mode caused by the user operation at time a and time B, it indicates that the user may not wish to change the brightness of the screen. In this case, it is determined that there is no state difference between the time a and the time B, the exposure parameter at the time of the last shooting has a reference meaning, and the subsequent steps can be continuously performed.
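The four cases of Table 1 reduce to a simple rule: a state difference exists whenever a user-triggered AE mode change occurred at time A or at time B (or both). A one-line sketch (function name is ours):

    # Sketch of the Table 1 decision: only when neither moment saw a
    # user-triggered AE mode change do the subsequent steps proceed.
    def has_state_difference(ae_changed_at_a, ae_changed_at_b):
        return ae_changed_at_a or ae_changed_at_b

Only the fourth case, has_state_difference(False, False), continues to step S703.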
S703: the electronic device 100 determines the fusion weight according to a weight parameter, where the weight parameter includes one or more of the following: time difference, luminance ratio of image a and image B.
The electronic device 100 determines the fusion weights according to weight metric parameters (e.g., the time difference and the luminance ratio). Specifically, the electronic apparatus 100 sets the fusion weight w_t according to the time difference and sets the fusion weight w_lux according to the luminance ratio. The basic principle for setting the fusion weights is as follows: the smaller the time difference, the higher the fusion weight w_t; the closer the luminance ratio is to 1, the higher the fusion weight w_lux.
It is understood that the time difference refers to a difference between times when the electronic apparatus 100 automatically triggers the shutter at the time of two shots, and the brightness ratio refers to a ratio of ambient brightness (lv) when the image a and the image B are acquired.
Therefore, before determining the fusion weight, the electronic device 100 needs to obtain the shutter trigger time when acquiring the images a and B, and the ambient brightness. The moment of triggering the shutter is stored in the electronic device 100, and can be called according to the storage address; the ambient brightness can be estimated in lv.
It should be noted that the ambient brightness lv is calculated by a specific formula (given as an image in the original, not reproduced here) involving the following quantities:

wherein A represents the aperture value; avgLuma represents the average brightness value of the Raw image; Exposure Time denotes the exposure time in seconds (s); ISO represents the current sensitivity of the electronic apparatus 100; and C is a constant with the specific value 100/46.
It will be understood that the aperture value expresses the size of the aperture: the F-number equals the lens focal length divided by the effective aperture diameter, and is denoted f/. The complete series of aperture values is as follows: f/1.0, f/1.4, f/2.0, f/2.8, f/4.0, f/5.6, f/8.0, f/11, f/16, f/22, f/32, f/44, f/64. The diaphragm determines the amount of light entering the lens. With the shutter unchanged: the smaller the number after f/, the larger the aperture, the more light enters, the brighter the picture, the narrower the focal plane, and the more blurred the background behind the subject; the larger the number after f/, the smaller the aperture, the less light enters, the darker the picture, the wider the focal plane, and the sharper both the foreground and background of the subject.
It is understood that the RGB color space is one of the most commonly used color spaces, and Raw images generally employ an RGB color space. There are various methods for calculating the average luminance value of the Raw image, for example:
① Convert the Raw image from the RGB color space to the XYZ color space. In the RGB color space a color image is generally represented by the three components Red (R), Green (G) and Blue (B); in the XYZ color space a color is represented by the tristimulus values X (red primary stimulus), Y (green primary stimulus) and Z (blue primary stimulus). Take the average of the Y components of all pixels of the converted image as the average brightness value.
② Convert the Raw image from the RGB color space to the Yxy color space, in which colors are defined by the two components x (dominant wavelength of hue) and y (saturation) and the Y component represents brightness. Take the average of the Y components of all pixels of the converted image as the average brightness value.
③ Convert the Raw image from the RGB color space to the YUV color space, in which Y represents lightness and U and V represent chroma (describing color and saturation). Take the average of the Y components of all pixels of the converted image as the average brightness value.
④ Do not convert the color space; directly take the average of the three components R, G and B of the Raw image as the average brightness value.
⑤ Do not convert the color space; directly take the average of the G components of all pixels of the Raw image as the average brightness value.
It is understood that the above calculation methods are only examples given in the present application, and there are other methods of calculating the average luminance value of the Raw image, which are not limited in the present application.
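As a concrete illustration, the following sketch computes the average brightness two of these ways: via a luma weighting consistent with the floating-point graying formula given earlier (the weights are our assumption, not mandated by this application), and via the G component only (method ⑤):

    # Sketch: two of the average-brightness computations described above.
    import numpy as np

    def avg_luma_weighted(rgb):
        # rgb: H x W x 3 array (R, G, B); weights follow the floating-point
        # graying formula above (an assumption for illustration)
        y = rgb[..., 0] * 0.3 + rgb[..., 1] * 0.59 + rgb[..., 2] * 0.11
        return float(y.mean())

    def avg_luma_green(rgb):
        # method 5): average of the G components only
        return float(rgb[..., 1].mean())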
In addition, the Exposure Time and ISO may be read directly within a sensor (e.g., a CMOS image sensor, etc.), or when the electronic apparatus 100 adopts the AE mode, the parameters may be stored in a specified memory address, and the electronic apparatus 100 acquires the parameters by accessing the specified memory address.
In one embodiment of the present application, the fusion weights w_t and w_lux are set as shown in Tables 2 and 3. Specifically, as shown in Table 2, with the time difference denoted T: when T ≤ 3, w_t = 1; when T = 3.5, w_t = 0.5; when T ≥ 4, w_t = 0; when 3 < T < 3.5, linear interpolation gives

    w_t = 1 - (T - 3) * (1 - 0.5)/(3.5 - 3);

when 3.5 < T < 4, linear interpolation gives

    w_t = 0.5 - (T - 3.5) * (0.5 - 0)/(4 - 3.5).

As shown in Table 3, with the luminance ratio denoted X: when X ≤ 0.7, w_lux = 0; when X = 1, w_lux = 0.8; when X ≥ 1.3, w_lux = 0; when 0.7 < X < 1, linear interpolation gives

    w_lux = 0.8 * (X - 0.7)/(1 - 0.7);

when 1 < X < 1.3, linear interpolation gives

    w_lux = 0.8 * (1.3 - X)/(1.3 - 1).
TABLE 2

    Time difference T (seconds)    ≤ 3        3.5        ≥ 4
    w_t                            100/100    50/100     0/100

TABLE 3

    Luminance ratio X              ≤ 0.7      1          ≥ 1.3
    w_lux                          0/100      80/100     0/100
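A sketch of the weight lookups of Tables 2 and 3, with linear interpolation between the tabulated break points (function names are ours):

    # Sketch of the fusion-weight lookups from Tables 2 and 3.
    def weight_time(t_diff):
        # w_t from Table 2, interpolated linearly between break points
        if t_diff <= 3:
            return 1.0
        if t_diff >= 4:
            return 0.0
        if t_diff <= 3.5:
            return 1.0 - (t_diff - 3) * (1.0 - 0.5) / (3.5 - 3)
        return 0.5 - (t_diff - 3.5) * (0.5 - 0.0) / (4 - 3.5)

    def weight_lux(ratio):
        # w_lux from Table 3, peaking at a luminance ratio of 1
        if ratio <= 0.7 or ratio >= 1.3:
            return 0.0
        if ratio <= 1.0:
            return 0.8 * (ratio - 0.7) / (1.0 - 0.7)
        return 0.8 * (1.3 - ratio) / (1.3 - 1.0)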
S704: the electronic device 100 determines the exposure intensity when acquiring the image C, which is the image saved in the current shooting, according to the exposure intensity when acquiring the image a, the exposure intensity when acquiring the image B, and the determined fusion weight.
Specifically, the electronic apparatus 100 redetermines the exposure intensity, that is, determines the exposure intensity at the time of acquiring the image C, based on the exposure intensity at the time of acquiring the image a, the exposure intensity at the time of acquiring the image B, and the fusion weight determined in step S703. The image a is an image stored in the last shooting, and the image C is an image stored in the current shooting. The electronic device 100 may determine the exposure intensity at the time of acquiring the image C according to the following formula:
newEV = (1 - w_t*w_lux)*curEV + w_t*w_lux*lastEV.
It is understood that lastEV = lastET * lastISO, where lastEV is the exposure intensity when image A was acquired, lastET is the exposure time when image A was acquired, and lastISO is the ISO when image A was acquired.

It is understood that curEV = curET * curISO, where curEV is the exposure intensity when the last frame preview image (image B) before the current shooting was acquired, curET is the corresponding exposure time, and curISO is the corresponding ISO.

It is understood that newEV = newET * newISO, where newEV is the exposure intensity when image C is acquired, newET is the exposure time when image C is acquired, and newISO is the ISO when image C is acquired.

In addition, w_t and w_lux are the fusion weights; for their determination, refer to step S703.
It can be understood that the exposure intensity is used to characterize the exposure amount, so the amount of light received by the photosensitive element can be determined by the value of the exposure intensity. In the embodiment of the present application, only the value of the exposure intensity is considered, and the unit of the exposure intensity is not considered, but if the exposure intensity is calculated by multiplying the exposure time by ISO, it is necessary to ensure that the units of the exposure time are consistent, for example, the exposure time is in seconds(s).
Illustratively, the exposure parameters when acquiring image A include lastET = 1/30 and lastISO = 300, so the exposure intensity when image A was acquired is lastEV = lastET * lastISO = 10. The exposure parameters when acquiring image B include curET = 1/20 and curISO = 400, so the exposure intensity when image B was acquired is curEV = curET * curISO = 20. If similarity detection shows that image A and image B are similar and there is no state difference between them, the electronic device may determine the fusion weights and then the exposure intensity for acquiring image C. Specifically, if the time difference between the two shots is 3 and the lv ratio of the two shots (lv of this shot / lv of the last shot) is 1, then according to Tables 2 and 3: w_t = 1 and w_lux = 0.8, which gives newEV = (1 - 1*0.8)*20 + 1*0.8*10 = 12.
As the above example shows, when image A and image B are captured in similar scenes without a state difference, the shorter the time difference between the two shots and the closer the ambient brightness, the higher the fusion weight, i.e., the larger the influence of the exposure intensity of the last shot on the re-determined exposure intensity.
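The fusion of step S704 and the worked example above can be sketched as follows (function name is ours):

    # Sketch of the exposure-intensity fusion in step S704.
    def fuse_exposure(last_ev, cur_ev, w_t, w_lux):
        w = w_t * w_lux
        return (1 - w) * cur_ev + w * last_ev

    # Worked example from above: lastEV = 10, curEV = 20, w_t = 1, w_lux = 0.8.
    new_ev = fuse_exposure(10.0, 20.0, 1.0, 0.8)
    assert abs(new_ev - 12.0) < 1e-9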
Consider a case in which, when the last frame preview image (image B) before the current shooting is acquired, a passerby walks through the picture. The electronic device 100 detects a face, which triggers a change of the AE mode, so that the brightness of the face region is increased. Since this change of the AE mode may not reflect the user's subjective intention, the electronic device may set lower limits on the fusion weights w_t and w_lux when re-determining the exposure parameters, so that the exposure parameters of the last shot are referenced more heavily. For example, the electronic device may set the lower limit of w_t to 0.8 and the lower limit of w_lux to 0.6, i.e., w_t ≥ 0.8 and w_lux ≥ 0.6. It is understood that the lower limits of the fusion weights may be preset according to the actual situation, which is not limited in this application.
Illustratively, one image is saved every 2s in a certain time-lapse photography, and the current photography and the last photography are two times of photography in the time-lapse photography. Fig. 8A-8C are some user interfaces during these two shots.
As shown in fig. 8A, the image displayed in preview area 420 is an image captured at the time of the delayed shooting at 30s, that is, an image (image a) finally saved in the last shooting.
As shown in fig. 8B, the image displayed in preview area 420 is a preview image acquired at the time of the delayed shooting at 31s, that is, a preview image (image B) of the last frame displayed by electronic apparatus 100 before the current shooting. When the delayed photography is performed in the 31 st s, the electronic device 100 detects that a face frame appears in the picture according to a face detection algorithm, an AE system in the electronic device 100 automatically increases the brightness of the face region, then calculates the average brightness of the picture, and then adjusts the exposure parameter according to the average brightness of the picture.
It can be understood that if the electronic apparatus 100 detects that the image saved by the last shooting (image A) and image B are similar and there is no state difference between them, the electronic apparatus 100 determines the fusion weights according to the time difference and brightness ratio of the two shots and re-determines the exposure intensity. In re-determining the exposure intensity, lower limits may be set for the fusion weights w_t and w_lux, for example w_t ≥ 0.8 and w_lux ≥ 0.6. When a fusion weight determined from the time difference or the ambient brightness is smaller than its lower limit, the lower limit is used directly as the fusion weight in this calculation (the re-determination of the exposure intensity).
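Applying these lower limits before the fusion could look like the following sketch (the floor values 0.8 and 0.6 are the example values given above):

    # Sketch: clamp the fusion weights to their example lower limits before fusing.
    W_T_FLOOR, W_LUX_FLOOR = 0.8, 0.6  # example floors from above

    def clamped_weights(w_t, w_lux):
        return max(w_t, W_T_FLOOR), max(w_lux, W_LUX_FLOOR)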
As shown in fig. 8C, the image displayed in preview area 420 is the image captured at the 32nd s of the delayed shooting, that is, the image finally saved in the current shooting (image C). It can be understood that the difference in brightness between image C and image A is small. If the exposure parameters were not adjusted by the present scheme, the finally saved image would be too bright, so the brightness difference between image C and image A would be large, the brightness change of the finally generated video would not be smooth, and the user experience would suffer.
It is understood that in other scenarios, the fusion weight may be set, and the present application is not limited thereto.
It is understood that after the electronic device 100 re-determines the exposure intensity, the exposure time and ISO may be determined according to the relation among exposure intensity, exposure time and ISO (exposure intensity = exposure time * ISO). The electronic device 100 may also store a lookup table of exposure intensity, exposure time and ISO, and when the electronic device 100 re-determines the exposure intensity, the exposure time and ISO may be reset according to this table.
The electronic device 100 may perform the current shooting with the reset exposure time and ISO, and store the captured image; relative to the image stored in the previous shooting, the brightness of this image changes smoothly and the consistency is good.
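One possible way to reset the exposure time and ISO from the re-determined intensity is a nearest-entry lookup in a stored table, as mentioned above; the table entries below are illustrative only:

    # Sketch: choose (exposure time, ISO) for a re-determined exposure intensity
    # from a stored lookup table; entries here are illustrative only.
    EV_TABLE = [
        # (exposure intensity, exposure time in s, ISO)
        (10.0, 1/30, 300),
        (12.0, 1/25, 300),
        (20.0, 1/20, 400),
    ]

    def reset_exposure_params(new_ev):
        # pick the table row whose intensity is closest to new_ev
        ev, et, iso = min(EV_TABLE, key=lambda row: abs(row[0] - new_ev))
        return et, iso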
It should be noted that, in other embodiments of the present application, the exposure intensity adjustment method provided by the present application may further include only step S701, step S703, and step S704 shown in fig. 7. That is, without determining the difference in the state of the two shots, the fusion weight may be determined based on the time difference between the two shots and the ambient brightness, and the exposure intensity may be newly determined.
It should be noted that the electronic device mentioned in the claims may be the electronic device 100 in the embodiment of the present application.
In the embodiment of the application, the electronic device 100 may acquire a first image and a second image captured by a camera. The first image may be the image a or the image D of the foregoing embodiment, and the second image may be the image B of the foregoing embodiment. The first image may also be image 1 shown in fig. 3E. The second image may also be image 2 shown in fig. 3F. In addition, the first exposure intensity may be lastEV of the foregoing embodiment, and the second exposure intensity may be curEV of the foregoing embodiment. The first luminance may be the luminance 1 of the foregoing embodiment, and the second luminance may be the luminance 2 of the foregoing embodiment.
In this embodiment, the electronic device 100 may perform similarity detection on the first image and the second image, and determine whether the similarity between the content of the first image and the content of the second image is higher than a first threshold. The first threshold may be a preset threshold mentioned in step S701. In addition, the similarity between the content of the first image and the content of the second image may be represented by a hamming distance, which may specifically refer to step S701 and is not repeated herein.
In this embodiment, the electronic device 100 may determine a third exposure intensity according to the first exposure intensity and the second exposure intensity, and obtain a third image by shooting through the camera at the third exposure intensity. Wherein the third exposure intensity may be newEV of the previous embodiment. The third image may be the image C of the foregoing embodiment, and may also be the image 3 shown in fig. 3G. In addition, the third luminance may be the luminance 3 of the foregoing embodiment.
In the implementation of the present application, the difference between the brightness of the third image and the brightness of the first image is smaller than the second threshold. It can be understood that the third image and the first image are both images acquired by the user through the electronic device 100, and the brightness change of the third image and the first image is smooth. It should be noted that the brightness difference mentioned here may be a difference value of average brightness of the first image and the third image. In addition, the second threshold may be set according to actual needs, and the application is not limited to this.
In an embodiment of the application, the electronic device 100 may determine a first weight and a second weight, and determine the third exposure intensity from the first weight, the second weight, the first exposure intensity, and the second exposure intensity. The first weight may be w_t*w_lux of the foregoing embodiments, and the second weight may be (1 - w_t*w_lux) of the foregoing embodiments. If the brightness difference between the first image and the second image is larger, the first weight is smaller and the second weight is larger; if the time difference between the first image and the second image is larger, the first weight is smaller and the second weight is larger. It is understood that the brightness difference can be represented by the above-mentioned luminance ratio: the closer the luminance ratio is to 1, the smaller the brightness difference. The time difference may be represented by the time difference mentioned above: the larger the time difference, the smaller the first weight and the larger the second weight.
In the embodiment of the present application, the electronic device 100 may determine whether the electronic device 100 has a status difference between the first time and the second time. The first time may be the time a of the foregoing embodiment, and the second time may be the time B of the foregoing embodiment.
In the embodiment of the present application, if the electronic device 100 has a state difference between the first time and the second time, the electronic device 100 obtains a fourth image by shooting with the camera at the second exposure intensity; the fourth image and the second image are two consecutive frames of images captured by the electronic device 100 through the camera. That is, if the electronic apparatus 100 has a state difference between the first time and the second time, the electronic apparatus 100 may directly shoot using the exposure intensity at which the second image was acquired. In this case, the user most likely wishes to change the brightness of the picture, so the electronic device 100 does not readjust the exposure intensity.
In this embodiment of the application, if the similarity between the content of the first image and the content of the second image is not higher than the first threshold, the electronic device 100 obtains a fifth image by shooting with the camera at the second exposure intensity; the fifth image and the second image are two consecutive frames of images captured by the electronic device 100 through the camera. That is, if the content of the first image is not similar to the content of the second image, the electronic device 100 may directly use the exposure intensity when the second image is acquired to capture. In this case, the user may not need the first image and the third image to have high brightness uniformity, so the electronic device 100 does not readjust the exposure intensity.
In the embodiment of the present application, the second image and the third image may be two consecutive frames of images captured by the electronic device 100 through a camera. For example, the second image is image B, the third image is image C, and at this time, the second image is a preview image of the last frame before the electronic device 100 acquires the third image.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (11)

1. An exposure intensity adjustment method, characterized by comprising:
the method comprises the steps that electronic equipment obtains a first image and a second image shot by a camera, and whether the similarity of the content of the first image and the content of the second image is higher than a first threshold value or not is determined; the shooting time of the second image is later than that of the first image;
if the similarity between the content of the first image and the content of the second image is higher than the first threshold, the electronic device acquires a first exposure intensity and a second exposure intensity; the first exposure intensity is the exposure intensity at which the first image was shot; the second exposure intensity is the exposure intensity at which the second image was shot; the first image has a first brightness; the second image has a second brightness;
the electronic device determines whether it has a state difference between a first moment and a second moment; the first moment is the moment when the electronic device acquires the first image; the second moment is the moment when the electronic device acquires the second image; the state difference is a change of an automatic exposure mode caused by user operation at the first moment and/or the second moment;
if the electronic device does not have the state difference between the first moment and the second moment, the electronic device determines a third exposure intensity according to the first exposure intensity and the second exposure intensity, and obtains a third image by shooting with the camera at the third exposure intensity; the third image has a third brightness; the brightness difference between the third brightness and the first brightness is less than a second threshold.
2. The method of claim 1, wherein the electronic device determining a third exposure intensity according to the first exposure intensity and the second exposure intensity specifically comprises:
the electronic device determines a first weight and a second weight; the first weight is the weight of the first exposure intensity; the second weight is the weight of the second exposure intensity;
the electronic device determines the third exposure intensity according to the first exposure intensity, the first weight, the second exposure intensity and the second weight;
wherein the larger the brightness difference between the first image and the second image, the smaller the first weight and the larger the second weight; and the larger the time difference between the first image and the second image, the smaller the first weight and the larger the second weight.
3. The method of claim 1 or 2, wherein the method further comprises:
if the electronic device has the state difference between the first moment and the second moment, the electronic device obtains a fourth image by shooting with the camera at the second exposure intensity; the fourth image and the second image are two consecutive frames of images shot by the electronic device through the camera.
4. The method of any one of claims 1-3, wherein the second image and the third image are two consecutive frames of images shot by the electronic device through the camera.
5. The method of any one of claims 1-4, further comprising:
if the similarity between the content of the first image and the content of the second image is not higher than the first threshold, the electronic device obtains a fifth image by shooting with the camera at the second exposure intensity; the fifth image and the second image are two consecutive frames of images shot by the electronic device through the camera.
6. An electronic device, comprising a camera, one or more memories, and one or more processors, wherein the one or more processors are coupled to the camera, and the one or more memories are configured to store computer program code, the computer program code comprising computer instructions;
the camera is configured to shoot at a first exposure intensity to obtain a first image and shoot at a second exposure intensity to obtain a second image, wherein the first image has a first brightness and the second image has a second brightness; the shooting time of the second image is later than that of the first image;
the processor is configured to determine whether the similarity between the content of the first image and the content of the second image is higher than a first threshold;
the processor is further configured to acquire the first exposure intensity and the second exposure intensity if the similarity between the content of the first image and the content of the second image is higher than the first threshold;
the processor is further configured to determine whether the electronic device has a state difference between a first moment and a second moment; the first moment is the moment when the electronic device acquires the first image; the second moment is the moment when the electronic device acquires the second image; the state difference is a change of an automatic exposure mode caused by user operation at the first moment and/or the second moment; and, if the electronic device does not have the state difference between the first moment and the second moment, determine a third exposure intensity according to the first exposure intensity and the second exposure intensity;
the camera is further configured to capture a third image at the third exposure intensity, where the third image has a third brightness, and a brightness difference between the third brightness and the first brightness is smaller than a second threshold.
7. The electronic device of claim 6, wherein the processor, when configured to determine a third exposure intensity according to the first exposure intensity and the second exposure intensity, is specifically configured to:
determining a first weight and a second weight; the first weight is a weight of the first exposure intensity; the second weight is a weight of the second exposure intensity;
determining the third exposure intensity according to the first exposure intensity, the first weight, the second exposure intensity, and the second weight;
wherein the larger the brightness difference between the first image and the second image, the smaller the first weight and the larger the second weight; and the larger the time difference between the first image and the second image, the smaller the first weight and the larger the second weight.
8. The electronic device of claim 6 or 7, wherein the camera is further configured to:
if the electronic device has the state difference between the first moment and the second moment, shoot at the second exposure intensity to obtain a fourth image; the fourth image and the second image are two consecutive frames of images shot by the electronic device through the camera.
9. The electronic device of any of claims 6-8, wherein the second image and the third image are two consecutive frames of images shot by the electronic device through the camera.
10. The electronic device of any of claims 6-9, wherein the camera is further configured to:
if the similarity between the content of the first image and the content of the second image is not higher than the first threshold, shoot at the second exposure intensity to obtain a fifth image; the fifth image and the second image are two consecutive frames of images shot by the electronic device through the camera.
11. A computer storage medium, comprising: computer instructions; the computer instructions, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-5.
CN202110651305.9A 2021-06-10 2021-06-10 Exposure intensity adjusting method and related equipment Active CN113891008B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110651305.9A CN113891008B (en) 2021-06-10 2021-06-10 Exposure intensity adjusting method and related equipment

Publications (2)

Publication Number Publication Date
CN113891008A (en) 2022-01-04
CN113891008B (en) 2022-08-09

Family

ID=79010196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110651305.9A Active CN113891008B (en) 2021-06-10 2021-06-10 Exposure intensity adjusting method and related equipment

Country Status (1)

Country Link
CN (1) CN113891008B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116996762B (en) * 2023-03-29 2024-04-16 Honor Device Co Ltd Automatic exposure method, electronic equipment and computer readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110771148A (en) * 2018-11-28 2020-02-07 深圳市大疆创新科技有限公司 Shooting method and device and unmanned aerial vehicle
CN111614892A (en) * 2020-04-28 2020-09-01 深圳英飞拓智能技术有限公司 Face image acquisition method, shooting device and computer-readable storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101726964B (en) * 2008-10-28 2012-01-18 亚洲光学股份有限公司 Image acquiring device and method for shifting charging capacitance of flashlamp thereof
US8730384B2 (en) * 2010-06-03 2014-05-20 Apple Inc. Systems, methods, and devices for flash exposure control using preflash statistics
CN102223485B (en) * 2011-06-10 2012-07-18 深圳市先河系统技术有限公司 Exposure compensation device and method
CN104394329B (en) * 2014-11-28 2017-12-12 东莞宇龙通信科技有限公司 A kind of photographic method, device and mobile terminal
US9787909B1 (en) * 2016-03-31 2017-10-10 Stmicroelectronics (Research & Development) Limited Controlling signal-to-noise ratio in high dynamic range automatic exposure control imaging
US10417476B2 (en) * 2016-12-30 2019-09-17 Eosmem Corporation Exposure time determination method for image sensing operation
CN112532857B (en) * 2019-09-18 2022-04-12 华为技术有限公司 Shooting method and equipment for delayed photography
WO2021184239A1 (en) * 2020-03-18 2021-09-23 深圳市大疆创新科技有限公司 Exposure method and apparatus, photographing device, movable platform, and storage medium
CN112153299A (en) * 2020-09-18 2020-12-29 深圳创维-Rgb电子有限公司 Camera exposure processing method and device and intelligent terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant