CN116033275B - Automatic exposure method, electronic equipment and computer readable storage medium - Google Patents



Publication number
CN116033275B
CN116033275B (Application CN202310319202.1A)
Authority
CN
China
Prior art keywords
image sensor
image
brightness
frame
mode
Prior art date
Legal status
Active
Application number
CN202310319202.1A
Other languages
Chinese (zh)
Other versions
CN116033275A (en)
Inventor
李朝霞
叶凌
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202310995464.XA (patent CN116996762B)
Priority to CN202310319202.1A (patent CN116033275B)
Publication of CN116033275A
Application granted
Publication of CN116033275B
Legal status: Active


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/65: Control of camera operation in relation to power supply
    • H04N 23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/76: Circuitry for compensating brightness variation in the scene by influencing the image signals

Abstract

The application provides an automatic exposure method, an electronic device, and a computer readable storage medium, relates to the technical field of image processing, and can solve the problem of slow automatic exposure. The method comprises the following steps: the electronic device starts the image sensor in response to a first operation of a user; acquires an N-th frame image according to a preset first mode through the image sensor, and determines a first brightness and a first exposure parameter of the N-th frame image, wherein N is a positive integer; acquires an (N+1)-th frame image based on the first exposure parameter through the image sensor, and determines a second brightness and a second exposure parameter of the (N+1)-th frame image; determines that the difference between the second brightness and the first brightness satisfies a preset condition; and acquires a second mode corresponding to the first operation, and converts the second exposure parameter into a third exposure parameter corresponding to the second mode through the image sensor.

Description

Automatic exposure method, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an automatic exposure method, an electronic device, and a computer readable storage medium.
Background
Currently, electronic devices (such as mobile phones, tablet computers, and the like) are configured with an auto exposure (AE) function. After the electronic device starts a camera, the AE function can automatically adjust the exposure and the gain according to the intensity of external light, so as to avoid overexposure or underexposure.
However, the AE convergence speed is limited. When the external light is too strong or too weak, the automatic exposure effect of the electronic device is poor, and overexposure or underexposure may occur. For example, in response to a user operation instructing to start the camera, the electronic device wakes up the camera, and the displayed picture then gradually converges from overexposure to normal exposure, or from underexposure to normal exposure. Normal exposure generally means that the brightness distribution of the pixels of the image frame is uniform. During this process, the user has to wait until the displayed picture is normally exposed before shooting, which brings a poor shooting experience and prevents quick shooting.
Disclosure of Invention
In view of the above, the present application provides an automatic exposure method, an electronic device, and a computer readable storage medium, in which the image sensor itself analyzes the brightness and exposure parameters of the current frame image, so that the next frame image is acquired based on the brightness and exposure of the current frame image, thereby implementing rapid automatic exposure during image acquisition. When the brightness difference between the current frame image and the next frame image satisfies a preset condition, the automatic exposure process is completed, and an exposure parameter indicating that the image frame is normally exposed is obtained.
In a first aspect, the present application provides an automatic exposure method applied to an electronic device that supports running a photographing application. The electronic device can start the image sensor in response to a first operation of a user, so that the image sensor is powered on and can acquire images. Then, the electronic device acquires an N-th frame image according to a preset first mode through the image sensor, and determines a first brightness and a first exposure parameter of the N-th frame image, where N is a positive integer. That is, the image sensor performs image acquisition in the preset first mode while determining the first brightness and first exposure parameter of the acquired image. Next, the electronic device continues to acquire an (N+1)-th frame image based on the first exposure parameter through the image sensor, and determines a second brightness and a second exposure parameter of the (N+1)-th frame image. If the difference between the second brightness and the first brightness satisfies the preset condition, for example, is smaller than a preset threshold, the brightness difference between the N-th frame image and the (N+1)-th frame image is small. That is, the picture exposure of image frames that the image sensor continues to acquire based on the second exposure parameter is appropriate. The electronic device may then acquire a second mode corresponding to the first operation, and convert the second exposure parameter into a third exposure parameter corresponding to the second mode through the image sensor. The image sensor can then continue to acquire images according to the third exposure parameter, obtaining normally exposed image frames corresponding to the second mode.
In summary, the electronic device collects image frames through the image sensor, and analyzes the brightness and exposure parameters of the collected image frames to quickly determine the second exposure parameter with which the image sensor collects normally exposed image frames, thereby improving the speed of automatic exposure. The image sensor acquires a correctly exposed image frame based on the second exposure parameter in the first mode, and then switches to the second mode. To ensure that the picture exposure of image frames acquired in the second mode is also correct, the image sensor converts the second exposure parameter into a third exposure parameter corresponding to the second mode, so that the image sensor acquires normally exposed image frames based on the third exposure parameter in the second mode. In this way, the automatic exposure method provided by the application performs timely automatic exposure processing on the image directly in the image sensor, and then converts the resulting second exposure parameter into the third exposure parameter corresponding to the second mode, so that normally exposed image frames corresponding to the second mode can be output as soon as automatic exposure is completed.
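The convergence loop of the first aspect can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the linear sensor model in `measure_brightness`, the target brightness of 128, and the convergence threshold are all made-up assumptions.

```python
# Hypothetical sketch of the fast AE loop: compare the brightness of the
# N-th and (N+1)-th frames and stop once their difference is below a
# preset threshold. The sensor model and constants are assumptions.

def measure_brightness(exposure, scene_light):
    """Toy sensor model: frame brightness grows with exposure and scene light."""
    return min(255.0, scene_light * exposure)

def next_exposure(exposure, brightness, target=128.0):
    """Derive the next frame's exposure parameter from the current brightness."""
    if brightness <= 0:
        return exposure * 2          # fully dark frame: double the exposure
    return exposure * (target / brightness)

def auto_expose(scene_light, initial_exposure=1.0, threshold=2.0, max_frames=32):
    """Iterate until successive frame brightnesses differ by less than threshold."""
    exposure = initial_exposure
    prev = measure_brightness(exposure, scene_light)     # N-th frame
    for _ in range(max_frames):
        exposure = next_exposure(exposure, prev)
        cur = measure_brightness(exposure, scene_light)  # (N+1)-th frame
        if abs(cur - prev) < threshold:                  # preset condition met
            return exposure, cur                         # second exposure parameter
        prev = cur
    return exposure, prev
```

With this toy model the loop settles in a couple of frames; a real sensor's response is nonlinear, which is why the patent's adaptive step sizes matter.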
In a possible implementation manner of the first aspect, the automatic exposure method further includes: and acquiring and displaying the target image frame according to the third exposure parameter. Because the third exposure parameter corresponds to the second mode, and the picture exposure of the image frame correspondingly acquired based on the third exposure parameter is correct, the target image frame can be acquired according to the third exposure parameter and displayed, so that a user can watch a display picture with normal exposure.
In another possible implementation manner of the first aspect, the brightness of the target image frame is the same as or similar to the second brightness. Here, the brightness of the target image frame being similar to the second brightness means that the difference between the brightness of the target image frame and the second brightness is smaller than a preset threshold. Since the third exposure parameter is converted from the second exposure parameter, and the second exposure parameter corresponds to the second brightness, that is, the brightness of the (N+1)-th frame image, the brightness of the target image frame acquired by the image sensor based on the third exposure parameter is the same as or similar to the second brightness.
In another possible implementation manner of the first aspect, the first frame rate corresponding to the first mode is greater than the second frame rate corresponding to the second mode. The first frame rate is the speed at which the image sensor acquires images in the first mode, and correspondingly, the second frame rate is the speed at which it acquires images in the second mode. The image sensor acquires image frames for the automatic exposure process at the first frame rate in the first mode, which increases the speed of the automatic exposure process. The image sensor collects image frames for display at the second frame rate in the second mode, which meets the display requirement of the second mode while reducing the acquisition power consumption of the image sensor.
In another possible implementation manner of the first aspect, mode parameters corresponding to one or more preset modes are stored in the image sensor, and the preset modes include the second mode. The mode parameters stored in the image sensor include, for example, one or more of the resolution, the size, and the frame rate at which the image sensor captures images. The image sensor may acquire the mode parameters corresponding to the second mode in response to the first operation, so as to convert the second exposure parameter into the third exposure parameter corresponding to the second mode through a preset rule. The preset rule may be a program for implementing the conversion of exposure parameters.
Based on the above, the image sensor can respond to the first operation of the user, and directly perform conversion of the exposure parameters based on the saved mode parameters, so that the automatic exposure speed can be further improved.
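One plausible reading of the "preset rule" is sketched below. It is an assumption for illustration: the conversion keeps the total exposure (exposure time multiplied by gain) constant while capping the exposure time at the destination mode's frame period, and the mode names and frame rates are invented.

```python
# Illustrative conversion of the second exposure parameter into a third
# exposure parameter for another sensor mode. Assumption: brightness is
# preserved by keeping (exposure time x gain) constant, with exposure time
# capped at the new mode's frame period. Mode entries are hypothetical.

MODE_PARAMS = {                         # mode parameters stored in the sensor
    "first_mode":  {"frame_rate": 120},  # fast mode used for AE convergence
    "second_mode": {"frame_rate": 30},   # preview/display mode
}

def convert_exposure(exp_time_ms, gain, dst_mode):
    """Re-split total exposure to fit the destination mode's frame period."""
    total = exp_time_ms * gain                          # keep brightness constant
    max_time = 1000.0 / MODE_PARAMS[dst_mode]["frame_rate"]
    new_time = min(total, max_time)                     # cap by frame period
    new_gain = total / new_time                         # remainder goes to gain
    return new_time, new_gain
```

Because the product of time and gain is unchanged, the frame acquired in the second mode keeps the brightness established during AE, matching the "same or similar brightness" property described above.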
In another possible implementation manner of the first aspect, the mode parameter includes one or more of the resolution, the size of the display screen of the electronic device, and the frame rate at which the image sensor captures images. Different modes correspond to different mode parameters. For example, in a video shooting mode the image sensor needs to acquire images at a high speed so that the shot video plays back smoothly, whereas the speed at which image frames are acquired matters less when shooting still pictures. Therefore, the frame rate in the mode parameters corresponding to the video shooting mode is greater than the frame rate in the mode parameters corresponding to the portrait mode.
In another possible implementation manner of the first aspect, the process in which the electronic device acquires the N-th frame image according to the preset first mode through the image sensor and determines the first brightness and the first exposure parameter of the N-th frame image may be as follows. First, the electronic device acquires the N-th frame image according to the preset first mode through the image sensor. After the N-th frame image is acquired, the electronic device determines the first brightness of the N-th frame image through the image sensor. Specifically, the electronic device can perform brightness analysis on the N-th frame image through the image sensor to obtain the current ambient brightness corresponding to the N-th frame image, that is, the first brightness. The electronic device then converts the first brightness into the corresponding first exposure parameter through the image sensor.
Based on this, the electronic device can collect the N-th frame image through the image sensor and determine the first brightness and first exposure parameter corresponding to it, avoiding the complicated prior-art process of sending image frames to an application processor for processing. In addition, the above scheme can determine the current ambient light brightness, that is, the first brightness, directly from the N-th frame image. Compared with the prior art, the ambient light brightness does not need to be acquired by a dedicated ambient light sensor, which saves the hardware space such a sensor would occupy, so the electronic device can be made smaller, or the freed space can be used for other components.
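A minimal sketch of deriving the first brightness from the frame's own pixels is given below. Using the plain mean of the pixel values is an assumption for illustration; real sensors typically use weighted window statistics rather than a global mean.

```python
# Hypothetical brightness analysis: estimate the ambient brightness (the
# "first brightness") directly from the N-th frame's pixel statistics,
# so no separate ambient-light sensor is needed. The mean-of-pixels
# statistic is an illustrative assumption.

def frame_brightness(pixels):
    """Estimate frame brightness as the mean of all pixel values (0-255)."""
    flat = [v for row in pixels for v in row]
    return sum(flat) / len(flat)
```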
In another possible implementation manner of the first aspect, during the automatic exposure process, the electronic device determines, through the image sensor, the ambient light brightness and the first exposure parameter corresponding to the N-th frame image, and adjusts the first exposure parameter based on the ambient light brightness before collecting the (N+1)-th frame image. For example, the electronic device may first determine, through the image sensor, a first adjustment step corresponding to the first brightness. Then, the electronic device determines, through the image sensor, a fourth exposure parameter corresponding to the first exposure parameter according to the first adjustment step, and acquires the (N+1)-th frame image based on the fourth exposure parameter in the first mode, so that the brightness of the next acquired frame converges.
In the above implementation, different brightnesses correspond to different adjustment steps, so the image sensor can adaptively control the acquisition of the next image frame based on the current ambient light brightness. For example, when the ambient light is very bright or very dark, the corresponding adjustment step can be larger, so that automatic exposure proceeds quickly in those conditions and the brightness of the image frames collected by the image sensor quickly converges to an appropriate brightness, improving the automatic exposure speed. Optionally, the relationship between brightness and adjustment step may be predetermined and stored in the image sensor, further ensuring that the image sensor can perform automatic exposure quickly.
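The brightness-to-step relationship stored in the sensor could look like the table below. All table values, and the multiplicative form of the step, are illustrative assumptions; the patent only requires that extreme brightnesses map to larger steps.

```python
# Hypothetical stored mapping from measured brightness to an adjustment
# step: large steps at the dark and bright extremes for fast convergence,
# small steps near the target. Table values are assumptions.

STEP_TABLE = [           # (upper brightness bound, multiplicative step)
    (32.0, 4.0),         # very dark: large step up
    (96.0, 2.0),
    (160.0, 1.0),        # near target: hold
    (224.0, 0.5),
    (256.0, 0.25),       # very bright: large step down
]

def adjustment_step(brightness):
    """Look up the first adjustment step for the measured first brightness."""
    for bound, step in STEP_TABLE:
        if brightness < bound:
            return step
    return STEP_TABLE[-1][1]

def fourth_exposure(first_exposure, first_brightness):
    """Apply the step to the first exposure parameter to get the fourth."""
    return first_exposure * adjustment_step(first_brightness)
```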
In another possible implementation manner of the first aspect, the first exposure parameter, the second exposure parameter, or the third exposure parameter is an exposure value and/or a gain value. The exposure value refers to the integral over time of the illuminance received by the photosensitive unit of the image sensor; the image sensor can realize image acquisition at a given exposure value by adjusting its aperture size and exposure time. The gain value refers to the amplification factor applied by the automatic gain control module of the image sensor when amplifying the analog electrical signal; adjusting the gain value of the image sensor adjusts the brightness of the image frame. When the image sensor collects image frames based on the exposure parameters, the brightness of the collected image frames can be accurately adjusted through the exposure value and/or the gain value, so that the image sensor obtains image frames of a specified brightness and an accurate automatic exposure process is realized.
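The joint effect of exposure time and gain on brightness can be modeled as below. The proportional model and the constant `scene_k` are illustrative assumptions; the point is only that brightness scales with both factors, so either can be solved for a target.

```python
# Toy model: frame brightness is assumed proportional to the integrated
# light (exposure time) times the analog gain, clipped at full scale.
# scene_k is a hypothetical scene-dependent constant.

def predicted_brightness(exp_time_ms, gain, scene_k=4.0):
    """Brightness grows with exposure time and gain, saturating at 255."""
    return min(255.0, scene_k * exp_time_ms * gain)

def gain_for_target(exp_time_ms, target, scene_k=4.0):
    """Hold exposure time fixed and solve for the gain reaching the target."""
    return target / (scene_k * exp_time_ms)
```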
In another possible implementation manner of the first aspect, the electronic device may further display the (N+1)-th frame image acquired during automatic exposure. Because the automatic exposure process is fast, the electronic device can complete it while, due to persistence of vision, the user's eyes are still perceiving the displayed (N+1)-th frame image, so the automatic exposure process is imperceptible to the user. This also effectively improves the startup speed of the application program.
In another possible implementation of the first aspect, the first application is a camera application. The image sensor is started by the camera application, which also triggers image acquisition and processing by the image sensor. The first operation is an operation performed by a user on an icon of the first application, for example, a click on the icon detected on an interface that displays it. The first operation may also be an operation instructing to switch the first application to the foreground, for example, when the first application is running in the background and the electronic device detects that the user switches it to the foreground. The first operation may also be an operation of launching the first application through a second application, for example, the electronic device detecting the user clicking a control in the second application's display interface that launches the first application. The first operation may also be a combination of the above operations, as determined by the actual application scenario.
In another possible implementation manner of the first aspect, the preset condition includes that the difference between the second brightness and the first brightness is less than or equal to a preset threshold for the m-th time, where m is a positive integer. This avoids acting on a single spurious match, for example, a case in which the difference between the second brightness and the first brightness falls below the preset threshold once but then exceeds it several times afterwards. By counting, through the image sensor, the number of times the difference between the second brightness and the first brightness is less than or equal to the preset threshold, the electronic device can determine whether the brightness of the image frames has stabilized, thereby ensuring the stability of the image sensor's automatic exposure.
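One reading of this condition, requiring m consecutive frame-to-frame differences within the threshold, is sketched below. The consecutive interpretation, and the values of m and the threshold, are assumptions for illustration.

```python
# Hypothetical stability check: AE is declared converged only after the
# brightness difference stays within the threshold for m consecutive
# frame pairs, guarding against a single spurious match. m and the
# threshold values are illustrative assumptions.

def converged(brightness_seq, threshold=2.0, m=3):
    """Return True once m successive frame-to-frame diffs are <= threshold."""
    count = 0
    for prev, cur in zip(brightness_seq, brightness_seq[1:]):
        if abs(cur - prev) <= threshold:
            count += 1
            if count >= m:
                return True
        else:
            count = 0            # a large jump resets the streak
    return False
```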
In a second aspect, the present application provides an electronic device comprising a display screen, an image sensor, a memory, and one or more processors; the display screen, the image sensor, the memory and the processor are coupled; the display screen is for displaying an image generated by the processor, the image sensor is for acquiring an image frame, and the memory is for storing computer program code, the computer program code comprising computer instructions.
Wherein the computer instructions, when executed by the processor, cause the electronic device to: in response to a first operation by a user, start the image sensor; acquire an N-th frame image according to a preset first mode through the image sensor, and determine a first brightness and a first exposure parameter of the N-th frame image, wherein N is a positive integer; acquire an (N+1)-th frame image based on the first exposure parameter through the image sensor, and determine a second brightness and a second exposure parameter of the (N+1)-th frame image; determine that the difference between the second brightness and the first brightness satisfies a preset condition; and acquire a second mode corresponding to the first operation, and convert the second exposure parameter into a third exposure parameter corresponding to the second mode through the image sensor.
In another possible design of the second aspect, the processor, when executing the computer instructions, causes the electronic device to further: and acquiring and displaying the target image frame according to the third exposure parameter.
In another possible design of the second aspect, the brightness of the target image frame is the same as or similar to the second brightness. Wherein similar means that the difference between the luminance of the target image frame and the second luminance is not large, for example, the difference between the luminance of the target image frame and the second luminance is less than or equal to a preset threshold.
In another possible design of the second aspect, the first frame rate corresponding to the first mode is greater than the second frame rate corresponding to the second mode.
In another possible design manner of the second aspect, one or more mode parameters corresponding to one or more preset modes respectively are stored in the image sensor, where the preset modes include the second mode.
In another possible design manner of the second aspect, the mode parameter includes one or more of the resolution, the size of the display screen of the electronic device, and the frame rate at which the image sensor captures images.
In another possible design of the second aspect, the processor, when executing the computer instructions, causes the electronic device to further: acquire an N-th frame image according to a preset first mode through the image sensor; determine, through the image sensor, a first brightness of the N-th frame image; and determine, through the image sensor, a first exposure parameter corresponding to the first brightness.
In another possible design of the second aspect, the processor, when executing the computer instructions, causes the electronic device to further: determine, through the image sensor, a first adjustment step corresponding to the first brightness; determine, through the image sensor, a fourth exposure parameter corresponding to the first exposure parameter according to the first adjustment step; and acquire, through the image sensor, an (N+1)-th frame image in the first mode based on the fourth exposure parameter.
In another possible design of the second aspect, the first exposure parameter, the second exposure parameter, or the third exposure parameter is an exposure value and/or a gain value.
In another possible design of the second aspect, the processor, when executing the computer instructions, causes the electronic device to further: display the (N+1)-th frame image.
In another possible design manner of the second aspect, the first operation is one or more of an operation of a user on an icon of the first application, an operation of instructing to switch the first application to a foreground display, and an operation of starting the first application through the second application, and the first application is a camera application.
In another possible design manner of the second aspect, the preset condition includes that the difference between the second brightness and the first brightness is less than or equal to a preset threshold for the m-th time, where m is a positive integer.
In a third aspect, the application provides a computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect and any one of its possible designs.
In a fourth aspect, the application provides a computer program product for causing an electronic device to carry out the method of the first aspect and any one of its possible designs as described above when the computer program product is run on the electronic device.
In a fifth aspect, the present application provides an apparatus included in an electronic device, the apparatus having the functionality to implement the behavior of the electronic device in any one of the methods of the above aspects and possible implementations. The functions can be realized by hardware, or by hardware executing corresponding software. The hardware or software includes at least one module or unit corresponding to the functions described above, for example, an acquisition module or unit, a determination module or unit, a conversion module or unit, a display module or unit, and a storage module or unit.
It will be appreciated that the electronic device of the second aspect and any of its possible designs, the computer readable storage medium of the third aspect, the computer program product of the fourth aspect, and the apparatus of the fifth aspect are all configured to perform the corresponding methods provided above; for the advantages they achieve, reference may be made to those of the corresponding methods, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of interface display in a photographing scene according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a running process of a camera application according to an embodiment of the present application;
fig. 3 is a schematic diagram of a display screen of a camera application according to an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating another operation of a camera application according to an embodiment of the present application;
fig. 5 is a schematic diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic software structure of an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating a running process of a camera application according to an embodiment of the present application;
FIG. 9 is a flowchart of an automatic exposure method according to an embodiment of the present application;
FIG. 10 is a flowchart of another automatic exposure method according to an embodiment of the present application;
fig. 11 is a schematic diagram of a shooting mode setting interface according to an embodiment of the present application;
FIG. 12 is a flowchart of another automatic exposure method according to an embodiment of the present application;
fig. 13 is an operation schematic diagram of application switching according to an embodiment of the present application;
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
In some scenarios, an electronic device is configured with a camera for implementing an image acquisition function of the electronic device. For example, an application program for calling the camera is installed in the electronic device, and the electronic device can start the camera to collect images in response to the operation of a user in the application program. Illustratively, as shown in (a) of fig. 1, the electronic device may display a first interface including a plurality of application icons. In response to a user clicking on a camera control in the first interface, the electronic device may launch a camera application program, launch a camera through the camera application, and display a second interface as shown in fig. 1 (b), which may be used to display images captured by the camera in real time.
In some examples, in response to a user clicking the camera application, the electronic device loads the related processes of the camera application to start it. After the related processes are loaded, the electronic device may display the second interface shown in fig. 1 (b). Optionally, while the related processes of the camera application are loading, a third interface may be displayed on the display screen of the electronic device to indicate that the camera application is starting. The content displayed in the third interface may include the name or identifier of the camera application, an application-loading animation or video, or a black screen. The content displayed in the third interface can be adjusted according to the actual application scenario and requirements, and is not limited here.
For example, as shown in fig. 2, after the electronic device determines, in response to the user operation, to start the camera application, the application processor (application processor, AP) controls the image sensor in the camera to power on and start up, and triggers the initialization process of the image sensor. While the application processor powers on the image sensor and triggers its initialization, the third interface can be displayed on the display screen of the electronic device. Optionally, during initialization, the application processor may initialize the state parameters of the image sensor so that the image sensor enters a software standby state (SW standby), in which the camera control interface (camera control interface, CCI) or serial peripheral interface (serial peripheral interface, SPI) of the image sensor is in a callable state. Once the image sensor is in the software standby state, the application processor can communicate with it through the CCI or SPI.
After the initialization is completed, the application processor may perform mode setting on the image sensor, where the mode setting process is used to set an image acquisition mode of the image sensor, and the image sensor may perform image acquisition according to parameters corresponding to the image acquisition mode. For example, the application processor sends the mode setting parameters to the image sensor to cause the image sensor to switch to the corresponding image acquisition mode for image acquisition.
In some examples, the application processor may perform mode setting on the image sensor based on preset mode setting parameters. For example, after the application processor controls the image sensor to power up, it may obtain preset mode setting parameters from the memory. The preset mode setting parameters stored in the memory may be set by the user or may be system defaults. In addition, the preset mode setting parameters stored in the memory may also be the mode setting parameters corresponding to the image acquisition mode the image sensor was in before it was last powered off. For example, the application processor may synchronously store in the memory the mode setting parameters corresponding to the image acquisition mode in which the image sensor is currently operating, so that they can be used the next time the application processor performs mode setting on the image sensor.
After the image sensor is switched to the corresponding image acquisition mode and performs image acquisition, the electronic device can display the second interface as shown in (b) of fig. 1. As shown in fig. 2, the second interface is used to display the image frames acquired by the image sensor. That is, the image sensor starts acquiring image frames (frame) after completing the mode setting. After the image sensor acquires an image frame, the application processor performs automatic exposure (automatic exposure, AE) on the image frame so that the picture brightness of the image frames acquired by the image sensor approaches a given target brightness. The given target brightness corresponds to the mode in which the image sensor is set. The picture brightness of the acquired image frames being close to the given target brightness generally means that the difference between the picture brightness of the image frames acquired by the image sensor and the target brightness is smaller than a preset threshold.
Optionally, the process in which the application processor automatically exposes the image frames includes an initial automatic exposure control (initial automatic exposure control, initial AEC) process and a fine automatic exposure control (fine automatic exposure control, fine AEC) process. As shown in fig. 2, the initial AEC process is used to quickly adjust an exposure parameter of the image sensor based on an adjustment step A, so that the picture brightness of the image frames acquired by the image sensor quickly approaches the given target brightness and the difference between the picture brightness of the acquired image frames and the given target brightness becomes smaller than a preset threshold A. The application processor then performs the fine AEC process, which precisely adjusts the exposure parameter of the image sensor based on an adjustment step B, so that the difference between the picture brightness of the image frames acquired by the image sensor and the given target brightness becomes smaller than a preset threshold B. The preset threshold A is larger than the preset threshold B, and the adjustment step A is larger than the adjustment step B.
In the above process, after the initial AEC process ends, the picture brightness of the image frames acquired by the image sensor is already relatively close to the given target brightness. Therefore, the electronic device can display the picture once the initial AEC process ends, reducing the user's waiting time. After that, the fine AEC process can continue to precisely adjust the exposure parameters, so that the electronic device can display a normally exposed picture.
For example, suppose the application processor obtains a brightness variation of 10 nit (nit) corresponding to the adjustment step A of the initial AEC process, a preset threshold A of 10 nit, a brightness variation of 2 nit corresponding to the adjustment step B of the fine AEC process, and a preset threshold B of 2 nit. Taking a target brightness of 50 nit as an example, as shown in fig. 2, after the application processor acquires image frame 1 acquired by the image sensor, it determines the picture brightness of image frame 1 to be 1 nit. Then, in the initial AEC process, the application processor adjusts the exposure parameter of the image sensor based on the adjustment step A, so that the brightness of the next image frame acquired by the image sensor is increased by 10 nit. Ignoring any error between the exposure parameter and the picture brightness actually acquired by the image sensor, after the image sensor acquires image frame 2 the application processor can determine its picture brightness to be 11 nit. Since the difference between the picture brightness of image frame 2 and the target brightness of 50 nit is 39 nit, which is greater than the preset threshold A of 10 nit, the application processor continues to adjust the exposure parameter of the image sensor based on the adjustment step A, so that the picture brightness of the next image frame acquired by the image sensor is increased by 10 nit; that is, the application processor determines the picture brightness of image frame 3 acquired by the image sensor to be 21 nit. And so on, until the application processor determines the picture brightness of image frame 5 acquired by the image sensor to be 41 nit.
At this time, since the difference between the picture brightness of image frame 5, 41 nit, and the target brightness of 50 nit is 9 nit, which is less than the preset threshold A of 10 nit, the application processor may trigger the fine AEC process and continue to control the image sensor to perform image acquisition. The application processor determines image frame 5 as the initial frame of the fine AEC process, with a picture brightness of 41 nit. Then, in the fine AEC process, the application processor adjusts the exposure parameter of the image sensor based on the adjustment step B, so that the brightness of the next image frame acquired by the image sensor is increased by 2 nit. Ignoring any error between the exposure parameter and the picture brightness actually acquired, the application processor may determine the picture brightness of image frame 6 acquired by the image sensor to be 43 nit. Since the difference between the picture brightness of image frame 6 and the target brightness of 50 nit is 7 nit, which is greater than the preset threshold B of 2 nit, the application processor continues to adjust the exposure parameter of the image sensor based on the adjustment step B, so that the picture brightness of the next image frame acquired by the image sensor is increased by 2 nit; that is, the application processor may determine the picture brightness of image frame 7 acquired by the image sensor to be 45 nit. And so on, until the application processor determines the picture brightness of image frame 9 acquired by the image sensor to be 49 nit, where the difference between the picture brightness of image frame 9, 49 nit, and the target brightness of 50 nit is 1 nit, which is smaller than the preset threshold B of 2 nit. The application processor may then determine that the automatic exposure control process of the image sensor is completed.
It should be understood that, in the above example, the application processor completes the initial automatic exposure control process by adjusting the brightness over 5 image frames and then completes the refined automatic exposure control process by adjusting the brightness over 4 image frames; this is only illustrative. The number of image frames involved in the initial and refined automatic exposure control processes depends on the parameters of the automatic exposure control process and on the actually acquired image frames. The actual duration of automatic exposure control may be determined by the capability of the application processor; embodiments of the application are not limited in this regard.
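The two-stage convergence walked through in the example above can be sketched as follows. This is a minimal illustration only: the function name `run_aec` and the assumption that picture brightness responds exactly and linearly to each exposure-parameter adjustment (the "ignoring any error" simplification in the text) are not part of the patented implementation.

```python
# Illustrative sketch of the two-stage AEC loop from the example above.
# Assumption: each adjustment step changes the next frame's brightness
# by exactly the configured amount (the "no error" simplification).

def run_aec(first_brightness: float, target: float,
            step_a: float = 10.0, threshold_a: float = 10.0,
            step_b: float = 2.0, threshold_b: float = 2.0) -> list[float]:
    """Return the picture brightness of each processed frame until convergence."""
    brightness = first_brightness
    history = [brightness]

    # Initial AEC: coarse adjustment with the large step A until the
    # difference to the target drops below the (larger) threshold A.
    while abs(target - brightness) >= threshold_a:
        brightness += step_a if target > brightness else -step_a
        history.append(brightness)

    # Fine AEC: precise adjustment with the small step B until the
    # difference drops below the (smaller) threshold B.
    while abs(target - brightness) >= threshold_b:
        brightness += step_b if target > brightness else -step_b
        history.append(brightness)

    return history
```

With the values from the example (start 1 nit, target 50 nit), this reproduces image frames 1 to 5 of the initial AEC process (1, 11, 21, 31, 41 nit) and image frames 6 to 9 of the fine AEC process (43, 45, 47, 49 nit).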
In some examples, the exposure value (shutter) and the gain value (gain) of the image sensor may be continuously adjusted based on a given target brightness (AE Target), so that the picture brightness of the image frames collected by the image sensor approaches the target brightness, completing the automatic exposure control process. The application processor generally takes the current ambient light brightness acquired by the ambient light sensor as the target brightness, thereby ensuring the accuracy of automatic exposure. In the above automatic exposure control method, both an image sensor and an ambient light sensor need to be configured in the electronic device.
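As a hedged illustration of how a shutter/gain adjustment toward a target brightness could be split, the following sketch assumes the common linear model in which picture brightness is proportional to the shutter-gain product; the limit values and all names (`adjust_exposure`, `MAX_SHUTTER_MS`, etc.) are assumptions for illustration, not values from the embodiments.

```python
# Hedged sketch: split a required brightness change between exposure
# (shutter) time and gain, assuming brightness ∝ shutter × gain.
# Limits and names are illustrative assumptions.

MIN_GAIN = 1.0
MAX_GAIN = 16.0
MAX_SHUTTER_MS = 33.0  # e.g. bounded by a ~30 fps frame interval

def adjust_exposure(shutter_ms: float, gain: float,
                    current_brightness: float, target_brightness: float):
    """Return (new_shutter_ms, new_gain), preferring a longer shutter
    over a higher gain because raising gain amplifies noise."""
    ratio = target_brightness / current_brightness
    product = shutter_ms * gain * ratio          # required shutter x gain
    new_shutter = min(product / MIN_GAIN, MAX_SHUTTER_MS)
    new_gain = min(max(product / new_shutter, MIN_GAIN), MAX_GAIN)
    return new_shutter, new_gain
```

For instance, doubling the brightness from a 10 ms shutter at unity gain simply doubles the shutter time, while a 5x increase that would exceed the shutter limit is absorbed by the gain instead.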
In the automatic exposure processing above, the application processor needs to determine the target brightness from the current ambient light brightness acquired by the ambient light sensor, and needs to obtain the image frames sent by the image sensor, before the automatic exposure processing can be performed. In this process, data must be transmitted between the application processor and the image sensor, which takes a long time. Moreover, limited by the processing capability of the application processor, the initial AEC process performed by the application processor on the image frames is long, and during the initial AEC process the difference between the picture brightness and the target brightness is large, which affects the user's viewing of the picture. The user generally needs to wait for the initial AEC process to end before the displayed content settles and a photographing operation can be performed. For example, the initial AEC process is typically longer than 160 milliseconds (ms), during which the second interface sequentially displays image frames 1 to 5 as shown in fig. 2. During this display, the second interface may show an overexposed picture that gradually converges from bright to normal, for example, the user may see the second display interface shown in fig. 3 (a) gradually change to the second display interface shown in fig. 3 (b); or an underexposed picture that gradually converges from dark to correctly exposed, for example, the user may see the second display interface shown in fig. 3 (c) gradually change to the second display interface shown in fig. 3 (b). Therefore, the user needs to wait a long time before viewing the normally displayed image frame shown in fig. 3 (b) and starting to photograph, resulting in a poor use and photographing experience for the user.
It can be understood that, in the process of actually displaying the third interface and the second interface, after the image sensor acquires image frame 1, the application processor needs to receive image frame 1 from the image sensor and perform the initial AEC process on it before the electronic device can display image frame 1 in the second interface through the display screen. Thus, as shown in fig. 4, the start display time of the second interface may be later than the time at which the image sensor acquires image frame 1.
In addition, in the automatic exposure process, the current ambient light brightness acquired by the ambient light sensor needs to be used as the target brightness to ensure the accuracy of the automatic exposure process. Thus, both an image sensor and an ambient light sensor need to be configured in the electronic device. However, configuring both the image sensor and the ambient light sensor in the electronic device occupies considerable hardware space and may therefore increase the volume of the electronic device. In general, the ambient light sensor used for automatic exposure is arranged in the same camera as the image sensor, so that the current ambient light brightness collected by the ambient light sensor corresponds to the image frames collected by the image sensor. For example, the image sensor and the ambient light sensor may be integrated into the front camera. As a result, the front camera occupies a larger area of the front panel of the mobile phone.
For this reason, embodiments of the application provide an automatic exposure method and an electronic device, in which an on-chip automatic exposure (automatic exposure, AE) module is integrated into the image sensor of the electronic device. The on-chip AE module can directly acquire the image frames acquired by the image sensor and automatically expose them, thereby reducing the transmission time of the image frames between the application processor and the image sensor and increasing the automatic exposure speed. In addition, the processing speed of the on-chip AE module is higher than the acquisition speed of the image sensor, so the on-chip AE module can process the image frames acquired by the image sensor in time. Thus, by increasing the output speed of the image sensor and the processing speed of the on-chip AE module, the automatic exposure method provided by the embodiments of the application can effectively increase the automatic exposure speed.
The automatic exposure method provided by the embodiment of the application can be applied to the electronic equipment 100. For example, as shown in fig. 5, the electronic device 100 may specifically be a terminal device with a shooting function, such as a mobile phone 51, a tablet computer 52, a smart screen 53, a notebook computer 54, a vehicle-mounted device, a wearable device (such as a smart watch), an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), an artificial intelligence (artificial intelligence, AI) device, and the like. The embodiment of the present application is not limited to the specific type of electronic device 100 and the installed operating system.
Fig. 6 shows a schematic hardware configuration of the electronic device 100. The electronic device 100 may include a processor 610, an external memory interface 620, an internal memory 621, a universal serial bus (universal serial bus, USB) interface 630, a charge management module 640, a power management module 641, a battery 642, an antenna 1, an antenna 2, a mobile communication module 650, a wireless communication module 660, an audio module 670, a speaker 670A, a receiver 670B, a microphone 670C, a headphone interface 670D, a sensor module 680, keys 690, a motor 691, a camera 693, a display 694, a subscriber identity module (subscriber identification module, SIM) card interface, etc.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 610 may include one or more processing units, such as: the processor 610 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 610 for storing instructions and data. In some embodiments, the memory in the processor 610 is a cache memory. The memory may hold instructions or data that the processor 610 has just used or uses cyclically. If the processor 610 needs to use the instructions or data again, it may call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 610, thereby improving the efficiency of the system.
In some embodiments, the processor 610 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 610 may contain multiple sets of I2C buses. The processor 610 may be coupled to the touch sensor 680K, a charger, a flash, the camera 693, etc., respectively, through different I2C bus interfaces. For example, the processor 610 may be coupled to the touch sensor 680K through an I2C interface, so that the processor 610 communicates with the touch sensor 680K through the I2C bus interface, implementing the touch function of the electronic device 100.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 640 is used to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 640 may receive a charging input of a wired charger through the USB interface 630. In some wireless charging embodiments, the charge management module 640 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 640 may also provide power to the electronic device through the power management module 641 while charging the battery 642.
The power management module 641 is used for connecting the battery 642, the charge management module 640 and the processor 610. The power management module 641 receives input from the battery 642 and/or the charge management module 640 and provides power to the processor 610, the internal memory 621, the external memory, the display 694, the camera 693, the wireless communication module 660, and the like. The power management module 641 may also be configured to monitor battery capacity, battery cycle times, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 641 may also be disposed in the processor 610. In other embodiments, the power management module 641 and the charge management module 640 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 650, the wireless communication module 660, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 650 may provide a solution for wireless communication, including 2G/3G/4G/5G, as applied to the electronic device 100. The mobile communication module 650 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 650 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 650 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 650 may be disposed in the processor 610. In some embodiments, at least some of the functional modules of the mobile communication module 650 may be disposed in the same device as at least some of the modules of the processor 610.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 670A, receiver 670B, etc.), or displays images or video through display 694.
The wireless communication module 660 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 660 may be one or more devices that integrate at least one communication processing module. The wireless communication module 660 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 610. The wireless communication module 660 may also receive signals to be transmitted from the processor 610, frequency modulate them, amplify them, and convert them to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 650 of electronic device 100 are coupled, and antenna 2 and wireless communication module 660 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. Wireless communication techniques may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions via a GPU, a display screen 694, and an application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 694 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 610 may include one or more GPUs that execute program instructions to generate or change display information.
The display 694 is used to display images, video, and the like. The display 694 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 694, N being a positive integer greater than 1.
The electronic device 100 may implement shooting functions through an ISP, a camera 693, a video codec, a GPU, a display 694, an application processor, and the like.
The ISP is used to process the data fed back by the camera 693. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 693.
The camera 693 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 100 may include 1 or N cameras 693, N being a positive integer greater than 1.
Camera 693 includes at least one image sensor that may be used to capture image frames. The image sensor may be a normal frame rate image sensor or a high frame rate image sensor. The high frame rate image sensor includes, for example, an image sensor having an image acquisition frame rate of 40 frames per second (frames per second, fps) or more.
Optionally, the image sensor includes an on-chip AE module, and the on-chip AE module is configured to perform automatic exposure processing on the image frames acquired by the image sensor. For example, the on-chip AE module acquires the current image frame acquired by the image sensor, performs brightness recognition on the current image frame to obtain the current ambient light brightness, and determines the corresponding exposure value (shutter) and/or gain value (gain) for the current ambient light brightness. Then, the on-chip AE module may determine, based on the convergence of the ambient light brightness, a target shutter and/or target gain corresponding to the shutter and/or gain, and instruct the image sensor to acquire a new image frame based on the target shutter and/or target gain. The on-chip AE module may then determine the current ambient light brightness corresponding to the new image frame, and determine whether the automatic exposure process is finished based on the difference between the current ambient light brightness values corresponding to the two image frames; if the difference is smaller than or equal to a preset threshold, the automatic exposure process is determined to be finished. The on-chip AE module may then determine the shutter and/or gain corresponding to the latest image frame and a target shooting mode, and convert that shutter and/or gain into the shutter and/or gain corresponding to the target shooting mode, so as to instruct the image sensor to acquire image frames meeting the requirements of the target shooting mode based on the shutter and/or gain corresponding to the target shooting mode.
The processing speed of the on-chip AE module on the image frame is greater than or equal to the speed of the image sensor for collecting the image frame, so that the on-chip AE module can timely process the image frame output by the image sensor.
The on-chip AE module may be a hardware module added to the image sensor, for example, a hardware processor with an automatic exposure processing function is integrated in the image sensor. The on-chip AE module may be a software module added to the image sensor having a processing function, for example, an algorithm for automatic exposure processing may be deployed in the image sensor having a processing function. The embodiment of the application does not limit the specific setting mode of the on-chip AE module.
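The termination condition described above for the on-chip AE module (comparing the ambient brightness estimates of two consecutive frames against a preset threshold) could be sketched as follows. Only this convergence check is shown, not the shutter/gain conversion; the function name and the threshold value are illustrative assumptions.

```python
# Sketch of the on-chip AE termination check described above: automatic
# exposure is declared finished when the ambient brightness estimated
# from two consecutive frames differs by at most a preset threshold.
# Name and threshold value are illustrative assumptions.

def on_chip_ae_converged(brightness_history: list[float],
                         threshold: float = 2.0) -> bool:
    """True once the last two frame brightness estimates differ by <= threshold."""
    if len(brightness_history) < 2:
        return False  # need at least two frames to compare
    return abs(brightness_history[-1] - brightness_history[-2]) <= threshold
```

Because each check only needs the two most recent estimates, such a test is cheap enough to run on-chip at the sensor's acquisition frame rate.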
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The internal memory 621 may be used to store computer-executable program code that includes instructions. The processor 610 executes instructions stored in the internal memory 621 to thereby perform various functional applications and data processing of the electronic device 100. The internal memory 621 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 621 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
Electronic device 100 may implement audio functionality through audio module 670, speaker 670A, receiver 670B, microphone 670C, headphone interface 670D, and an application processor, among others. Such as music playing, recording, etc.
The audio module 670 is used to convert digital audio information to an analog audio signal output and also to convert an analog audio input to a digital audio signal. The audio module 670 may also be used to encode and decode audio signals. In some embodiments, the audio module 670 may be disposed in the processor 610, or some of the functional modules of the audio module 670 may be disposed in the processor 610.
The sensor modules 680 may include, among other things, pressure sensors 680A, gyroscope sensors 680B, distance sensors 680F, touch sensors 680K, ambient light sensors 680L, and the like.
The pressure sensor 680A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 680A may be disposed on the display 694. There are many types of pressure sensors 680A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force is applied to the pressure sensor 680A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 694, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 680A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 680A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions.
The gyro sensor 680B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 680B. The gyro sensor 680B may be used for image stabilization (anti-shake) during capture. For example, when the shutter is pressed, the gyro sensor 680B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 680B may also be used for navigation and for motion-sensing in game scenarios.
A distance sensor 680F for measuring distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 680F to achieve fast focus.
The ambient light sensor 680L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 694 based on the perceived ambient light level.
The touch sensor 680K, also referred to as a "touch panel". The touch sensor 680K may be disposed on the display 694, and the touch sensor 680K and the display 694 form a touch screen, which is also referred to as a "touch screen". The touch sensor 680K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 694. In other embodiments, the touch sensor 680K may also be disposed on a surface of the electronic device 100 at a different location than the display 694.
The motor 691 may generate a vibration alert. The motor 691 may be used for incoming call vibration alerting as well as for touch vibration feedback.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, an Android system with a layered architecture is taken as an example, and a software structure of the electronic device 100 is illustrated.
Fig. 7 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 7, the application packages may include applications such as camera, gallery, calendar, phone, map, navigation, WLAN, settings, music, and messaging.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 7, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, a camera service, and the like.
Camera services are used to manage and use the functionality of camera applications. In some examples, the camera service may activate the image sensor, control the image sensor to capture image frames, obtain image frames captured by the image sensor, and so forth in response to the activation of the camera application.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give a message alert, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as a notification of an application running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and so on.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functional interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
A two-dimensional graphics engine is a drawing engine that draws two-dimensional drawings.
The HAL layer is used to encapsulate the Linux kernel drivers, provide interfaces upward, and shield the implementation details of the underlying hardware. The HAL layer may include a camera hardware abstraction (camera HAL) layer, etc. The camera HAL is the core software framework of the camera application.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Wherein the camera services in the application framework layer may interact with the camera HAL in the HAL layer, which may interact with the camera driver in the kernel layer.
The workflow of the software and hardware of the electronic device 100 is illustrated below in connection with a photographing scenario according to the embodiment of the present application. As shown in fig. 1 (a), when the touch sensor 680K receives a user's touch operation on the camera application icon, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the time stamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and recognizes that the control corresponding to the input event is the camera application icon. In response to the camera application being launched in the application layer, the electronic device may start the camera application by invoking an interface provided by the camera service in the application framework layer; for example, the camera service may send an open request for the image sensor to the camera HAL. The camera HAL then sends the open request for the image sensor to the camera driver of the kernel layer. The corresponding camera driver starts the camera 693 after receiving the open request for the image sensor. After the camera 693 is powered on, it can capture still images or video, and the display screen can display the image data collected by the camera 693 in the display interface shown in fig. 1 (b).
In some embodiments, as shown in fig. 1 (a), the electronic device starts running the camera application in response to the user's click operation on the camera application icon. Specifically, the touch sensor in the electronic device may communicate the detected touch operation to the application processor. The application processor then determines that the touch event type is a click operation by the user on the camera application icon, and may start running the camera application. As shown in fig. 8, when starting to run the camera application, the application processor may control the image sensor including the on-chip AE module to power on.
It should be appreciated that the electronic device may also begin running the camera application in response to a call request from another application to the camera application, or in response to the user launching the camera application with voice or a gesture. The following describes the process by which the electronic device starts running the camera application, taking a click operation on the camera application icon by the user as an example.
Illustratively, as shown in fig. 8, in response to the user clicking the icon of the camera application, the application processor controls the image sensor to power up and start, and triggers the initialization process of the image sensor. Optionally, during initialization, the application processor performs initialization settings on the image sensor, so that the image sensor switches to a software standby state. After switching to the software standby state, the image sensor sends a state switching success indication to the application processor. In response to receiving the state switching success indication, the application processor determines that the image sensor has successfully switched to the software standby state, and may then send a burst mode request to the image sensor. Accordingly, the image sensor receives the burst mode request sent by the application processor and starts burst mode based on the request. In burst mode, the image sensor acquires image frames at a high frame rate, and the on-chip AE module included in the image sensor performs automatic exposure processing on the image frames acquired by the image sensor. Optionally, the on-chip AE module can implement automatic exposure processing of the image frames at a high frame rate.
In this way, the on-chip AE module can perform automatic exposure processing on an image frame immediately after the image sensor acquires it. In the prior art, by contrast, after the image sensor acquires image frames, the image frames must be transmitted to the application processor, which then processes them; the present scheme therefore effectively saves the transmission time of the image frames. In addition, the image sensor collects image frames at a high frame rate, and the processing speed of the on-chip AE module is greater than or equal to the speed at which the image sensor collects image frames, so high-speed automatic exposure can be realized and the automatic exposure time shortened.
Illustratively, as shown in fig. 8, in response to a received burst mode request, an image sensor in a software standby state starts a burst mode, starts capturing an image frame, and invokes an on-chip AE module to perform an automatic exposure process on the captured image frame.
Optionally, the speed of processing the image frames by the on-chip AE module in the image sensor is greater than the speed of collecting the image frames by the image sensor, so that the on-chip AE module can timely perform automatic exposure processing on the image frames collected by the image sensor. For example, the image sensor captures image frames at a speed of 120fps, and the processing speed of the on-chip AE module is greater than or equal to 120fps.
Therefore, by increasing the speed at which the image sensor captures image frames, the speed of automatic exposure can be increased. For example, if the image sensor captures image frames at 40 fps and the processing speed of the on-chip AE module is correspondingly 40 fps, the time required for the on-chip AE module to process one image frame is about 25 ms (the processing time of one image frame is 1/40 fps = 25 ms). If the speed at which the image sensor captures image frames is increased to 120 fps, and the processing speed of the on-chip AE module is correspondingly 120 fps, the time required for the on-chip AE module to process one image frame is reduced to about 8.3 ms (1/120 fps ≈ 8.3 ms).
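The per-frame processing times above are simply the reciprocal of the frame rate; a one-line sketch (the function name is ours):

```python
def frame_period_ms(fps: float) -> float:
    """Time available to process one frame at a given frame rate, in ms."""
    return 1000.0 / fps

# 40 fps gives 25 ms per frame; 120 fps gives roughly 8.3 ms per frame,
# matching the two cases in the text.
```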
Optionally, in the automatic exposure process shown in fig. 2 or fig. 4, the picture displayed on the display screen of the electronic device only approaches normal exposure for the current ambient light brightness after the initial AEC process completes, and the luminance difference between the picture displayed at the beginning and at the end of the initial AEC process is large. As a result, during the initial AEC process, overexposure or underexposure is visible while the displayed picture gradually converges from bright to normal exposure, or from dark to normal exposure. The scheme of the embodiment of the present application therefore improves the initial AEC process shown in fig. 2 or fig. 4, reducing the time required by the initial AEC process so as to increase the overall automatic exposure speed and avoid visible overexposure or underexposure of the displayed picture.
Optionally, in the initial automatic exposure control process (i.e., the initial AEC process), different picture brightness corresponds to different adjustment steps. The farther the picture brightness is from the middle value of the brightness variation range (that is, the brighter or darker the picture), the larger the corresponding adjustment step; the closer the picture brightness is to the middle value, the smaller the corresponding adjustment step. For example, when the picture brightness is 100 nit, the adjustment step may be 20 nit; when the picture brightness is 80 nit, the adjustment step may be 15 nit; when the picture brightness is 40 nit, the adjustment step may be 5 nit; when the picture brightness is 20 nit, the adjustment step may be 15 nit; and when the picture brightness is 5 nit, the adjustment step may be 20 nit. Thus, based on this relation between adjustment step and picture brightness, when the picture is very bright or very dark, the brightness is adjusted quickly with a large step, and when the picture brightness approaches the appropriate brightness, it is adjusted precisely with a small step.
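As an illustration, the brightness-to-step relation above can be sketched as a distance-based lookup. The mid-brightness value of 45 nit and the two distance thresholds below are assumptions chosen only so that the sketch reproduces the five sample values in the text; the patent does not specify the actual rule:

```python
def adjustment_step_nit(brightness_nit: float, mid_nit: float = 45.0) -> float:
    """Step grows with distance from an assumed mid brightness (45 nit).

    Thresholds are illustrative, picked to match the text's examples:
    100 nit -> 20, 80 nit -> 15, 40 nit -> 5, 20 nit -> 15, 5 nit -> 20.
    """
    d = abs(brightness_nit - mid_nit)
    if d >= 40:
        return 20.0   # far from mid brightness: coarse, fast adjustment
    if d >= 10:
        return 15.0   # moderately far: medium step
    return 5.0        # near the appropriate brightness: fine adjustment
```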
Optionally, in the initial automatic exposure control process (i.e., initial AEC process), after determining the adjustment step size based on the screen brightness, the electronic device may also adjust the adjustment step size based on the brightness difference between two consecutive frames of images. For example, in the case that the difference between the brightness of two continuous frames is large, that is, the brightness of the image frames does not tend to be stable, the adjustment step length can be increased so as to quickly adjust the brightness of the image frames; in the case that the difference of brightness of two continuous frames of images is small, that is, the brightness of the image frames tends to be stable, the adjustment step size can be reduced.
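A minimal sketch of this difference-based step adjustment; the scaling factors and the difference threshold are illustrative assumptions, not values from the text:

```python
def scaled_step(base_step: float, prev_brightness: float, cur_brightness: float,
                diff_threshold: float = 10.0, up: float = 1.5,
                down: float = 0.5) -> float:
    """Enlarge the step while brightness is still changing a lot between
    consecutive frames; shrink it once brightness is nearly stable."""
    diff = abs(cur_brightness - prev_brightness)
    return base_step * (up if diff > diff_threshold else down)
```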
Alternatively, the adjustment step in the initial automatic exposure control process may be greater than or equal to the adjustment step in the automatic exposure process of the related art. In this way, the above scheme can complete the automatic exposure process more rapidly with the same number of image frames or fewer.
For example, corresponding to the initial automatic exposure control process shown in fig. 2, as shown in fig. 8, in the initial automatic exposure control process of the embodiment of the present application, the on-chip AE module may determine the current ambient light brightness corresponding to the current image frame, and iteratively adjust the exposure parameters of the image sensor based on the adjustment step corresponding to the current ambient light brightness, so that the picture brightness of the finally adjusted image frame is close to the picture brightness of the previous image frame; that is, automatic convergence and stabilization of the ambient light brightness is achieved. For example, the on-chip AE module determines the picture brightness 1 and the exposure parameter 1 of the image frame 1 acquired by the image sensor, and determines the exposure parameter 1' based on the adjustment step corresponding to the picture brightness 1. Then, the on-chip AE module acquires the image frame 2 captured by the image sensor based on the exposure parameter 1', together with the picture brightness 2 and the exposure parameter 2 corresponding to the image frame 2. If the on-chip AE module determines that the difference between the picture brightness 1 and the picture brightness 2 is greater than a preset threshold, it continues to determine the exposure parameter 2' based on the adjustment step corresponding to the picture brightness 2, so as to obtain the image frame 3 acquired by the image sensor. If the on-chip AE module determines that the difference between the picture brightness 3 of the image frame 3 and the picture brightness 4 of the image frame 4 is less than or equal to the preset threshold, the exposure parameter may be determined to be the exposure parameter 4 corresponding to the image frame 4.
Finally, the on-chip AE module determines a target shooting mode based on a shooting mode designation instruction sent by the application processor, and converts the exposure parameter 4 into an exposure parameter 4 'corresponding to the target shooting mode, so that the image sensor acquires the image frame 5 based on the exposure parameter 4'. Therefore, the electronic equipment can rapidly complete the automatic exposure process, so that the ambient light brightness corresponding to the output image frame is converged stably, and the requirement of a target shooting mode is met.
In the initial automatic exposure control process described above, the on-chip AE module acquires the picture brightness 2 corresponding to the image frame 2, and then determines the adjustment step corresponding to the picture brightness 2. Then, if the on-chip AE module determines that the difference between the picture brightness 1 and the picture brightness 2 is large, for example greater than a preset difference, the adjustment step corresponding to the picture brightness 2 may be increased. If the on-chip AE module determines that the difference between the picture brightness 1 and the picture brightness 2 is small, for example smaller than the preset difference, the adjustment step corresponding to the picture brightness 2 may be reduced, so as to determine the exposure parameter 2'.
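Putting the pieces together, the iterative initial AEC loop described in this passage might be sketched as follows. Here `measure_brightness` stands in for the image sensor, and the target brightness, step-to-exposure gain, stop threshold, and iteration limit are all illustrative assumptions rather than values from the patent:

```python
def adjustment_step(brightness_nit: float, mid_nit: float = 45.0) -> float:
    """Brightness-dependent step, as in the earlier step-size examples."""
    d = abs(brightness_nit - mid_nit)
    return 20.0 if d >= 40 else (15.0 if d >= 10 else 5.0)

def initial_aec(measure_brightness, target_nit: float = 45.0,
                stop_threshold: float = 3.0, gain: float = 0.01,
                max_iters: int = 8):
    """Adjust the exposure parameter until the picture brightness of two
    consecutive frames differs by at most `stop_threshold` (convergence)."""
    exposure = 0.2                        # assumed initial exposure parameter
    prev = measure_brightness(exposure)   # picture brightness of frame 1
    for _ in range(max_iters):
        step = adjustment_step(prev)
        # Raise exposure if the frame is too dark, lower it if too bright
        exposure += step * gain * (1 if prev < target_nit else -1)
        cur = measure_brightness(exposure)
        if abs(cur - prev) <= stop_threshold:
            return exposure, cur          # brightness has converged
        prev = cur
    return exposure, prev
```

With a fake linear sensor such as `lambda e: 50.0 * e`, this sketch converges in five frames under these assumed values, which is consistent with the frame counts discussed below.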
In the above embodiment, if the processing speed of the on-chip AE module is 120 fps and the time required to process one image frame is about 8.3 ms, the initial automatic exposure control process takes about 41 ms (8.3 ms × 5 ≈ 41 ms). In the prior art, by contrast, the image sensor needs to transmit the acquired image frames to the application processor, and the application processor executes the initial automatic exposure control process, which takes a long time, for example about 160 ms. According to the automatic exposure method provided by the embodiment of the present application, an on-chip AE module capable of high-speed automatic exposure processing is preset in the image sensor, so a fast initial automatic exposure control process can be realized, exposure time is effectively saved, and user waiting caused by overexposure or underexposure is avoided.
It will be appreciated that the above example describes the automatic exposure process taking a processing speed of 120 fps for the on-chip AE module as an example; in this case, the time of 5 image frames is required to complete the automatic exposure process. If the processing speed of the on-chip AE module is greater than 120 fps, the automatic exposure time is further reduced. For example, in a general case, the automatic exposure process can be completed within no more than 6 image frames; that is, by the method of this embodiment, the automatic exposure process can be kept within 50 ms. Moreover, since the automatic exposure processing speed of the on-chip AE module is greater than that of the application processor, the time required for the automatic exposure process is less than in prior-art schemes.
As described above, after the initialization is completed, the application processor sends a burst mode request to the image sensor, so that the image sensor performs image acquisition in a high-speed mode, and the on-chip AE module on the image sensor can perform automatic exposure processing on the acquired image frames in time, thereby improving the speed of automatic exposure.
The following describes the process of the automatic exposure method provided in the embodiment of the present application in detail. Fig. 9 is a schematic flow chart of an automatic exposure method according to an embodiment of the present application. As shown in fig. 9, the method includes the following steps.
S901, the application processor controls the image sensor to power on.
Optionally, the electronic device is configured with one or more cameras, one camera being configured with one or more image sensors. In response to a user indicating an operation to launch a camera application, the camera application launches a corresponding camera through an application processor to trigger an image sensor in the camera to power up.
The application processor sends a camera opening request to the camera service. The camera opening request includes the identification of the image sensor to be controlled. The camera service sends a channel establishment request to the HAL based on the identification of the image sensor; the channel establishment request is used to create an image acquisition channel in the software architecture layer. Based on the image acquisition channel, the application processor can control the image sensor to acquire images and obtain the image frames sent by the image sensor. The HAL sends an opening request for the image sensor to the camera driver based on the identification of the image sensor carried in the channel establishment request, with the identification of the image sensor carried in the opening request. The camera driver controls the image sensor corresponding to the identification in the camera to power up according to the identification of the image sensor in the opening request.
Optionally, the application processor may determine, according to a trigger event that invokes the camera application, a target image sensor to be controlled included in the camera opening request, and further determine an identifier of the target image sensor, so as to implement starting the target image sensor in the above manner. For example, in response to a click operation of a camera application icon by a user, the application processor may determine that a camera indicated by a trigger event corresponding to the operation is a rear camera of the electronic device, and further determine that the target image sensor is an image sensor in the rear camera. In response to an operation of starting a camera application program by user voice, the application processor can determine that a camera indicated by a triggering event corresponding to the operation is a front-end camera of the electronic device, and further determine that a target image sensor is an image sensor in the front-end camera. It will be appreciated that the correspondence between the triggering event and the target image sensor has been preset in the electronic device.
In some embodiments, after the image sensor is powered on, the application processor determines whether mode parameters corresponding to different shooting modes need to be written into a register of the image sensor. The mode parameter may include one or more of resolution, size of a display screen of the electronic device, and frame rate at which the image sensor captures images.
For example, the application processor determines whether the image sensor is being powered on for the first time. If so, the application processor controls the image sensor to power on and start, and writes the mode parameters corresponding to the different shooting modes into the register of the image sensor. If not, the application processor only controls the image sensor to power on and start.
For another example, the application processor determines whether the mode parameters in the registers of the image sensor need to be updated. If the image sensor needs to be updated, the application processor controls the image sensor to be powered on and started, and mode parameters corresponding to the shooting mode needing to be updated are written into a register of the image sensor.
In some examples, the shooting modes corresponding to the image sensors in different cameras are the same or different. Then the mode parameters to be written in the different image sensors are the same or different.
S902, the application processor sends initialization setting parameters to the image sensor.
In some embodiments, the application processor, upon determining that the image sensor is powered up, may send initialization setting parameters to the image sensor to trigger an initialization process of the image sensor.
Illustratively, as shown in fig. 8, the application processor, upon determining that the image sensor is powered up, triggers an initialization process for the image sensor so that the image sensor switches to a software standby state.
Optionally, a callback may also be registered in the process of the application processor sending the initialization setting parameters to the image sensor. The registered callback enables the image sensor, after switching to the software standby state based on the initialization setting parameters, to feed back an initialization state switching result to the application processor.
In other embodiments, the application processor may write the initialization setting parameters to the target image sensor or to a register corresponding to the target image sensor. After that, the target image sensor can directly acquire the initialization setting parameters after power-on.
For example, in S901 described above, the application processor determines whether to control the image sensor to be powered on for the first time. If yes, the application processor controls the image sensor to be powered on and started, and initialization setting parameters are written into a register of the image sensor. Accordingly, the image sensor can read the initialization setting parameters from the register and complete the initialization process.
For another example, the application processor determines whether the initialization setting parameters in the registers of the image sensor need to be updated. If the image sensor needs to be updated, the application processor controls the image sensor to be powered on and started, and initialization setting parameters needing to be updated are written into a register of the image sensor.
S903, the image sensor switches to a software standby state based on the initialization setting parameter.
In some embodiments, the image sensor completes the initialization process based on the acquired initialization parameters, and switches to a software standby state in order to receive and respond to a control request sent by the application processor. For example, the image sensor acquires image frames in response to a burst mode request transmitted from the application processor, and performs an auto-exposure process on the acquired image frames.
S904, the application processor sends a burst mode request to the image sensor.
In some embodiments, the application processor may send a burst mode request to the image sensor after controlling the image sensor to power up, the burst mode request being used to instruct the image sensor to start acquiring image frames, and to perform an automatic exposure process on the acquired image frames after the image sensor acquires the image frames.
Optionally, the application processor receives an initialization state switching result fed back by the image sensor. Then, the application processor determines whether the image sensor is successfully switched to the software standby state according to the initialization state switching result. If the image sensor has successfully switched to the software standby state, the application processor may send a burst mode request to the image sensor. If the image sensor fails to switch to the software standby state, the application processor may return to execute S902, and trigger the image sensor to execute the initialization process again.
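The retry handshake described in this step can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; all class, function, and state names (`FlakySensor`, `bring_up_sensor`, `STANDBY_OK`, the retry limit) are hypothetical.

```python
# Hypothetical sketch of the S902-S904 handshake: the application processor
# retries initialization until the sensor reports software standby, then
# sends the burst mode request. All names are illustrative assumptions.

STANDBY_OK = "software_standby"

def bring_up_sensor(sensor, max_retries: int = 3) -> bool:
    """Return True once the sensor reaches standby and the burst request is sent."""
    for _ in range(max_retries):
        result = sensor.send_init_params()       # S902: triggers initialization
        if result == STANDBY_OK:                 # callback-style switching result
            sensor.send_burst_mode_request()     # S904
            return True
        # switch failed: fall through and trigger initialization again (S902)
    return False

class FlakySensor:
    """Toy sensor that fails its first initialization attempt."""
    def __init__(self):
        self.attempts = 0
        self.burst_requested = False

    def send_init_params(self):
        self.attempts += 1
        return STANDBY_OK if self.attempts >= 2 else "init_failed"

    def send_burst_mode_request(self):
        self.burst_requested = True

s = FlakySensor()
print(bring_up_sensor(s), s.attempts)  # True 2
```

The retry bound is an assumption; the patent only states that on failure the application processor returns to S902 and triggers initialization again.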
Optionally, a callback may also be registered during the process of sending the burst mode request to the image sensor by the application processor. The callback is registered to feed back the image frame acquisition result or the image frame after the automatic exposure processing to the application processor after the image sensor is switched to the burst mode and starts to acquire the image frame.
S905, the image sensor switches to burst mode and starts to acquire image frames.
In some embodiments, the image sensor switches to burst mode in response to the received burst mode request. Thereafter, in burst mode, the image sensor acquires image frames at a high frame rate. Here, a high frame rate may refer to a capture frame rate greater than or equal to 40 fps; the specific threshold for a high frame rate may be determined according to the actual application and is not limited herein.
Optionally, the image frames acquired by the image sensor may not be output immediately to the display screen of the electronic device for display; instead, the following S906 is performed first, applying automatic exposure processing to the acquired image frames.
S906, the image sensor performs automatic exposure processing on the image frames acquired by the image sensor through the on-chip AE module.
In some embodiments, the on-chip AE module performs an automatic exposure process on the image frames acquired by the image sensor. Alternatively, the on-chip AE module can implement automatic exposure processing at a high frame rate for the image frames.
Optionally, the rate at which the on-chip AE module performs automatic exposure processing on image frames is greater than or equal to the rate at which the image sensor acquires image frames.
In some examples, as described in the various embodiments above, after the image sensor is switched to the burst mode, the on-chip AE module in the image sensor may first perform an automatic exposure process based on the burst mode. And after the image sensor acquires the target shooting mode indicated by the application processor, an on-chip AE module in the image sensor can convert exposure parameters obtained by the automatic exposure processing into exposure parameters corresponding to the target shooting mode on the basis of executing the automatic exposure processing based on the burst mode. Therefore, the image sensor can acquire images based on exposure parameters corresponding to the target shooting mode, and then the image frames corresponding to the target shooting mode can be directly displayed in the display screen of the electronic equipment. That is, as shown in fig. 10, S906 includes S906a and S906b described below. Wherein S906a is used for implementing an automatic exposure processing procedure based on a burst mode, and S906b is used for implementing a procedure of converting an exposure parameter obtained by the automatic exposure processing into an exposure parameter corresponding to a target shooting mode. S906a includes S1001 and S906b includes S1002-S1006.
S1001, the image sensor performs automatic exposure processing based on the burst mode to obtain an exposure parameter A.
In some embodiments, the image sensor receives a burst mode request sent by the application processor and begins acquiring image frames based on the burst mode request. In the burst mode, the on-chip AE module performs automatic exposure processing on the acquired image frames based on a preset burst mode. The on-chip AE module may perform automatic exposure processing based on an image frame acquired by the image sensor, and obtain a corresponding exposure parameter a based on a result of the automatic exposure processing.
In the prior art, after the image sensor switches to the software standby state, the application processor must first control the image sensor to complete a mode switch; the image sensor then acquires images in the switched mode and sends the acquired image frames to the application processor, which only then performs automatic exposure processing on them. In contrast, with the automatic exposure method provided by the embodiments of this application, the image sensor can begin image acquisition directly after receiving the burst mode request sent by the application processor, and can perform automatic exposure processing on the acquired image frames directly through the on-chip AE module, effectively increasing the automatic exposure speed.
Optionally, during the burst-mode automatic exposure process, the on-chip AE module may obtain the current ambient light brightness A based on an image frame acquired by the image sensor. In the prior art, the application processor determines the current ambient light brightness through an ambient light sensor; the electronic device in the embodiments of this application can obtain the current ambient light brightness without an ambient light sensor, reducing the hardware space occupied inside the electronic device. The freed hardware space may then be used in other ways. For example, after the ambient light sensor beside the front-facing camera is eliminated, the area occupied by the front-facing camera in the front panel of the electronic device (e.g., a mobile phone) can be correspondingly reduced, and the display screen in the front panel can be correspondingly enlarged. With the total area of the front panel unchanged, the screen-to-body ratio of the mobile phone is correspondingly improved.
In some embodiments, after the image sensor switches to burst mode and starts acquiring image frames, the on-chip AE module may analyze the image frames acquired by the image sensor to directly obtain the ambient light brightness corresponding to an image frame as the current ambient light brightness A. For example, if the image acquired by the image sensor is an RGB color image, it contains luminance information in the RGB color space, and the on-chip AE module may convert the RGB color space to the YUV color space. Here, YUV is a color encoding method in which Y represents luminance (luma) and U and V describe the chrominance. In general, the Y component of the YUV color space contains about 93% of the image luminance information, so the on-chip AE module may determine the current ambient light brightness A based on the Y component after conversion. Other methods of obtaining the current ambient light brightness from an image frame may also be used; for example, the R, G, and B color channels of the image frame's RGB color space can each be counted to obtain a color histogram, from which the distribution of pixel brightness in the image frame can be seen intuitively. The embodiments of this application do not limit the manner of determining the current ambient light brightness.
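The RGB-to-luminance conversion described above can be sketched as follows. This is a minimal sketch assuming the standard BT.601 luma weights; the patent does not specify which conversion coefficients the on-chip AE module uses, and the function name and the mean-of-Y brightness metric are illustrative assumptions.

```python
import numpy as np

# BT.601 luma weights (an assumption): Y = 0.299 R + 0.587 G + 0.114 B.
LUMA_WEIGHTS = np.array([0.299, 0.587, 0.114])

def estimate_ambient_brightness(rgb_frame: np.ndarray) -> float:
    """Estimate scene brightness as the mean luma of an HxWx3 RGB frame (0-255)."""
    y = rgb_frame.astype(np.float64) @ LUMA_WEIGHTS  # per-pixel Y channel
    return float(y.mean())

# A uniform mid-gray frame should come out near 128.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
print(round(estimate_ambient_brightness(frame)))  # 128
```

The histogram alternative mentioned in the text would replace the mean with per-channel counts (e.g., `np.bincount` over each of R, G, B) to inspect the brightness distribution rather than a single value.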
In some examples, after determining the current ambient light brightness based on the current image frame, the on-chip AE module may iteratively adjust an exposure parameter corresponding to the current image frame based on the burst mode, so that the current ambient light brightness corresponding to the image frame collected in the subsequent automatic exposure process tends to be stable, thereby completing automatic exposure convergence. And the on-chip AE module can determine an exposure parameter A corresponding to the last frame of image in the automatic exposure processing process, so that the exposure parameter A can be subsequently converted into an exposure parameter corresponding to a target shooting mode, and the image frame output after the subsequent mode conversion meets the requirement of the ambient light brightness and the requirement of the target shooting mode. Alternatively, the details of the automatic exposure processing performed by the image sensor based on the burst mode may refer to the following relevant contents S1202 to S1204, which will not be described herein.
Optionally, after determining that the burst-mode automatic exposure processing is complete, the on-chip AE module may send the last determined image frame to the application processor so that the image frame is displayed on the display screen. As shown in fig. 8, when the on-chip AE module determines that the difference between the brightness of the 3rd frame image and the brightness of the 4th frame image is smaller than a preset threshold, it can determine that the ambient light brightness corresponding to the acquired image frames has stabilized, so subsequent image frames will not be overexposed or underexposed. Thus, the electronic device can directly display the 4th frame image on the display screen.
In some examples, after the image sensor determines that the brightness of the ambient light corresponding to the acquired image frame is stable, S1002-S1006 described below may be executed to convert the exposure parameter corresponding to the burst mode into the exposure parameter corresponding to the target shooting mode, so that the image frame finally displayed by the electronic device meets the user requirement.
S1002, an application processor sends a shooting mode designating instruction to an image sensor.
In some embodiments, corresponding to S905 above, the application processor receives the image frame acquisition result returned by the image sensor, and determines that the image sensor has started to acquire the image frame according to the image frame acquisition result. Then, the application processor may transmit a photographing mode designation instruction to an on-chip AE module in the image sensor according to the target photographing mode. The target shooting mode may be determined according to a setting by a user, or may be determined according to a history shooting mode record of a camera, which is not limited herein.
For example, the user may set the camera to be turned on to directly start the portrait shooting mode, and then the application processor transmits a shooting mode designation instruction corresponding to the portrait shooting mode to the on-chip AE module in response to the image frame acquisition result.
For another example, the shooting mode before the camera is turned off is a night view shooting mode. Then, the application processor transmits a shooting mode designation instruction corresponding to the night scene shooting mode to the on-chip AE module in response to the image frame acquisition result.
Alternatively, the application processor may also send a shooting mode designation instruction to the on-chip AE module according to the operation of the user.
For example, as shown in fig. 11, in response to an operation of clicking the portrait shooting mode control 111 by the user, the application processor transmits an instruction for designating the shooting mode as the portrait shooting mode to the on-chip AE module. Or, in response to a user clicking the night scene shooting mode control, the application processor sends an instruction for designating the shooting mode as the night scene shooting mode to the on-chip AE module.
In some examples, the image sensor may receive a photographing mode designation instruction transmitted from the application processor in the course of performing the auto-exposure process based on the burst mode. That is, the image sensor may perform S1002 in the process of performing S906 a. Or after the image sensor completes the automatic exposure processing based on the burst mode, receiving a shooting mode designating instruction sent by the application processor. That is, the image sensor may perform S1002 after performing S906 a. The embodiments of the present application are not limited in this regard.
S1003, the image sensor determines a target shooting mode indicated by the shooting mode designation instruction.
In some embodiments, an on-chip AE module in the image sensor may determine a target shooting mode indicated by the application processor after receiving the shooting mode specification instruction.
S1004, the image sensor converts the exposure parameter a into an exposure parameter B corresponding to the target shooting mode based on the target shooting mode.
In some embodiments, after determining the target shooting mode, the on-chip AE module may obtain, according to the target shooting mode, a mode parameter corresponding to the target shooting mode from mode parameters that are pre-burned in the on-chip AE module. Then, the on-chip AE module may convert the exposure parameter a into the exposure parameter B based on the mode parameter corresponding to the target photographing mode, so that the image sensor may acquire the target image frame according to the exposure parameter B corresponding to the target photographing mode after completing the fast automatic exposure process.
Optionally, the mode parameters corresponding to different shooting modes differ. For example, in the video shooting mode the image sensor must acquire images at high speed to keep the video picture smooth, while in the portrait mode for photo shooting the requirement on the image acquisition speed is not high. Therefore, the image acquisition frame rate in the mode parameters corresponding to the video shooting mode is greater than that in the mode parameters corresponding to the portrait mode. For example, the mode parameters corresponding to the portrait shooting mode may include a frame rate of 20 fps. After receiving the portrait shooting mode designation instruction sent by the application processor, the on-chip AE module can directly read the 20 fps frame rate corresponding to the portrait shooting mode from the pre-stored mode parameters and convert exposure parameter A into exposure parameter B based on that frame rate.
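One plausible form of such a conversion can be sketched as follows. The patent leaves the conversion rule to the art (see the next paragraph), so this sketch rests on an assumed rule: cap the exposure time at the target mode's frame period and fold any excess into analog gain so the total exposure (time x gain) is preserved. The function name and numeric values are hypothetical.

```python
# Hedged sketch of converting exposure parameter A (burst mode) into exposure
# parameter B (target mode): clamp exposure time to the target frame period
# and compensate with gain so that (time x gain) is unchanged. Illustrative only.

def convert_exposure(exp_time_ms: float, gain: float, target_fps: float):
    frame_period_ms = 1000.0 / target_fps
    if exp_time_ms <= frame_period_ms:
        return exp_time_ms, gain
    # Exposure no longer fits in one frame: clamp time, raise gain to compensate.
    new_gain = gain * (exp_time_ms / frame_period_ms)
    return frame_period_ms, new_gain

# A 4 ms burst-mode exposure fits in the 20 fps (50 ms) frame period unchanged.
print(convert_exposure(4.0, 2.0, 20.0))    # (4.0, 2.0)
# A 100 ms exposure exceeds the 50 ms period: time is clamped, gain doubles.
print(convert_exposure(100.0, 2.0, 20.0))  # (50.0, 4.0)
```

A real conversion would also respect sensor limits such as maximum analog gain and minimum exposure time, which are omitted here for brevity.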
Alternatively, the manner of converting the exposure parameter a into the exposure parameter B may be referred to in the art, and embodiments of the present application are not limited herein.
Mode parameters corresponding to different shooting modes are pre-burned into the on-chip AE module. Therefore, the application processor only needs to send the shooting mode designation instruction to the on-chip AE module, and the on-chip AE module can directly obtain the mode parameters corresponding to that instruction, saving data transmission time.
S1005, the image sensor acquires a target image frame based on the exposure parameter B.
In some embodiments, during the burst-mode automatic exposure processing, the on-chip AE module may determine that the automatic exposure processing in burst mode is complete once the current ambient light brightness corresponding to the image frames has stabilized, and obtain exposure parameter A. Then, exposure parameter A is converted into exposure parameter B corresponding to the target shooting mode based on that mode, and the target image frame is acquired based on exposure parameter B.
As illustrated in fig. 8, during the burst-mode automatic exposure process, the on-chip AE module determines that the ambient light brightness corresponding to image frame 3 is the same as or similar to that of image frame 4, and that the ambient light brightness of image frame 4 is the same as or similar to that of image frame 5. For example, the on-chip AE module determines that the difference between the brightness of image frame 3 and that of image frame 4 is less than a preset threshold, and that the difference between the brightness of image frame 4 and that of image frame 5 is also less than the preset threshold. The on-chip AE module can then determine that the ambient light brightness corresponding to the image frames has stabilized and complete the automatic exposure process. Next, the on-chip AE module may convert the exposure parameter corresponding to image frame 5, based on the target shooting mode indicated by the application processor, to obtain exposure parameter B corresponding to the target shooting mode. The image sensor acquires image frame 6 based on exposure parameter B.
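The convergence test in this fig. 8 example (two consecutive frame-to-frame brightness differences below a preset threshold) can be sketched as follows. The function name, the brightness values, and the threshold are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of the convergence test described for fig. 8: AE is considered
# converged once the brightness difference between consecutive frames stays
# below a preset threshold for two comparisons in a row (frames 3-4 and 4-5).

def ae_converged(brightness_history, threshold: float, stable_pairs: int = 2) -> bool:
    """brightness_history: per-frame mean brightness values, oldest first."""
    diffs = [abs(b - a) for a, b in zip(brightness_history, brightness_history[1:])]
    return len(diffs) >= stable_pairs and all(d < threshold for d in diffs[-stable_pairs:])

# Frames 1-5 in the example: large early swings, then frames 3, 4, 5 settle.
history = [40.0, 90.0, 118.0, 120.0, 121.0]
print(ae_converged(history, threshold=5.0))       # True
print(ae_converged(history[:3], threshold=5.0))   # False (|118 - 90| >= 5)
```

Requiring two stable pairs rather than one mirrors the text's comparison of both the 3-4 and the 4-5 frame pairs before declaring the ambient light brightness stable.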
S1006, the image sensor sends the target image frame to the application processor.
In some embodiments, after completing the automatic exposure processing of the image frames so that the image sensor acquires the target image frame, the on-chip AE module may send the target image frame to the application processor. For example, as shown in fig. 8, the on-chip AE module may determine that image frame 6 is the target image frame. The image sensor then sends image frame 6 to the application processor. Accordingly, after receiving image frame 6, the application processor can control the display screen to display it.
In this way, a normally exposed target image frame, as shown in (b) of fig. 3, can be displayed on the display screen of the electronic device, sparing the display screen the gradual transition from displaying an overexposed image frame as shown in (a) of fig. 3, or an underexposed image frame as shown in (c) of fig. 3, to displaying a normally exposed image as shown in (b) of fig. 3. This avoids the situation in which the user cannot clearly see the picture on the display screen during this transition and must wait a long time before the displayed picture becomes clear, which would degrade the user experience.
Alternatively, as in S1001 described above, the image sensor may send image frames to the application processor while performing the burst-mode automatic exposure process. That is, during the burst-mode automatic exposure process, the display screen may display the image frames produced by that process. Because the image sensor performs burst-mode automatic exposure processing on image frames at high speed, when the image frames involved in the automatic exposure process are shown on the display screen, the time needed to change from an overexposed image frame to a normally exposed one is, as calculated in the above embodiments, about 50 ms or less, so the user does not wait long.
In addition, the automatic exposure of image frames can be completed in a short time. Since the persistence of vision of the human eye is generally considered to be 0.05 seconds, i.e., 50 ms, the display screen of the electronic device switches from an overexposed image frame as shown in (a) of fig. 3, or an underexposed image frame as shown in (c) of fig. 3, to a normally exposed target image frame as shown in (b) of fig. 3 within about 50 ms, and the user essentially cannot perceive the picture gradually changing from overexposed or underexposed to normally exposed during the automatic exposure process.
Optionally, as shown in fig. 8, the application processor may further perform refined automatic exposure control on the image frame acquired by the image sensor after the acquisition of the target image frame, and the specific implementation manner may be referred to above, which is not described herein.
Fig. 12 is a flowchart of an automatic exposure method according to an embodiment of the present application. As shown in fig. 12, the method includes S1201-S1205.
S1201, in response to a first operation by the user, the electronic device activates the image sensor.
The first operation triggers the launch of a first application program; for example, the first application program is a camera application. In response to the first operation, the first application program triggers the activation of the image sensor. The first operation is one or more of: an operation on the icon of the first application program, an operation instructing that the first application program be switched to the foreground, and an operation launching the first application program through a second application program.
For example, as shown in fig. 1 (a), the electronic device may activate the image sensor upon detecting the user's click on the camera application icon. For another example, as shown in fig. 13 (a), the electronic device displays an interface of an information application (i.e., the second application program); when it detects the user performing a bottom-up swipe on that interface, it displays a multi-task interface as shown in fig. 13 (b), in which switching among multiple applications can be performed. The electronic device displays the card corresponding to the camera application in the multi-task interface and, as shown in fig. 13 (c), detects the user clicking that card, thereby determining that the user has instructed a switch to the camera application. The electronic device may then switch the camera application from background to foreground, trigger activation of the image sensor, and display the interface shown in fig. 13 (d). For yet another example, in response to the user clicking a control in the second application's display interface for launching the first application, the electronic device may launch the first application corresponding to that control, display the first application's interface, and trigger activation of the image sensor.
Alternatively, the first operation may be one or more of the above several operations, and the specific case may be determined based on the actual application scenario.
In some embodiments, as described in S901-S904 above, the electronic device, in response to the user's first operation, controls the image sensor to power up and triggers its initialization process through the application processor. During initialization, the application processor sends an initialization setting instruction to the image sensor to switch it to the software standby state. After the application processor determines that the image sensor has switched to the software standby state, it may send a burst mode request to the image sensor. Accordingly, the image sensor starts the burst mode based on the burst mode request and, in the following steps, acquires image frames at a high frame rate; the on-chip AE module included in the image sensor performs automatic exposure processing on the acquired image frames.
Optionally, other contents of S1201 may refer to the related contents of S901-S904, which are not described herein.
S1202, the electronic device acquires an Nth frame image through the image sensor according to a preset first mode, and determines a first brightness and a first exposure parameter of the Nth frame image, where N is a positive integer.
Wherein the first mode is a burst mode.
In some embodiments, after the electronic device triggers the image sensor to start, the image sensor may be instructed to collect an image frame according to a preset burst mode, and an on-chip AE module in the image sensor may be instructed to perform automatic exposure processing on the image frame according to the preset burst mode.
Optionally, during the automatic exposure process, the on-chip AE module may determine, according to a preset method, a current ambient light level (e.g., a first brightness) corresponding to the image frame. The on-chip AE module may then determine an exposure parameter (e.g., a first exposure parameter) corresponding to the current ambient light level, the exposure parameter including an exposure value and/or a gain value.
Illustratively, as shown in fig. 8, the image sensor acquires image frames, such as image frame 1, in burst mode by an on-chip AE module. The on-chip AE module may then determine the RGB color space of image frame 1 and convert the RGB color space to a YUV color space. Then, the AE module on chip may determine luminance information indicated in the YUV color space, determine a current ambient light level corresponding to the image frame 1, and may determine an exposure value and/or a gain value corresponding to the current ambient light level.
Here, the exposure value generally refers to the exposure time, i.e., the time interval from shutter opening to shutter closing. The longer the exposure time, the more light enters the image sensor, and thus the brighter the image captured by the image sensor; conversely, as the exposure time decreases, the image brightness also decreases. Before being converted into a digital signal, the analog electrical signal produced by the image sensor passes through a module called automatic gain control (AGC), which amplifies the input signal so that the output signal strength meets the final image brightness requirement. When the incident light energy is small and the aperture and exposure time settings cannot meet the exposure requirement, the signal gain can be increased to raise the image brightness; conversely, the signal gain can be reduced to lower the image brightness.
Optionally, the aperture size of the image sensor also affects image brightness. The aperture size refers to the size of the opening in the image sensor's lens through which light can enter. When the lens aperture is enlarged, the amount of light energy reaching the interior of the image sensor through the aperture also increases, so the brightness of the image collected by the image sensor increases; correspondingly, when the lens aperture is reduced, the brightness of the collected image decreases. Because the aperture size of the image sensor in electronic devices such as mobile phones and tablet computers is generally fixed, only the exposure time and the signal gain can be adjusted.
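The exposure-time-first, gain-second trade-off described above can be sketched with a toy model in which brightness scales with exposure time and AGC gain while the aperture is fixed. The function name, the linear brightness model, and all numeric limits are illustrative assumptions, not taken from the patent.

```python
# Toy model of the relationship described above: captured brightness scales
# with exposure time and AGC gain (aperture fixed). Given a brightness target,
# prefer longer exposure up to a cap, then cover the remainder with gain.
# The linear model and all constants are illustrative assumptions.

def plan_exposure(scene_light: float, target_brightness: float,
                  max_exp_ms: float = 30.0, max_gain: float = 16.0):
    """Return (exposure_ms, gain) assuming brightness ~ scene_light * exp_ms * gain."""
    needed = target_brightness / scene_light      # required exp_ms * gain product
    exp_ms = min(needed, max_exp_ms)              # use exposure time first
    gain = min(needed / exp_ms, max_gain)         # make up the rest with AGC gain
    return exp_ms, gain

# Bright scene: exposure time alone suffices, gain stays at 1.
print(plan_exposure(scene_light=10.0, target_brightness=120.0))  # (12.0, 1.0)
# Very dark scene: exposure time and gain both saturate at their caps.
print(plan_exposure(scene_light=0.1, target_brightness=120.0))   # (30.0, 16.0)
```

Raising gain amplifies sensor noise along with the signal, which is why this sketch exhausts exposure time before touching gain; real AE tuning balances the two per mode.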
Thus, the image sensor can perform automatic exposure processing directly on an image frame after acquiring it. Compared with the cumbersome prior-art process in which the image sensor must send the acquired image frames to the application processor for automatic exposure processing, this effectively improves automatic exposure efficiency.
In addition, the image sensor determines the current ambient light brightness directly from the acquired image frames. Compared with the prior art, no ambient light sensor is needed to acquire the ambient light brightness, which saves the hardware space an ambient light sensor would occupy; this space can be used to reduce the volume of the electronic device or be repurposed in other ways.
S1203, the electronic device acquires an (N+1)th frame image through the image sensor based on the first exposure parameter, and determines a second brightness and a second exposure parameter of the (N+1)th frame image.
In some embodiments, in S1202 and S1203, the electronic device performs an automatic exposure process on the acquired image frame in the first mode by the image sensor. In the automatic exposure processing process, an on-chip AE module in the image sensor determines the ambient light brightness corresponding to the image frame acquired by the image sensor, adjusts exposure parameters corresponding to the ambient light brightness, and determines exposure parameters required for acquiring the next frame of image, so that the convergence of the ambient light brightness corresponding to the acquired next frame of image is realized.
For example, the electronic device determines, through the image sensor, a first adjustment step corresponding to the first brightness of the currently acquired image frame (such as the Nth frame image), and determines a fourth exposure parameter from the first exposure parameter according to the first adjustment step. The electronic device may then acquire the next frame image (such as the (N+1)th frame image) through the image sensor in the first mode using the fourth exposure parameter.
In some examples, the on-chip AE module is preset with adjustment steps corresponding to ambient light brightness levels, which may be set according to shooting experiments or experience. The on-chip AE module may adjust the determined first exposure parameter based on the adjustment step corresponding to the current ambient light brightness to determine the fourth exposure parameter. The image sensor can then acquire the next image, the (N+1)th frame, based on the fourth exposure parameter. In this way, by adjusting the exposure parameters, the on-chip AE module adjusts the ambient light brightness corresponding to the image frames in steps, so that the brightness tends toward a stable value, achieving automatic convergence of the ambient light brightness corresponding to the image frames.
For example, as shown in fig. 8, the image sensor acquires the image frame 1, and the on-chip AE module in the image sensor determines the brightness 1 and the exposure parameter 1 corresponding to the image frame 1 (i.e., performs S1202). The on-chip AE module adjusts the exposure parameter 1 and determines the exposure parameter 1 'based on the adjustment step corresponding to the brightness 1, so that the image sensor acquires the image frame 2 based on the exposure parameter 1'. The on-chip AE module continues to determine the brightness 2 and the exposure parameter 2 corresponding to the image frame 2 (i.e., performs S1203). In the process, the corresponding ambient light brightness of the image frame is converged by adjusting the step length, so that the corresponding ambient light brightness of the acquired new image frame tends to be stable.
Also by way of example, after determining the adjustment step based on the picture brightness, the electronic device may further adapt the adjustment step based on the brightness difference between two consecutive frames. For example, as in the example above, after the on-chip AE module determines brightness 1 corresponding to image frame 1, it determines the adjustment step corresponding to brightness 1. Then, if the on-chip AE module can acquire the brightness 1' of the image frame 1' preceding image frame 1, it may further determine whether the difference between brightness 1 and brightness 1' is greater than a preset difference. If the difference is greater than the preset difference, the brightness of the image frames has not yet stabilized, and the adjustment step corresponding to brightness 1 may be increased; if the difference is not greater than the preset difference, the brightness of the image frames is tending toward stability, and the adjustment step corresponding to brightness 1 may be decreased. By adapting the adjustment step through the brightness difference between two consecutive frames, the ambient light brightness corresponding to the image frames converges based on the adjustment step, and the brightness of newly acquired image frames tends to be stable.
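The increase/decrease rule for the step can be sketched as below. The scale factors (doubling and halving) and the preset difference value are hypothetical; the patent only specifies that the step is increased while consecutive frames differ strongly and decreased once they settle.

```python
# Sketch of the step-size adaptation described above. The factors 2.0 /
# 0.5 and PRESET_DIFF are illustrative assumptions, not from the patent.

PRESET_DIFF = 5.0  # nit; hypothetical preset difference

def adapt_step(step, brightness, prev_brightness):
    """Widen the step while brightness is still swinging, narrow it once settled."""
    if prev_brightness is None:                    # no earlier frame to compare
        return step
    if abs(brightness - prev_brightness) > PRESET_DIFF:
        return step * 2.0                          # not stable: larger step
    return step * 0.5                              # settling: smaller step
```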
In some examples, the electronic device may display the (n+1) -th frame image directly during the auto-exposure process, thereby enabling faster display of the image after the image sensor is activated. Therefore, the starting speed of the application program can be effectively improved, and the long waiting time of a user is avoided.
For example, as shown in fig. 8, the (n+1)-th frame image acquired by the image sensor is image frame 4. Although the on-chip AE module has not yet completed the automatic exposure process at this point, the electronic device may still directly display image frame 4. As described above, with the automatic exposure method provided by the embodiments of the present application, the automatic exposure process can be kept within 50 ms, so after obtaining the (n+1)-th frame image the electronic device takes less than 50 ms to complete automatic exposure. Since the persistence of vision of the human eye is generally taken to be about 0.05 seconds, i.e., 50 ms, when the electronic device starts displaying from the (n+1)-th frame image, the user is substantially unaware of any subsequent change in the brightness of the image frames.
S1204, the electronic device determines that the difference value between the second brightness and the first brightness meets a preset condition.
The preset condition includes that the difference between the second brightness and the first brightness is less than or equal to a preset threshold for m consecutive times, where m is a positive integer.
In some embodiments, after determining the ambient light brightness corresponding to the current image frame during the automatic exposure process, the electronic device may determine the difference between that brightness and the ambient light brightness corresponding to the previous image frame. If the difference is less than or equal to a preset threshold, it may be determined that the automatic exposure process is completed, and the following S1205 is performed. Otherwise, if the difference is greater than the preset threshold, the automatic exposure process is not completed, the process returns to S1202, and image frames continue to be acquired and processed in the first mode. Alternatively, the electronic device may determine that the automatic exposure process is completed only after the brightness difference has been less than or equal to the preset threshold for m consecutive times.
For example, after the image sensor acquires the (n+1)-th frame image based on the first exposure parameter, the on-chip AE module may determine the second brightness corresponding to the acquired (n+1)-th frame image, and then determine whether the difference between the second brightness and the first brightness corresponding to the previously acquired N-th frame image is less than or equal to a preset threshold. If the difference is greater than the preset threshold, the brightness gap between the two frames is still large, further automatic exposure of the image frames is required, and the process may return to S1202. Conversely, if the difference is less than or equal to the preset threshold, the brightness of the two image frames is the same or similar, which may indicate that the automatic exposure process is completed.
As shown in fig. 8, if the first brightness corresponding to the image frame 4 is 4nit and the second brightness corresponding to the image frame 5 is 5nit during the automatic exposure process, the electronic device determines that the difference between the second brightness and the first brightness is 1nit through the image sensor. If the preset threshold is 2nit, the electronic device determines that the brightness difference value of the continuous image frames is smaller than the preset threshold, and can determine that the automatic exposure process is completed.
As another example, as shown in fig. 8, suppose during the automatic exposure process that brightness 1 corresponding to image frame 1 is 50 nit and brightness 2 corresponding to image frame 2 is 48 nit. The electronic device determines through the image sensor that the difference between brightness 2 and brightness 1 is 2 nit. With a preset threshold of 2 nit, this is the first consecutive comparison at or below the threshold, and automatic exposure processing continues through the image sensor. Next, if brightness 3 corresponding to image frame 3 is 30 nit, the difference between brightness 3 and brightness 2 is 18 nit, which is greater than the preset threshold, so the count of consecutive in-threshold comparisons restarts. The electronic device then determines through the image sensor that brightness 4 corresponding to image frame 4 is 31 nit, and the difference between brightness 4 and brightness 3 is 1 nit, again at or below the preset threshold. Finally, the electronic device determines through the image sensor that brightness 5 corresponding to image frame 5 is 29 nit, and the difference between brightness 5 and brightness 4 is 2 nit, also at or below the preset threshold.
At this point, the electronic device has determined through the image sensor for the 2nd consecutive time that the difference between the second brightness and the first brightness is less than or equal to the preset threshold. If m in the preset condition is 2, the electronic device may therefore determine that the automatic exposure process is completed.
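The convergence test of S1204 can be expressed as a running check: automatic exposure is declared complete once the frame-to-frame brightness difference has stayed at or below the threshold for m consecutive comparisons, with any large jump restarting the count. The class below is a sketch of that logic, replaying the brightness sequence from the fig. 8 example (50, 48, 30, 31, 29 nit) with a 2 nit threshold and m = 2.

```python
# Running convergence check for S1204: m consecutive in-threshold
# frame-to-frame brightness differences signal completed auto-exposure.

class ConvergenceChecker:
    def __init__(self, threshold_nit: float, m: int):
        self.threshold = threshold_nit
        self.m = m
        self.count = 0   # consecutive comparisons at or below the threshold
        self.prev = None # brightness of the previous frame, if any

    def update(self, brightness: float) -> bool:
        """Feed one frame's brightness; return True once converged."""
        if self.prev is not None:
            if abs(brightness - self.prev) <= self.threshold:
                self.count += 1
            else:
                self.count = 0  # a large jump restarts the count
        self.prev = brightness
        return self.count >= self.m

checker = ConvergenceChecker(threshold_nit=2.0, m=2)
results = [checker.update(b) for b in [50, 48, 30, 31, 29]]
# converged only at the final frame: [False, False, False, False, True]
```

The 18 nit jump at frame 3 resets the count, so the condition is met only after frames 4 and 5 both stay within the threshold, matching the worked example above.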
In this way, the electronic device determines whether the brightness of the image frame tends to be stable or not according to whether the difference value between the second brightness and the first brightness is smaller than or equal to a preset threshold value through the image sensor, so as to determine whether the automatic exposure processing process is completed or not. The stability of the automatic exposure process is effectively improved.
S1205, the electronic equipment acquires a second mode corresponding to the first operation, and converts the second exposure parameter into a third exposure parameter corresponding to the second mode through the image sensor.
In some embodiments, after determining that the automatic exposure process is completed, the electronic device may convert the current exposure parameter into an exposure parameter corresponding to the second mode indicated by the current first application program, so that the subsequently acquired image frame may not only satisfy the requirement of the current ambient light brightness but also satisfy the display requirement of the second mode.
In some embodiments, the electronic device may determine the second mode, i.e., the target shooting mode, according to the first operation, and then acquire the mode parameter corresponding to the target shooting mode from the mode parameters stored in advance by the on-chip AE module in the image sensor. The application processor may send a mode setting instruction to the image sensor during the automatic exposure process or after automatic exposure is completed, and the image sensor determines the second mode based on the mode setting instruction.
Thus, the on-chip AE module performs high-speed automatic exposure processing on the image frames acquired by the image sensor so as to improve the speed of automatic exposure. And the on-chip AE module can convert the exposure parameters obtained by the burst mode based on the actual shooting mode to obtain the exposure parameters corresponding to the actual shooting mode, so that the image sensor can directly collect the image frames corresponding to the actual shooting mode to meet the use requirement of the camera in the actual shooting mode.
Optionally, the on-chip AE module in the image sensor stores one or more mode parameters corresponding to one or more preset modes respectively, where the preset modes include a second mode.
The mode parameter may include one or more of resolution, size of a display screen of the electronic device, and frame rate at which the image sensor captures images.
Therefore, the on-chip AE module can directly acquire the corresponding mode parameter based on the second mode, so that the time for the application processor to acquire the mode parameter and the time for the application processor to transmit the mode parameter to the image sensor are saved, and the speed of automatic exposure is improved.
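The patent does not give the formula for converting the second exposure parameter into the third exposure parameter. One plausible sketch, under the assumption that the conversion preserves total exposure (exposure time × gain) while re-fitting the exposure time to the second mode's frame period, is:

```python
# Hypothetical conversion of burst-mode (first-mode) exposure parameters
# into second-mode parameters. Assumes total exposure (time x gain) is
# preserved and exposure time cannot exceed the target frame period;
# the actual conversion rule is not specified in the patent.

def convert_exposure(exp_time_ms: float, gain: float,
                     second_frame_rate: float) -> tuple:
    frame_period_ms = 1000.0 / second_frame_rate
    total = exp_time_ms * gain                # brightness-equivalent product
    new_time = min(total, frame_period_ms)   # cap at the frame period
    new_gain = total / new_time              # remainder absorbed by gain
    return new_time, new_gain

# e.g. 2 ms at gain 8 in a high-speed burst mode maps to 16 ms at gain 1
# in a 30 fps preview mode (frame period ~33.3 ms), keeping total
# exposure constant so the picture brightness is unchanged.
```

Trading gain for exposure time this way would keep the converted frame at the converged brightness while respecting the lower second frame rate noted below.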
In some embodiments, the electronic device acquires and displays the target image frame according to the third exposure parameter. Because the third exposure parameter corresponds to the second mode, and the picture exposure of the image frame correspondingly acquired based on the third exposure parameter is correct, the target image frame can be acquired according to the third exposure parameter and displayed, so that a user can watch a display picture with normal exposure.
Optionally, the brightness of the target image frame is the same as or similar to the second brightness, where "similar" means that the difference between the brightness of the target image frame and the second brightness is less than or equal to a preset threshold, so as to meet the requirement on picture brightness. Since the second exposure parameter and the second brightness are obtained from the (n+1)-th frame image, and the third exposure parameter is converted from the second exposure parameter, the brightness of the target image frame acquired by the image sensor based on the third exposure parameter is the same as or similar to the second brightness. Illustratively, as shown in fig. 8, since image frame 5 is the (n+1)-th frame image and image frame 6 is the target image frame, the brightness of image frame 5 and image frame 6 is the same or similar.
In some examples, the first frame rate corresponding to the first mode is greater than the second frame rate corresponding to the second mode. That is, the processing speed of the on-chip AE module in the burst mode is greater than the processing speed corresponding to the mode setting. Thus, the on-chip AE module can be ensured to perform high-speed automatic exposure processing in the burst mode.
It will be appreciated that any of the first, second, third, or fourth exposure parameters referred to in the above process may be an exposure value and/or a gain value.
In this way, the on-chip AE module is arranged in the image sensor, so that the on-chip AE module can directly perform automatic exposure processing on the image frames acquired by the image sensor, and the time for data transmission between the application processor and the image sensor is saved.
And, the processing speed of the on-chip AE module is greater than that of the application processor, so that the automatic exposure processing can be performed at high speed.
In addition, the on-chip AE module may directly acquire the current ambient light level according to the image frame acquired by the image sensor, and implement the automatic exposure process based on the current ambient light level. The electronic device can be prevented from being additionally provided with an ambient light sensor, so that the volume of the electronic device is reduced, and the electronic device can realize an ultra-narrow frame screen.
The automatic exposure method provided by the embodiment of the application is described in detail above with reference to fig. 5 to 13. The electronic device provided in the embodiment of the application is described in detail below with reference to fig. 14.
In one possible design, fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 14, the electronic device 1400 may include: a processing unit 1401 and a transmitting and receiving unit 1402. The electronic device 1400 may be used to implement the functionality of the electronic device referred to in the method embodiments described above.
Optionally, the processing unit 1401 is configured to support the electronic device 1400 to perform S901-S906 in fig. 9, and/or is further configured to support the electronic device 1400 to perform S1001-S1006 in fig. 10, and/or is further configured to support the electronic device 1400 to perform S1201-S1205 in fig. 12.
Optionally, the transceiver unit 1402 is configured to support the electronic device 1400 to perform S901-S906 in fig. 9, and/or is further configured to support the electronic device 1400 to perform S1001-S1006 in fig. 10, and/or is further configured to support the electronic device 1400 to perform S1201-S1205 in fig. 12.
Optionally, the electronic device 1400 may also include a display unit for supporting the electronic device to display interface content.
The transceiver unit 1402 may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver related circuit component, and may be a transceiver or a transceiver module. The operations and/or functions of the respective units in the electronic device 1400 are respectively for implementing the respective flows of the automatic exposure method in the above method embodiment, and all relevant contents of the steps related to the above method embodiment may be cited in the functional descriptions of the corresponding functional units, which are not repeated herein for brevity.
Optionally, the electronic device 1400 shown in fig. 14 may further include a storage unit (not shown in fig. 14) in which programs or instructions are stored. When the processing unit 1401 and the transceiver unit 1402 execute the program or instructions, the electronic device 1400 shown in fig. 14 is enabled to execute the auto-exposure method in the above-described method embodiment.
The technical effects of the electronic device 1400 shown in fig. 14 may refer to the technical effects of the automatic exposure method in the above-mentioned method embodiment, and will not be described herein.
In addition to the form of the electronic device 1400, the technical solution provided in the embodiments of the present application may also be a functional unit or a chip in the electronic device, or a device used in matching with the electronic device.
Embodiments of the present application also provide a computer-readable storage medium including computer instructions which, when executed on an electronic device as described above, cause the electronic device to perform the functions or steps of the method embodiments described above.
Embodiments of the present application also provide a computer program product comprising a computer program for causing an electronic device to perform the functions or steps of the method embodiments described above when the computer program is run on the electronic device.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (11)

1. An automatic exposure method, characterized by being applied to an electronic device, the method comprising:
in response to a first operation by a user, activating an image sensor;
acquiring an N-th frame image according to a preset first mode through the image sensor, and determining first brightness and first exposure parameters of the N-th frame image, wherein N is a positive integer;
acquiring an (n+1) -th frame image based on the first exposure parameter by the image sensor, and determining a second brightness and a second exposure parameter of the (n+1) -th frame image;
determining that the difference value between the second brightness and the first brightness meets a preset condition;
acquiring a second mode corresponding to the first operation, and converting the second exposure parameter into a third exposure parameter corresponding to the second mode through the image sensor;
acquiring and displaying a target image frame according to the third exposure parameter;
Wherein, the first frame rate corresponding to the first mode is greater than the second frame rate corresponding to the second mode; the preset condition includes that the difference value between the second brightness and the first brightness is less than or equal to a preset threshold value for m consecutive times, and m is a positive integer.
2. The method of claim 1, wherein the brightness of the target image frame is the same as or similar to the second brightness.
3. The method of claim 1, wherein one or more mode parameters corresponding to one or more preset modes respectively are stored in the image sensor, and the preset modes include the second mode.
4. A method according to claim 3, wherein the mode parameters comprise one or more of resolution, size of a display screen of the electronic device, frame rate at which an image sensor captures images.
5. The method of claim 1, wherein the acquiring, by the image sensor, an nth frame image according to a preset first pattern and determining a first brightness and a first exposure parameter of the nth frame image comprises:
acquiring the Nth frame image according to the preset first mode through the image sensor;
Determining, by the image sensor, the first brightness of the nth frame image;
and determining the first exposure parameter corresponding to the first brightness through the image sensor.
6. The method of claim 5, wherein the acquiring, by the image sensor, an (n+1) -th frame image based on the first exposure parameter, comprises:
determining a first adjustment step length corresponding to the first brightness through the image sensor;
determining a fourth exposure parameter corresponding to the first exposure parameter according to the first adjustment step length through the image sensor;
and acquiring the (n+1) th frame image through the fourth exposure parameter according to the first mode by the image sensor.
7. The method according to any of claims 1-6, wherein the first exposure parameter, or the second exposure parameter, or the third exposure parameter is an exposure value and/or a gain value.
8. The method according to any one of claims 1-6, further comprising:
and displaying the (n+1) th frame image.
9. The method of any of claims 1-6, wherein the first operation is one or more of a user operation of an icon of a first application, an operation of instructing to switch the first application to a foreground display, an operation of launching the first application through a second application, the first application being a camera application.
10. An electronic device comprising a display screen, an image sensor, a memory, and one or more processors; the display screen, the image sensor, the memory, and the processor are coupled; the display screen is used for displaying the image generated by the processor, the image sensor is used for acquiring image frames, and the memory is used for storing computer program codes, and the computer program codes comprise computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the method of any of claims 1-9.
11. A computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-9.