CN116095497B - Exposure control method, device and terminal equipment - Google Patents

Exposure control method, device and terminal equipment

Info

Publication number
CN116095497B
Authority
CN
China
Prior art keywords
moment
data
brightness
exposure
assembly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210751140.7A
Other languages
Chinese (zh)
Other versions
CN116095497A (en)
Inventor
许集润
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210751140.7A priority Critical patent/CN116095497B/en
Publication of CN116095497A publication Critical patent/CN116095497A/en
Priority to PCT/CN2023/094365 priority patent/WO2024001579A1/en
Application granted granted Critical
Publication of CN116095497B publication Critical patent/CN116095497B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/71 - Circuitry for evaluating the brightness variation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)

Abstract

The application provides an exposure control method, an exposure control device and a terminal device, and relates to the field of imaging technology. First, in response to a start instruction for the camera assembly, the terminal device acquires brightness data at a first moment collected by an optical sensor, where the brightness data characterizes the ambient brightness of the camera assembly and the first moment is the moment the camera assembly is started under the start instruction. Second, the terminal device acquires brightness data at a second moment collected by the optical sensor, where the second moment is the most recent closing moment of the camera assembly relative to the first moment. The terminal device then determines exposure data for the camera assembly at the first moment from the brightness data at the first moment and the brightness data at the second moment. Finally, the terminal device performs exposure control on the camera assembly according to that exposure data. In this way, the exposure parameters are determined from both the ambient brightness when the camera assembly is turned on and the ambient brightness when it was last turned off, which avoids the screen-flash overexposure caused by a large change in ambient brightness.

Description

Exposure control method, device and terminal equipment
Technical Field
The present application relates to the field of imaging technologies, and in particular, to an exposure control method, an exposure control device, and a terminal device.
Background
With the development of terminal technology, terminal devices have become part of people's work and life. A terminal device is generally equipped with a camera assembly, through which functions such as photographing and video recording can be realized, greatly facilitating people's work and life.
Automatic exposure control (AEC) is an indispensable part of controlling a camera assembly. In the related art, the exposure and the ambient brightness value at the moment the camera assembly is closed are generally stored. When the camera assembly is started again, the stored exposure and ambient brightness value from the last closing moment can be used directly for AEC.
However, because the terminal device moves between different environments during use, directly reusing the exposure and ambient brightness value from the last closing moment for AEC may cause screen-flash overexposure when the camera assembly is opened.
Disclosure of Invention
The application provides an exposure control method, an exposure control device and a terminal device, which are used to solve the prior-art problem of screen-flash overexposure when a camera assembly is started.
In a first aspect, the present application provides an exposure control method applied to a terminal device on which a camera assembly and an optical sensor are disposed. The method includes:
First, in response to a start instruction for the camera assembly, the terminal device acquires brightness data at a first moment collected by the optical sensor, where the brightness data characterizes the ambient brightness of the camera assembly and the first moment is the moment the camera assembly is started under the start instruction. Second, the terminal device acquires brightness data at a second moment collected by the optical sensor, where the second moment is the most recent closing moment of the camera assembly relative to the first moment. The terminal device then determines exposure data for the camera assembly at the first moment from the brightness data at the first moment and the brightness data at the second moment. Finally, the terminal device performs exposure control on the camera assembly according to the exposure data at the first moment.
In the exposure control method provided by the application, the exposure data at the first moment is determined from the brightness data at the first and second moments, so that it can be determined in different ways for different brightness changes, avoiding the screen-flash overexposure that occurs when a large brightness change makes the reused exposure data mismatch the scene.
In one possible embodiment, the terminal device may determine a brightness change value from the brightness data at the first moment and the brightness data at the second moment, where the brightness change value characterizes the degree of change in the ambient brightness of the camera assembly. The terminal device then determines the exposure data of the camera assembly at the first moment according to the brightness change value.
In this way, the terminal device can determine the exposure data at the first moment in different ways for different brightness change values, so that the exposure data at the first moment better matches the ambient brightness.
In one possible embodiment, the brightness change value includes the ratio of the brightness data at the first moment to the brightness data at the second moment.
In one possible implementation, determining the exposure data of the camera assembly at the first moment according to the brightness change value includes: the terminal device determines whether the brightness change value is within a first interval. If it is, the terminal device acquires the exposure data of the camera assembly at the second moment and uses it as the exposure data of the camera assembly at the first moment.
In this way, when the brightness change value is within the first interval the terminal device can conclude that the ambient brightness has not changed much, and the exposure data at the second moment (that is, the last closing moment) can be used directly as the exposure data at the first moment, allowing the terminal device to complete exposure control quickly.
In one possible implementation, after determining whether the brightness change value is within the first interval, the method further includes: if the brightness change value is not within the first interval, the terminal device may determine the exposure data of the camera assembly at the first moment from the brightness data at the first moment alone.
In this way, when the brightness change value is not within the first interval the terminal device can conclude that the ambient brightness has changed significantly. It then recalculates the exposure data of the camera assembly at the first moment using the brightness data at the first moment, ensuring that the exposure data matches the ambient brightness and avoiding screen-flash overexposure.
In one possible implementation, before determining the exposure data of the camera assembly at the first moment according to the brightness change value, the method further includes: the terminal device first determines whether the brightness data at the first moment is valid according to a second interval corresponding to the brightness data. Then, if the brightness data at the first moment is valid, the brightness change value is determined from the brightness data at the first moment and the brightness data at the second moment.
In one possible implementation, after determining whether the brightness data at the first moment is valid according to the second interval, if the brightness data at the first moment is invalid, the terminal device determines the exposure data of the camera assembly at the second moment as the exposure data of the camera assembly at the first moment.
In this way, the second interval screens out clearly mismatched brightness readings. When the brightness data at the first moment is valid, the brightness change between the first and second moments can be evaluated further. When it is invalid, the change in ambient brightness cannot be judged from the brightness data at the first moment, and the exposure data at the second moment is used directly for exposure control at the first moment.
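For illustration only, the decision logic described in the above embodiments can be sketched as follows in Python. The function and parameter names, the compute_exposure callback, and the default interval bounds (which reuse the example values of greater-than-0 and 0.7-1.3 given later in the description) are assumptions made for this sketch, not details mandated by the application.

```python
def select_startup_exposure(lum_first, lum_second, exp_second, compute_exposure,
                            valid_min=0.0, first_interval=(0.7, 1.3)):
    """Pick exposure data for the first moment (camera start-up).

    lum_first: brightness data at the first moment (start moment).
    lum_second, exp_second: brightness and exposure data stored at the
    second moment (the last closing moment).  compute_exposure stands in
    for the mapping from brightness data to exposure data.
    """
    # Second interval check: is the new brightness reading valid at all?
    if lum_first <= valid_min:
        return exp_second  # invalid reading: reuse the stored exposure data

    # First interval check: did the ambient brightness change significantly?
    ratio = lum_first / lum_second if lum_second > valid_min else float("inf")
    if first_interval[0] <= ratio <= first_interval[1]:
        return exp_second  # small change: reuse the close-time exposure data
    # Large change: recompute exposure from the brightness at the first moment.
    return compute_exposure(lum_first)
```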
In one possible implementation, before acquiring the brightness data at the first moment collected by the optical sensor, the method further includes: the terminal device receives a close instruction for the camera assembly, then acquires the brightness data at the second moment collected by the optical sensor and the exposure data of the camera assembly at the second moment, and finally stores both.
In this way, by storing the brightness data and the exposure data at the second moment in advance, the terminal device can retrieve the brightness data at the second moment the next time the camera assembly is started, compare it with the brightness data at the first moment to judge how the ambient brightness has changed, and reuse the exposure data at the second moment directly for exposure control when the change is small.
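As a minimal sketch of this store-on-close step, persisting the state could look like the following; the file name and the plain-text layout are assumptions made for the example, not details specified by the application.

```python
import time

CLOSE_STATE_FILE = "aec_close_state.txt"  # hypothetical preset file

def save_close_state(luminance, exposure, path=CLOSE_STATE_FILE):
    """Persist the brightness and exposure data captured at the close moment."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(f"timestamp={time.time()}\n")  # the second moment (close time)
        f.write(f"luminance={luminance}\n")    # ambient brightness from the optical sensor
        f.write(f"exposure={exposure}\n")      # exposure data of the camera assembly
```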
In one possible embodiment, the optical sensor comprises a multispectral sensor.
In a second aspect, the present application also provides an exposure control apparatus, comprising:
the acquisition module is used for responding to a starting instruction of the camera shooting assembly, acquiring brightness data of a first moment acquired by the optical sensor, wherein the brightness data is used for representing the environment brightness of the camera shooting assembly, and the first moment is the starting moment of the camera shooting assembly under the starting instruction; acquiring brightness data of a second moment acquired by an optical sensor, wherein the second moment is the last closing moment of the camera shooting assembly at the first moment;
the processing module is used for determining exposure data of the image pickup assembly at the first moment according to the brightness data at the first moment and the brightness data at the second moment;
and the control module is used for carrying out automatic exposure control on the image pickup assembly according to the exposure data of the image pickup assembly at the first moment.
In a third aspect, the present application also provides a terminal device, including a processor and a memory, where the memory is configured to store code instructions and the processor is configured to execute the code instructions to cause the terminal device to perform the exposure control method described in the first aspect or any implementation of the first aspect.
In a fourth aspect, the present application also provides a computer-readable storage medium storing instructions that, when executed, cause a computer to perform the exposure control method as described in the first aspect or any implementation of the first aspect.
In a fifth aspect, the application also provides a computer program product comprising a computer program which, when run, causes a computer to perform the exposure control method as described in the first aspect or any implementation of the first aspect.
It should be understood that the second to fifth aspects of the present application correspond to the technical solutions of the first aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
FIG. 1a shows the 1st to 4th frames of images captured after an existing camera assembly is turned on;
FIG. 1b shows the 5th and 6th frames of images captured after an existing camera assembly is turned on;
fig. 1c shows the 7th to 10th frames of images captured after an existing camera assembly is turned on;
fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of an exposure control method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of triggering a start command according to an embodiment of the present application;
FIG. 5 is a schematic diagram of exposure control in different frame images according to an embodiment of the present application;
FIG. 6 is a flowchart illustrating another exposure control method according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an exposure control device according to an embodiment of the present application;
fig. 8 is a schematic hardware structure of a terminal device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", etc. are used in the embodiments to distinguish identical or similar items having substantially the same function and effect. For example, the first value and the second value are merely used to distinguish different values and do not imply an order. Those skilled in the art will appreciate that the words "first", "second", etc. do not limit quantity or execution order, and do not necessarily mean that the items are different.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
Automatic exposure control (AEC) is an indispensable part of controlling a camera assembly. In the related art, the exposure and the ambient brightness value at the moment the camera assembly is closed are generally stored. When the camera assembly is started again, the stored exposure and ambient brightness value from the last closing moment can be used directly for AEC. However, because the terminal device moves between different environments during use, directly reusing the exposure and ambient brightness value from the last closing moment for AEC may cause screen-flash overexposure when the camera assembly is opened.
For example, when photographing switches from a low-brightness environment to a high-brightness environment, or from a high-brightness environment to a low-brightness environment, performing AEC directly with the exposure and ambient brightness value from the last closing moment makes the initial exposure of the AEC control seriously mismatch the actual ambient brightness, so the first several frames captured after the camera assembly is opened exhibit screen-flash overexposure. Fig. 1a shows the 1st to 4th frames captured after an existing camera assembly is turned on, fig. 1b shows the 5th and 6th frames, and fig. 1c shows the 7th to 10th frames. As shown in fig. 1a, because AEC control starts from the exposure of the last closing moment while the ambient brightness has changed greatly, the first 4 frames are severely overexposed and no object can be seen in the images. As shown in fig. 1b, the exposure is adjusted automatically by AEC control; by the 5th and 6th frames the overexposure is alleviated and the outline of the object can be seen. As shown in fig. 1c, the overexposure does not converge until the 10th frame, at which point the object in the image can be seen clearly.
It should be noted that how long it takes for the overexposure to be alleviated and to converge depends on the AEC control algorithm and on how much the ambient brightness has changed. In the example of figs. 1a to 1c, the overexposure is alleviated at the 5th frame and converges at the 10th frame; if the ambient brightness change is smaller, or the AEC control algorithm adjusts the exposure faster, the overexposure might be alleviated at the 3rd frame and converge at the 6th frame. This embodiment does not limit these durations.
In view of this, the present application provides an exposure control method, an exposure control apparatus and a terminal device. The terminal device obtains the ambient brightness at the start moment when the camera assembly is turned on and, based on the ambient brightness at the start moment and the ambient brightness at the previous closing moment, obtains the exposure data of the camera assembly at the start moment. In this way, the change in ambient brightness can be judged from the ambient brightness at the start moment and at the previous closing moment, so that the exposure data used when the camera assembly starts can be determined in different ways for different degrees of brightness change, avoiding the screen-flash overexposure caused by exposure data that no longer matches the scene after a large brightness change.
The following describes an application scenario of the exposure control method according to the present application.
In some embodiments, the exposure control method provided by this embodiment may be applied to image-capture scenarios. When a user opens the camera assembly of the terminal device to capture an image, the exposure data of the camera assembly can be determined by the exposure control method, and exposure control is then performed on the camera assembly based on that exposure data.
It should be noted that the application scenario mentioned above does not limit the present disclosure; the exposure control method provided herein may be applied to any scenario in which exposure control is performed on a camera assembly.
It is understood that the terminal device may be a smart phone, a notebook computer, a tablet computer, or the like, which has a camera assembly and an optical sensor. The embodiment of the application does not limit the specific technology and the specific equipment form adopted by the terminal equipment.
In order to better understand the embodiments of the present application, the structure of the terminal device of the embodiments of the present application is described below. Fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Referring to fig. 2, a terminal device may include multiple subsystems that cooperate to perform, coordinate, or monitor one or more operations or functions of the terminal device. The terminal device includes a display 220, a processor 230, a camera assembly 240, an optical sensor 250, a speaker 260, a wireless communication module 270, and a memory 280.
The wireless communication function of the terminal device may be implemented by an antenna, a wireless communication module 270, a modem processor, a baseband processor, and the like. The antenna is used for transmitting and receiving electromagnetic wave signals. Antennas in the terminal device may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The wireless communication module 270 may provide solutions for wireless communication including wireless local area network (wirelesslocal area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), ultra Wide Band (UWB) connection, etc., applied on a terminal device.
Processor 230 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processor may be a microprocessor, a central processing unit, an application specific integrated circuit, a field programmable gate array, a digital signal processor, an analog circuit, a digital circuit, or a combination of these devices. The processor may be a single-threaded or multi-threaded processor. The processor may be a single core or multi-core processor.
During use, processor 230 may be configured to access a memory that stores instructions. The instructions may be configured to cause the processor to perform, coordinate or monitor one or more operations or functions of the terminal device.
The display 220 may be located behind the input surface or may be integral therewith. The display 220 may be communicatively coupled to the processor 230. Processor 230 may present information to the user using display 220. In many cases, the processor 230 uses the display 220 to present an interface with which a user can interact.
The camera assembly 240 is connected to the processor 230 of the terminal device, and the camera assembly 240 may take images or videos in response to instructions sent by the processor 230.
The optical sensor 250 is provided on the terminal device and is connected to the processor 230 of the terminal device. The optical sensor 250 is used for collecting the ambient brightness around the terminal device and reporting the ambient brightness to the processor 230 in real time, so that the processor 230 performs exposure control on the camera module based on the ambient brightness.
The speaker 260, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The terminal device may listen to music through the speaker 260 or emit ultrasonic waves. In the embodiment of the present application, a speaker 260 may be disposed at both sides of the terminal device, respectively, so that the terminal device determines the direction of the discovered device with respect to the terminal device.
Memory 280 may be used to store computer executable program code that includes instructions. For example, the memory may be implemented as random access memory, read only memory, flash memory, removable memory, other types of storage elements, or a combination of such devices. Memory 280 may include a stored program area and a stored data area. Memory 280 may store data in a memory data area. For example, the memory 280 stores information such as identity information of a device that establishes a short-range communication connection and a connection time.
It is understood that the above-mentioned exposure control method may be implemented by the exposure control apparatus provided in the embodiments of the present disclosure, and the exposure control apparatus may be part or all of a certain device, for example the above-mentioned terminal device. The following describes the technical solution of the present application in detail, taking the terminal device as an example and referring to specific embodiments, and explains how it solves the above technical problem. The following embodiments may be implemented independently or combined with each other, and the same or similar concepts or processes may not be described again in some embodiments.
Fig. 3 is a flowchart of an exposure control method according to an embodiment of the present application. As shown in fig. 3, the exposure control method provided in this embodiment is applied to a terminal device, on which an image capturing component and an optical sensor are disposed, and includes:
Step 301: in response to a start instruction for the camera assembly, the terminal device acquires brightness data at a first moment collected by the optical sensor, where the brightness data characterizes the ambient brightness of the camera assembly and the first moment is the moment the camera assembly is started under the start instruction.
In the present application, when a user needs to use the camera function of the terminal device, a start instruction can be issued to the camera assembly. In response to the start instruction, the terminal device starts the camera assembly and, at the same time, acquires the brightness data at the first moment collected by the optical sensor, that is, the brightness data at the moment the camera assembly is started.
The embodiments of the present application do not limit the model of the camera assembly; it can be chosen according to the actual situation.
It should be understood that an optical sensor is a sensor that measures brightness based on optical principles. In the present application, an optical sensor is provided on the terminal device so that the ambient brightness around the camera assembly can be detected in real time, helping the processor determine exposure data from the measured brightness data. The embodiments of the present application do not limit the type of the optical sensor; it may, for example, be a multispectral sensor.
It should also be understood that the embodiments of the present application do not limit how the start instruction for the camera assembly is triggered. In some embodiments, the start instruction may be triggered while the user turns on the camera function. In other embodiments, within some applications, the user may jump to the camera interface by tapping a control of the photographing function, and the terminal device triggers the start instruction for the camera assembly during the jump to the camera interface.
Fig. 4 is a schematic diagram illustrating triggering of a start instruction according to an embodiment of the present application. As shown in fig. 4, the user may turn on the camera function and jump the interface to the camera interface by clicking on the camera button. In the process of jumping to the camera interface, the terminal equipment can trigger a starting instruction of the camera shooting assembly to instruct the camera shooting assembly to be started.
In some embodiments, the brightness data at the first moment may be the brightness data collected by the optical sensor at the moment the camera assembly is started under the start instruction. It can be understood that when an optical sensor is provided on the terminal device, an automatic exposure control (AEC) statistics task can be created accordingly. The optical sensor can then register a service with a Non-Camera Service (NCS Service), and the brightness data it collects is stored in the NCS Service in real time. Meanwhile, the NCS Service can also obtain, in real time, the brightness data collected by the light sensor under AEC, so that the brightness data at a given moment is available when exposure control is performed.
It should be noted that the stored brightness data may carry time information (for example, a timestamp), through which the terminal device can quickly locate the brightness data at any given moment.
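The service and registration details above are platform-specific. As a hedged, platform-neutral illustration, timestamped readings could be cached and queried like this; the class and method names are invented for the example and are not part of the patent.

```python
from bisect import bisect_right

class LuminanceCache:
    """Caches timestamped brightness readings reported by the optical sensor."""

    def __init__(self):
        self._timestamps = []  # assumed monotonically increasing report times
        self._values = []

    def report(self, timestamp, luminance):
        """Store a new reading as it is reported by the sensor."""
        self._timestamps.append(timestamp)
        self._values.append(luminance)

    def at(self, moment):
        """Return the latest brightness reading at or before the given moment."""
        i = bisect_right(self._timestamps, moment)
        return self._values[i - 1] if i else None
```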
Step 302: the terminal device acquires brightness data at a second moment collected by the optical sensor, where the second moment is the most recent closing moment of the camera assembly relative to the first moment.
In the present application, when the terminal device starts the camera assembly, it obtains not only the brightness data at the first moment (i.e., at the current start moment of the camera assembly) but also the brightness data at the second moment (i.e., at the previous closing moment of the camera assembly).
It should be understood that the embodiments of the present application do not limit how the brightness data at the second moment is obtained; for example, the terminal device may read it from a preset file or from a preset storage location.
In some embodiments, the brightness data at the second moment may be stored in a preset file, and when the terminal device starts the camera assembly again it reads the brightness data at the second moment from that file in order to perform exposure control. The preset file may, for example, be a text (txt) file to facilitate reading and writing the brightness data at the second moment.
In other embodiments, the brightness data at the second moment may be stored at a preset storage location. When the terminal device starts the camera assembly again, it obtains the brightness data at the second moment from that location based on the path information of the preset storage location.
Correspondingly, after the terminal device receives the close instruction for the camera assembly at the second moment, it can obtain in real time the brightness data of the optical sensor at the second moment and the exposure data of the camera assembly at the second moment, and then store both. By storing the brightness data and the exposure data at the second moment, they can be retrieved quickly when the camera assembly is turned on again, to assist in exposure control.
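Continuing the illustrative text-file format sketched earlier, reading the stored state back at start-up might look like the following; the file name and field names remain assumptions made for the example.

```python
def load_close_state(path="aec_close_state.txt"):
    """Read back the brightness and exposure data stored at the last close moment.

    Returns (luminance, exposure), or None if no state has been stored yet.
    """
    try:
        with open(path, "r", encoding="utf-8") as f:
            fields = dict(line.strip().split("=", 1) for line in f if "=" in line)
        return float(fields["luminance"]), float(fields["exposure"])
    except (OSError, KeyError, ValueError):
        return None
```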
Step 303, the terminal device determines exposure data of the image capturing component at the first moment according to the brightness data at the first moment and the brightness data at the second moment.
In the present application, after the terminal device has acquired the brightness data at the first moment and the brightness data at the second moment, it can determine the exposure data of the camera assembly at the first moment based on those two values.
It should be understood that the embodiments of the present application do not limit how the exposure data at the first moment is determined. In some embodiments, the terminal device may first determine a brightness change value from the brightness data at the first moment and the brightness data at the second moment, where the brightness change value characterizes the degree of change in the ambient brightness of the camera assembly. The terminal device may then determine the exposure data of the camera assembly at the first moment based on the brightness change value.
Screen-flash overexposure is caused by an excessive change in ambient brightness. By determining the brightness change value and then determining the exposure data at the first moment based on it, the exposure data at the second moment is not reused directly when the ambient brightness has changed too much, so screen-flash overexposure can be avoided.
In some embodiments, the brightness change value includes the ratio of the brightness data at the first moment to the brightness data at the second moment, or alternatively the difference between the brightness data at the first moment and the brightness data at the second moment.
For example, after determining the brightness change value from the brightness data at the first and second moments, the terminal device may check whether the brightness change value is within the first interval. If it is, it can be concluded that the ambient brightness did not change much between the first and second moments; accordingly, the terminal device obtains the exposure data of the camera assembly at the second moment and uses it as the exposure data of the camera assembly at the first moment.
If the brightness change value is not within the first interval, it can be concluded that the ambient brightness changed significantly between the first and second moments. In this case, directly using the exposure data at the second moment as the exposure data at the first moment could cause screen-flash overexposure, so the terminal device re-determines the exposure data of the camera assembly at the first moment from the brightness data at the first moment.
It should be noted that the first interval measures the degree of change in ambient brightness and can be set according to the actual situation. For example, if the brightness change value is the ratio of the brightness data at the first moment to the brightness data at the second moment, the first interval may be set to 0.7-1.3.
It should be understood that the embodiments of the present application do not limit how the exposure data of the camera assembly at the first moment is determined from the brightness data at the first moment. In some embodiments, a mapping relationship between brightness data and exposure data may be preset, and the exposure data at the first moment is looked up from this mapping using the brightness data at the first moment.
In other embodiments, the terminal device may preset a conversion formula between brightness data and exposure data, and the exposure data at the first moment is obtained by feeding the brightness data at the first moment into the formula.
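The application does not give the mapping or conversion formula itself; the lookup table below is invented purely to make the idea concrete, as a monotone brightness-to-exposure-time table with linear interpolation.

```python
# Hypothetical (ambient brightness in lux, exposure time in seconds) pairs.
LUM_TO_EXPOSURE = [(1.0, 1 / 8), (10.0, 1 / 30), (100.0, 1 / 125),
                   (1000.0, 1 / 500), (10000.0, 1 / 2000)]

def exposure_from_luminance(lum, table=LUM_TO_EXPOSURE):
    """Interpolate exposure data from the brightness data at the first moment."""
    if lum <= table[0][0]:
        return table[0][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if lum <= x1:
            t = (lum - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return table[-1][1]
```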
In the embodiments of the present application, when determining the exposure data at the first moment, the terminal device may not only check whether the brightness change value is within the first interval, but also check whether the brightness data at the first moment is valid, to ensure that the data collected by the brightness sensor is correct.
In some embodiments, the terminal device may determine whether the brightness data at the first moment is valid according to the second interval corresponding to the brightness data. If it is valid, the brightness change value is determined from the brightness data at the first and second moments, and the way the exposure data at the first moment is determined then depends on the brightness change value. If it is invalid, the terminal device can directly use the exposure data of the camera assembly at the second moment as the exposure data at the first moment.
It should be noted that the second interval measures whether the brightness data is valid and can be set according to the actual situation. When the brightness data falls within the second interval it is considered valid, and otherwise invalid. For example, the second interval may be set to values greater than 0.
Step 304: the terminal device performs exposure control on the camera assembly according to the exposure data of the camera assembly at the first moment.
In the present application, once the terminal device has determined the exposure data at the first moment, it performs exposure control on the camera assembly based on that data.
In some embodiments, the exposure data at the first moment serves as the start-up exposure data of the camera assembly; thereafter, the terminal device obtains the ambient brightness in real time via the optical sensor and performs automatic exposure control in real time through AEC.
In the exposure control method provided by this embodiment of the application, the exposure data of the camera assembly at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, so that the exposure data at the first moment can be determined in different ways for different brightness changes. This avoids the screen-flash overexposure caused by exposure data that no longer matches the scene after a large brightness change, and improves the user experience.
The operation of the exposure control method of this embodiment at each moment is described below. Fig. 5 is a schematic diagram of exposure control over different frames according to an embodiment of the present application. As shown in fig. 5, the camera assembly is started at frame 0; frames -4 to -1 are before the camera assembly is started, and frames 1 to 6 are after it is started.
If the terminal device receives the start instruction for the camera assembly at frame -1, it starts the camera assembly at frame 0 and captures images or video in frames 1 to 6. Correspondingly, in frames -4 to -2, although the camera assembly is closed, the optical sensor still collects brightness data and reports it to the AEC statistics. When the terminal device receives the start instruction at frame -1, it starts the camera assembly and, at the same time, determines in frame -1 the exposure data to be used when the camera assembly starts (the start-up exposure value).
For example, the terminal device may compare the brightness data at the moment the camera assembly is turned on (the brightness data collected in frame -1, or in frames -4 to -2) with the brightness data stored when the camera assembly was previously closed, to determine the brightness change value. If the brightness change is too large, the start-up exposure value is determined from the brightness data at the current start; if the change is small, the exposure data stored at the previous close is used directly as the start-up exposure value.
With continued reference to fig. 5, this start-up exposure data is used for exposure control of the first few frames (e.g., the first 3 frames). After that, an exposure value is calculated for each frame by the AEC algorithm and used to adjust a subsequent frame. For example, the exposure value of frame 1 may be used to adjust frame 4, the exposure value of frame 2 to adjust frame 5, and the exposure value of frame 3 to adjust frame 6.
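A small sketch of the frame timing just described, assuming a fixed three-frame pipeline delay as in the example of fig. 5; the aec_update callback stands in for the AEC algorithm and is not specified by the application.

```python
def startup_frame_exposures(startup_exposure, frame_stats, aec_update, pipeline_delay=3):
    """Yield (frame_index, exposure) pairs for the first frames after start-up.

    frame_stats[i] is the brightness statistic measured on frame i + 1;
    frames 1..pipeline_delay use the start-up exposure, and frame
    n + pipeline_delay uses the AEC result computed on frame n.
    """
    computed = {}
    for n, stat in enumerate(frame_stats, start=1):
        if n <= pipeline_delay:
            exposure = startup_exposure              # e.g. frames 1-3 in fig. 5
        else:
            exposure = computed[n - pipeline_delay]  # result of frame n-3
        yield n, exposure
        computed[n] = aec_update(stat, exposure)     # takes effect at frame n+3
```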
How the exposure data of the camera assembly at the first moment is determined is described in detail below.
Fig. 6 is a flowchart of another exposure control method according to an embodiment of the present application. As shown in fig. 6, the exposure control method provided in the present embodiment is applied to a terminal device, on which an image capturing component and an optical sensor are provided, and includes:
step 601, acquiring brightness data of a first moment acquired by an optical sensor.
The brightness data characterizes the ambient brightness of the camera assembly, and the first moment is the start moment of the camera assembly under the start instruction.
Step 602, acquiring brightness data of a second moment and exposure data of the second moment, which are acquired by the optical sensor.
The second moment is the most recent closing moment of the camera assembly relative to the first moment.
Step 603, determining whether the brightness data at the first moment is valid according to the second interval corresponding to the brightness data.
If yes, go to step 605, if no, go to step 604.
Step 604, determining exposure data of the image capturing assembly at the second time as exposure data of the image capturing assembly at the first time.
After step 604, step 608 is performed.
Step 605, determining a brightness change value according to the brightness data at the first time and the brightness data at the second time.
The brightness change value characterizes the degree of change in the ambient brightness of the camera assembly and includes the ratio of the brightness data at the first moment to the brightness data at the second moment.
Step 606, it is determined whether the brightness variation value is within the first interval.
If yes, go to step 604, if no, go to step 607.
Step 607, determining exposure data of the image capturing component at the first time according to the brightness data at the first time.
Step 608, performing exposure control on the camera assembly according to the exposure data of the camera assembly at the first moment.
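For illustration, steps 601 to 608 could be wired together as below. The callbacks and interval bounds are placeholders (the bounds reuse the example values of 0.7-1.3 and greater-than-0 mentioned earlier); this is a sketch of the flow of fig. 6, not a definitive implementation.

```python
def startup_exposure_control(read_luminance_now, load_close_state,
                             compute_exposure, apply_exposure,
                             valid_min=0.0, first_interval=(0.7, 1.3)):
    """One possible wiring of the flow of fig. 6."""
    lum_first = read_luminance_now()                    # step 601
    lum_second, exp_second = load_close_state()         # step 602
    if lum_first <= valid_min:                          # step 603: invalid data
        exposure = exp_second                           # step 604
    else:
        ratio = (lum_first / lum_second                 # step 605
                 if lum_second > valid_min else float("inf"))
        if first_interval[0] <= ratio <= first_interval[1]:  # step 606
            exposure = exp_second                       # step 604
        else:
            exposure = compute_exposure(lum_first)      # step 607
    apply_exposure(exposure)                            # step 608
    return exposure
```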
In the exposure control method provided by this embodiment of the application, the exposure data of the camera assembly at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, so that the exposure data at the first moment can be determined in different ways for different brightness changes. This avoids the screen-flash overexposure caused by exposure data that no longer matches the scene after a large brightness change, and improves the user experience.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware associated with program instructions, where the foregoing program may be stored in a computer readable storage medium, and when executed, the program performs steps including the above method embodiments; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Fig. 7 is a schematic structural diagram of an exposure control device according to an embodiment of the present application. The exposure control apparatus may be implemented by software, hardware, or a combination of both, and may be, for example, a terminal device or a processor of the terminal device in the above embodiment to perform the exposure control method in the above embodiment. As shown in fig. 7, the exposure control apparatus 700 includes: an acquisition module 701, a processing module 702 and a control module 703.
The acquiring module 701 is configured to respond to a start instruction of the camera module, and acquire luminance data of a first moment acquired by the optical sensor, where the luminance data is used to characterize ambient luminance of the camera module, and the first moment is a start moment of the camera module under the start instruction; acquiring brightness data of a second moment acquired by an optical sensor, wherein the second moment is the last closing moment of the camera shooting assembly at the first moment;
a processing module 702, configured to determine exposure data of the image capturing component at the first time according to the luminance data at the first time and the luminance data at the second time;
the control module 703 is configured to perform automatic exposure control on the image capturing assembly according to exposure data of the image capturing assembly at the first moment.
In an alternative embodiment, the processing module 702 is specifically configured to determine, according to the luminance data at the first time and the luminance data at the second time, a luminance change value, where the luminance change value is used to characterize a degree of change of the ambient luminance of the image capturing component; and determining exposure data of the image pickup assembly at the first moment according to the brightness change value.
In an alternative embodiment, the luminance variation value comprises a ratio of luminance data at the first time instant and luminance data at the second time instant.
In an alternative embodiment, the processing module 702 is specifically configured to determine whether the brightness variation value is within the first interval; if the brightness change value is in the first interval, acquiring exposure data of the image pickup assembly at a second moment; and determining the exposure data of the image pickup assembly at the second moment as the exposure data of the image pickup assembly at the first moment.
In an alternative embodiment, the processing module 702 is specifically configured to determine, based on the luminance data at the first time, exposure data of the image capturing assembly at the first time if the luminance change value is not within the first interval.
In an optional implementation manner, the processing module 702 is further configured to determine, according to the second interval corresponding to the luminance data, whether the luminance data at the first moment is valid; if the brightness data at the first time is valid, a brightness change value is determined according to the brightness data at the first time and the brightness data at the second time.
In an alternative embodiment, the processing module 702 is further configured to determine, as the exposure data of the image capturing assembly at the first time, the exposure data of the image capturing assembly at the second time if the brightness data at the first time is invalid.
In an alternative embodiment, the obtaining module 701 is further configured to receive a close instruction of the camera module; acquiring brightness data of a second moment acquired by an optical sensor and exposure data of an image pickup assembly at the second moment;
the processing module 702 is further configured to store luminance data at a second time and exposure data at the second time.
In an alternative embodiment, the optical sensor comprises a multispectral sensor.
It should be noted that, the exposure control apparatus provided in the embodiment shown in fig. 7 may be used to execute the method provided in any of the above embodiments, and the specific implementation manner and technical effects are similar, and are not repeated here.
Fig. 8 is a schematic hardware structure of a terminal device according to an embodiment of the present application, as shown in fig. 8, where the terminal device includes a processor 801, a communication line 804 and at least one communication interface (illustrated in fig. 8 by taking the communication interface 802 as an example).
The processor 801 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
Communication line 804 may include circuitry to communicate information between the components described above.
The communication interface 802, uses any transceiver-like device for communicating with other devices or communication networks, such as ethernet, wireless local area network (wireless local area networks, WLAN), etc.
Possibly, the terminal device may further comprise a memory 803.
The memory 803 may be, but is not limited to, a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via the communication line 804, or may be integrated with the processor.
The memory 803 is used for storing computer-executable instructions for performing the aspects of the present application, and is controlled by the processor 801 for execution. The processor 801 is configured to execute computer-executable instructions stored in the memory 803, thereby implementing the exposure control method provided by the embodiment of the present application.
Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application program codes, which are not limited in particular.
In a particular implementation, the processor 801 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 8, as an embodiment.
In a specific implementation, as an embodiment, the terminal device may include multiple processors, such as processor 801 and processor 805 in fig. 8. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Fig. 9 is a schematic structural diagram of a chip according to an embodiment of the present application. Chip 90 includes one or more (including two) processors 910 and a communication interface 930.
In some implementations, memory 940 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
In an embodiment of the application, memory 940 may include read only memory and random access memory, and provides instructions and data to processor 910. A portion of memory 940 may also include non-volatile random access memory (NVRAM).
In an embodiment of the application, the processor 910, the communication interface 930, and the memory 940 are coupled together by a bus system 920. The bus system 920 may include a power bus, a control bus, a status signal bus, and the like in addition to a data bus. For ease of description, the various buses are labeled as the bus system 920 in fig. 9.
The methods described in the embodiments of the present application may be applied to the processor 910 or implemented by the processor 910. The processor 910 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the methods described above may be performed by integrated logic circuitry in hardware or instructions in software in processor 910. The processor 910 described above may be a general purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (digital signal processing, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), an off-the-shelf programmable gate array (field-programmable gate array, FPGA) or other programmable logic device, discrete gates, transistor logic, or discrete hardware components, and the processor 910 may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the application.
The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 940, and the processor 910 reads the information in the memory 940 and performs the steps of the method in combination with its hardware.
In the above embodiments, the instructions stored in the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written into the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, or microwave). The computer-readable storage medium may also be a semiconductor medium (e.g., a solid-state disk (SSD)).
The embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another. The storage medium may be any available medium accessible by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), a RAM, a ROM, an EEPROM, or other optical disk storage; the computer-readable medium may also include magnetic disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, a fiber-optic cable, a twisted pair, a DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing descriptions are merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or substitution readily figured out by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An exposure control method, characterized in that the method is applied to a terminal device on which a camera assembly and an optical sensor are provided, and the method comprises:
in response to a start instruction for the camera assembly, acquiring brightness data at a first moment collected by the optical sensor, wherein the brightness data is used for representing the ambient brightness of the camera assembly, and the first moment is the start moment of the camera assembly under the start instruction;
acquiring brightness data at a second moment collected by the optical sensor, wherein the second moment is the moment at which the camera assembly was last closed prior to the first moment;
determining exposure data of the camera assembly at the first moment according to the brightness data at the first moment and the brightness data at the second moment; and
performing exposure control on the camera assembly according to the exposure data of the camera assembly at the first moment;
wherein the determining exposure data of the camera assembly at the first moment according to the brightness data at the first moment and the brightness data at the second moment comprises:
determining a brightness change value according to the brightness data at the first moment and the brightness data at the second moment, wherein the brightness change value is used for representing the degree of change of the ambient brightness of the camera assembly; and
determining exposure data of the camera assembly at the first moment according to the brightness change value;
and the determining exposure data of the camera assembly at the first moment according to the brightness change value comprises:
determining whether the brightness change value is within a first interval;
if the brightness change value is within the first interval, acquiring exposure data of the camera assembly at the second moment; and
determining the exposure data of the camera assembly at the second moment as the exposure data of the camera assembly at the first moment.
2. The method of claim 1, wherein the brightness change value comprises a ratio of the brightness data at the first moment to the brightness data at the second moment.
3. The method of claim 1, wherein after the determining whether the brightness change value is within the first interval, the method further comprises:
if the brightness change value is not within the first interval, determining exposure data of the camera assembly at the first moment according to the brightness data at the first moment.
4. The method of claim 1, wherein before the determining exposure data of the camera assembly at the first moment according to the brightness change value, the method further comprises:
determining whether the brightness data at the first moment is valid according to a second interval corresponding to the brightness data;
and the determining exposure data of the camera assembly at the first moment according to the brightness change value comprises:
if the brightness data at the first moment is valid, determining the brightness change value according to the brightness data at the first moment and the brightness data at the second moment.
5. The method of claim 4, wherein after the determining whether the brightness data at the first moment is valid according to the second interval corresponding to the brightness data, the method further comprises:
if the brightness data at the first moment is invalid, determining the exposure data of the camera assembly at the second moment as the exposure data of the camera assembly at the first moment.
6. The method of any one of claims 1-5, wherein before the acquiring brightness data at the first moment collected by the optical sensor, the method further comprises:
receiving a closing instruction for the camera assembly;
acquiring brightness data at the second moment collected by the optical sensor and exposure data of the camera assembly at the second moment; and
storing the brightness data at the second moment and the exposure data at the second moment.
7. The method of any one of claims 1-5, wherein the optical sensor comprises a multispectral sensor.
8. An exposure control apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to: in response to a start instruction for the camera assembly, acquire brightness data at a first moment collected by the optical sensor, wherein the brightness data is used for representing the ambient brightness of the camera assembly, and the first moment is the start moment of the camera assembly under the start instruction; and acquire brightness data at a second moment collected by the optical sensor, wherein the second moment is the moment at which the camera assembly was last closed prior to the first moment;
a processing module, configured to determine exposure data of the camera assembly at the first moment according to the brightness data at the first moment and the brightness data at the second moment; and
a control module, configured to automatically perform exposure control on the camera assembly according to the exposure data of the camera assembly at the first moment;
wherein the processing module is specifically configured to:
determine a brightness change value according to the brightness data at the first moment and the brightness data at the second moment, wherein the brightness change value is used for representing the degree of change of the ambient brightness of the camera assembly; and
determine exposure data of the camera assembly at the first moment according to the brightness change value;
and the processing module is specifically configured to:
determine whether the brightness change value is within a first interval;
if the brightness change value is within the first interval, acquire exposure data of the camera assembly at the second moment; and
determine the exposure data of the camera assembly at the second moment as the exposure data of the camera assembly at the first moment.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, causes the terminal device to perform the method according to any of claims 1 to 7.
10. A computer readable storage medium storing a computer program, which when executed by a processor causes a computer to perform the method of any one of claims 1 to 7.
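For illustration only, the exposure control flow recited in claims 1 to 5 can be sketched in Python as follows. This is a minimal sketch under stated assumptions, not the patented implementation: the interval bounds, the lux-to-exposure mapping, and every name introduced below (SavedState, exposure_from_lux, exposure_at_open, RATIO_LOW, RATIO_HIGH, LUX_MIN, LUX_MAX) are hypothetical, since the claims leave the concrete values and interfaces unspecified.

    from dataclasses import dataclass

    # Hypothetical placeholder thresholds; the claims do not specify either interval.
    RATIO_LOW, RATIO_HIGH = 0.8, 1.25   # "first interval" for the brightness change ratio
    LUX_MIN, LUX_MAX = 1.0, 100_000.0   # "second interval" used to judge whether a sample is valid

    @dataclass
    class SavedState:
        lux_at_close: float        # brightness data stored at the second moment (last closing moment)
        exposure_at_close: float   # exposure data stored at the second moment

    def exposure_from_lux(lux: float) -> float:
        # Placeholder lux-to-exposure mapping (exposure time in seconds); a real
        # auto-exposure pipeline would use tuned tables rather than this formula.
        return max(1e-4, min(0.1, 10.0 / lux))

    def exposure_at_open(lux_now: float, saved: SavedState) -> float:
        # Decide the exposure to apply at the first moment (camera open).
        if not (LUX_MIN <= lux_now <= LUX_MAX):
            # Fresh sample judged invalid: fall back to the exposure saved at the second moment.
            return saved.exposure_at_close
        ratio = lux_now / saved.lux_at_close          # brightness change value expressed as a ratio
        if RATIO_LOW <= ratio <= RATIO_HIGH:
            # Change within the first interval: reuse the exposure saved at the second moment.
            return saved.exposure_at_close
        # Change outside the first interval: derive exposure from the current brightness.
        return exposure_from_lux(lux_now)

    # Example: ambient brightness barely changed since the last close, so the
    # saved exposure is reused instead of being recomputed from the new sample.
    saved = SavedState(lux_at_close=480.0, exposure_at_close=0.02)
    print(exposure_at_open(lux_now=500.0, saved=saved))   # prints 0.02

Persisting lux_at_close and exposure_at_close when the closing instruction is received (claim 6) is what makes the reuse path possible; without that stored state, every start would have to derive exposure from the fresh sample alone.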
CN202210751140.7A 2022-06-29 2022-06-29 Exposure control method, device and terminal equipment Active CN116095497B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210751140.7A CN116095497B (en) 2022-06-29 2022-06-29 Exposure control method, device and terminal equipment
PCT/CN2023/094365 WO2024001579A1 (en) 2022-06-29 2023-05-15 Exposure control method and apparatus, and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210751140.7A CN116095497B (en) 2022-06-29 2022-06-29 Exposure control method, device and terminal equipment

Publications (2)

Publication Number Publication Date
CN116095497A CN116095497A (en) 2023-05-09
CN116095497B true CN116095497B (en) 2023-10-20

Family

ID=86205154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210751140.7A Active CN116095497B (en) 2022-06-29 2022-06-29 Exposure control method, device and terminal equipment

Country Status (2)

Country Link
CN (1) CN116095497B (en)
WO (1) WO2024001579A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116095497B (en) * 2022-06-29 2023-10-20 荣耀终端有限公司 Exposure control method, device and terminal equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611037A (en) * 2015-07-22 2016-05-25 宇龙计算机通信科技(深圳)有限公司 Terminal control method, terminal control device and terminal
CN110351490B (en) * 2018-04-04 2021-06-08 杭州海康威视数字技术股份有限公司 Exposure method, exposure device and image pickup equipment
JP7234015B2 (en) * 2019-04-05 2023-03-07 キヤノン株式会社 Imaging device and its control method
CN110519526B (en) * 2019-09-09 2021-02-26 Oppo广东移动通信有限公司 Exposure time control method and device, storage medium and electronic equipment
CN113556477B (en) * 2021-09-23 2021-12-14 南昌龙旗信息技术有限公司 Ambient brightness determination method and device, medium and camera
CN116095497B (en) * 2022-06-29 2023-10-20 荣耀终端有限公司 Exposure control method, device and terminal equipment

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1158353B1 (en) * 2000-05-12 2006-10-04 Sanyo Electric Co., Ltd. Camera with automatic exposure control
CN105744178A (en) * 2016-04-15 2016-07-06 惠州Tcl移动通信有限公司 Method and system for controlling brightness when starting camera, and camera terminal
CN109474790A (en) * 2018-11-05 2019-03-15 浙江大华技术股份有限公司 Exposure adjustment method, device and video camera and computer storage medium
CN109729279A (en) * 2018-12-20 2019-05-07 华为技术有限公司 A kind of image capturing method and terminal device
CN110636230A (en) * 2019-10-31 2019-12-31 Oppo广东移动通信有限公司 Exposure adjusting method, device, equipment and storage medium
WO2022067496A1 (en) * 2020-09-29 2022-04-07 深圳市大疆创新科技有限公司 Fast auto-exposure method for camera, and storage medium
CN114339060A (en) * 2020-09-30 2022-04-12 宇龙计算机通信科技(深圳)有限公司 Exposure adjusting method and device, storage medium and electronic equipment
CN112153305A (en) * 2020-10-22 2020-12-29 努比亚技术有限公司 Camera starting method, mobile terminal and computer storage medium
CN112738493A (en) * 2020-12-28 2021-04-30 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, electronic device, and readable storage medium
CN113364993A (en) * 2021-07-23 2021-09-07 北京字节跳动网络技术有限公司 Exposure parameter value processing method and device and electronic equipment
CN113810601A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Terminal image processing method and device and terminal equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Discussion on artifacts in computed radiography and their solutions; Zou Caisheng; Chinese Journal of New Clinical Medicine (Issue 07); full text *

Also Published As

Publication number Publication date
WO2024001579A1 (en) 2024-01-04
WO2024001579A9 (en) 2024-03-21
CN116095497A (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN109547701B (en) Image shooting method and device, storage medium and electronic equipment
CN110060213B (en) Image processing method, image processing device, storage medium and electronic equipment
CN111526314B (en) Video shooting method and electronic equipment
CN109167931B (en) Image processing method, device, storage medium and mobile terminal
KR100810310B1 (en) Device and method for reconstructing picture having illumination difference
KR20150099302A (en) Electronic device and control method of the same
CN113538273B (en) Image processing method and image processing apparatus
CN110493538A (en) Image processing method, device, storage medium and electronic equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN109040523B (en) Artifact eliminating method and device, storage medium and terminal
CN110264473B (en) Image processing method and device based on multi-frame image and electronic equipment
CN115526787B (en) Video processing method and device
CN115086567B (en) Time delay photographing method and device
CN110581956A (en) Image processing method, image processing device, storage medium and electronic equipment
CN113810604B (en) Document shooting method, electronic device and storage medium
CN116095497B (en) Exposure control method, device and terminal equipment
CN114727101B (en) Antenna power adjusting method and electronic equipment
CN111563466A (en) Face detection method and related product
CN112087569B (en) Camera and camera starting method and device
WO2016123415A1 (en) Automatic processing of automatic image capture parameter adjustment
CN113572948B (en) Video processing method and video processing device
CN106483906A (en) A kind of GPS automatic correcting time Control management system based on WIFI
CN115686182B (en) Processing method of augmented reality video and electronic equipment
CN117479008B (en) Video processing method, electronic equipment and chip system
EP3984210B1 (en) Emulating light sensitivity of a target background

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant