WO2024001579A1 - Exposure control method, apparatus and terminal device - Google Patents

Exposure control method, apparatus and terminal device

Info

Publication number
WO2024001579A1
Authority
WO
WIPO (PCT)
Prior art keywords
moment
brightness
data
camera component
exposure
Prior art date
Application number
PCT/CN2023/094365
Other languages
English (en)
French (fr)
Other versions
WO2024001579A9 (zh)
Inventor
许集润
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2024001579A1
Publication of WO2024001579A9

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • the present application relates to the field of imaging technology, and in particular, to an exposure control method, device and terminal equipment.
  • Terminal equipment has become a part of people's work and life.
  • Terminal equipment is usually equipped with a camera component. Through the camera component on the terminal device, functions such as photography and video recording can be realized, which greatly facilitates people's work and life.
  • Automatic exposure control (AEC) is an essential part of controlling a camera component.
  • In the related art, the exposure amount and the ambient brightness value at the moment the camera component is turned off are usually stored.
  • When the camera component is turned on again, the exposure amount and ambient brightness value from the last shutdown can be used directly for AEC.
  • This application provides an exposure control method, an exposure control device and a terminal device, to solve the prior-art problem of screen-flash overexposure when a camera component is turned on.
  • In a first aspect, this application provides an exposure control method, which is applied to a terminal device.
  • the terminal device is provided with a camera component and an optical sensor.
  • the exposure control method provided by this application includes:
  • the terminal device first responds to the startup instruction of the camera component and obtains the brightness data collected by the optical sensor at the first moment.
  • the brightness data is used to characterize the ambient brightness of the camera component.
  • the first moment is the startup time of the camera component under the startup instruction.
  • the terminal device obtains the brightness data collected by the optical sensor at the second moment.
  • The second moment is the moment at which the camera component was last turned off before the first moment.
  • Then, the terminal device determines the exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment.
  • the terminal device controls the exposure of the camera component based on the exposure data of the camera component at the first moment.
  • With the exposure control method provided by this application, because the exposure data of the camera component at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, the exposure data at the first moment can be determined in different ways for different brightness changes, which avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly.
  • the terminal device can determine a brightness change value based on the brightness data at the first moment and the brightness data at the second moment.
  • the brightness change value is used to characterize the degree of change in the ambient brightness of the camera component. Subsequently, the terminal device determines the exposure data of the camera component at the first moment based on the brightness change value.
  • the terminal device can determine the exposure data at the first moment in different ways based on different brightness change values, so that the exposure data at the first moment is more consistent with the ambient brightness.
  • the brightness change value includes a ratio of the brightness data at the first moment and the brightness data at the second moment.
  • the terminal device determines the exposure data of the camera component at the first moment according to the brightness change value, including: the terminal device determines whether the brightness change value is within the first interval. If the brightness change value is within the first interval, the terminal device obtains the exposure data of the camera component at the second moment, thereby determining the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
  • In this way, when the brightness change value is within the first interval, the terminal device can determine that the ambient brightness has not changed much and can directly use the exposure data at the second moment (that is, at the last shutdown) as the exposure data at the first moment, so that exposure control can be completed quickly.
  • In a possible implementation, after the terminal device determines whether the brightness change value is within the first interval, the method further includes: if the brightness change value is not within the first interval, the terminal device determines the exposure data of the camera component at the first moment based on the brightness data at the first moment.
  • In this way, when the brightness change value is not within the first interval, the terminal device can determine that the ambient brightness has changed significantly. The terminal device can then recalculate the exposure data of the camera component at the first moment from the brightness data at the first moment, which ensures that the exposure data at the first moment matches the ambient brightness and avoids screen-flash overexposure.
  • In a possible implementation, before the terminal device determines the exposure data of the camera component at the first moment based on the brightness change value, the method further includes: the terminal device first determines, based on a second interval corresponding to the brightness data, whether the brightness data at the first moment is valid; then, if the brightness data at the first moment is valid, the brightness change value is determined based on the brightness data at the first moment and the brightness data at the second moment.
  • In a possible implementation, if the brightness data at the first moment is invalid, the exposure data of the camera component at the second moment is determined as the exposure data of the camera component at the first moment.
  • In this way, the second interval is used to flag brightness data that is clearly abnormal.
  • When the brightness data at the first moment is valid, the brightness change between the first moment and the second moment can be further determined.
  • When the brightness data at the first moment is invalid, it can be determined that the change in ambient brightness between the first moment and the second moment cannot be further determined from the brightness data at the first moment.
  • In this case, the exposure data at the second moment can be used directly to perform exposure control at the first moment.
  • In a possible implementation, before the terminal device obtains the brightness data collected by the optical sensor at the first moment, the method further includes: the terminal device first receives a closing instruction for the camera component; the terminal device then obtains the brightness data collected by the optical sensor at the second moment and the exposure data of the camera component at the second moment; finally, the terminal device stores the brightness data at the second moment and the exposure data at the second moment.
  • In this way, by storing the brightness data and exposure data at the second moment in advance, the terminal device can, the next time the camera component is started, compare the brightness data at the second moment with the brightness data at the first moment to determine how the ambient brightness has changed, and when the ambient brightness has not changed significantly, the exposure data at the second moment can be used directly for exposure control.
  • the optical sensor includes a multispectral sensor.
  • In a second aspect, this application also provides an exposure control device, which includes:
  • an acquisition module, configured to obtain, in response to a startup instruction of the camera component, the brightness data collected by the optical sensor at the first moment, where the brightness data is used to characterize the ambient brightness of the camera component and the first moment is the startup moment of the camera component under the startup instruction; and to obtain the brightness data collected by the optical sensor at the second moment, where the second moment is the moment at which the camera component was last turned off before the first moment;
  • a processing module configured to determine the exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment;
  • the control module is used to perform automatic exposure control on the camera component based on the exposure data of the camera component at the first moment.
  • In a third aspect, this application also provides a terminal device, including a processor and a memory.
  • The memory is configured to store code instructions, and the processor is configured to run the code instructions so that the electronic device executes the exposure control method described in the first aspect or any implementation of the first aspect.
  • In a fourth aspect, this application also provides a computer-readable storage medium.
  • The computer-readable storage medium stores instructions which, when executed, cause a computer to execute the exposure control method described in the first aspect or any implementation of the first aspect.
  • In a fifth aspect, the present application also provides a computer program product, including a computer program.
  • When the computer program is run, a computer executes the exposure control method described in the first aspect or any implementation of the first aspect.
  • Figure 1a shows the images captured in the first 4 frames after the existing camera component is turned on
  • Figure 1b shows the images taken at the 5th and 6th frames after the existing camera component is turned on
  • Figure 1c shows images taken from frames 7 to 10 after the existing camera component is turned on
  • Figure 2 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • Figure 3 is a schematic flow chart of an exposure control method provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of triggering a start command provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of exposure control in different frame images provided by an embodiment of the present application.
  • Figure 6 is a schematic flow chart of another exposure control method provided by an embodiment of the present application.
  • Figure 7 is a schematic structural diagram of an exposure control device provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the present application.
  • Figure 9 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same functions and effects.
  • the first value and the second value are only used to distinguish different values, and their order is not limited.
  • Those skilled in the art can understand that words such as "first" and "second" do not limit the number or execution order, and that "first" and "second" objects are not necessarily different.
  • At least one refers to one or more, and “plurality” refers to two or more.
  • “And/or” describes the association of associated objects, indicating that there can be three relationships, for example, A and/or B, which can mean: A exists alone, A and B exist simultaneously, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the related objects are in an “or” relationship.
  • “At least one of the following” or similar expressions thereof refers to any combination of these items, including any combination of a single item (items) or a plurality of items (items).
  • At least one of a, b, or c can mean: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, c can be single or multiple .
  • Automatic exposure control (AEC) is an essential part of controlling a camera component.
  • In the related art, the exposure amount and the ambient brightness value at the moment the camera component is turned off are usually stored.
  • When the camera component is turned on again, the exposure amount and ambient brightness value from the last shutdown can be used directly for AEC.
  • However, because the terminal device may be used in different environments, directly using the exposure amount and ambient brightness value from the last shutdown for AEC may cause screen-flash overexposure when the camera component is turned on.
  • It should be noted that how quickly the image overexposure shown in Figures 1a to 1c is alleviated and converges depends on the AEC control algorithm and on how much the ambient brightness has changed; for example, as shown in Figures 1a to 1c, it may be alleviated at the 5th frame and converge at the 10th frame, while if the change in ambient brightness is small or the exposure adjustment controlled by the AEC algorithm is fast, it may instead be alleviated at the 3rd frame and converge at the 6th frame. The embodiments of this application do not limit this.
  • this application provides an exposure control method, device and terminal equipment.
  • When the terminal device turns on the camera component, it can obtain the ambient brightness at the turn-on moment and, based on the ambient brightness at the turn-on moment and the ambient brightness at the previous shutdown moment, obtain the exposure data of the camera component at turn-on.
  • In this way, the change in ambient brightness can be determined from the ambient brightness at the turn-on moment and the ambient brightness at the previous shutdown moment, so that different methods can be selected to determine the exposure data of the camera component at turn-on for different changes in ambient brightness, which avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly.
  • the exposure control method provided in this embodiment can be applied in image shooting scenarios.
  • When the user opens the camera component of the terminal device to capture an image, the exposure data of the camera component can be determined through the above exposure control method, and exposure control can then be performed on the camera component based on that exposure data.
  • the exposure control method provided by the present disclosure can be applied to any scenario in which exposure control of a camera component is performed.
  • the above-mentioned terminal device may be a smartphone, a notebook computer, a tablet computer, or other devices with a camera component and an optical sensor.
  • the embodiments of this application do not limit the specific technology and specific equipment form used by the terminal equipment.
  • FIG. 2 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • a terminal device may include multiple subsystems that cooperate to perform, coordinate, or monitor one or more operations or functions of the terminal device.
  • the terminal device includes a display 220, a processor 230, a camera component 240, an optical sensor 250, a speaker 260, a wireless communication module 270 and a memory 280.
  • the wireless communication function of the terminal device can be implemented through the antenna, wireless communication module 270, modem processor, baseband processor, etc.
  • Antennas are used to transmit and receive electromagnetic wave signals.
  • Antennas in end devices can be used to cover single or multiple communication bands. Different antennas can also be reused to improve antenna utilization.
  • The wireless communication module 270 can provide solutions for wireless communication applied to the terminal device, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM) and ultra wide band (UWB) connections.
  • Processor 230 may be implemented as any electronic device capable of processing, receiving, or sending data or instructions.
  • a processor may be a microprocessor, a central processing unit, an application specific integrated circuit, a field programmable gate array, a digital signal processor, an analog circuit, a digital circuit, or a combination of these devices.
  • a processor can be a single-threaded or multi-threaded processor.
  • the processor can be a single-core or multi-core processor.
  • the camera component 240 is connected to the processor 230 of the terminal device, and the camera component 240 can capture images or videos in response to instructions sent by the processor 230 .
  • The speaker 260, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the terminal device can listen to music through the speaker 260, or emit ultrasonic waves.
  • a speaker 260 may be provided on both sides of the terminal device, so that the terminal device can determine the direction of the discovered device relative to the terminal device.
  • Memory 280 may be used to store computer executable program code, which includes instructions.
  • the memory may be implemented as random access memory, read-only memory, flash memory, removable memory, other types of storage elements, or combinations of such devices.
  • the memory 280 may include a program storage area and a data storage area.
  • Memory 280 may store data in a storage data area.
  • the memory 280 stores information such as identity information of a device that establishes a short-distance communication connection, and the time of connection.
  • FIG. 3 is a schematic flowchart of an exposure control method provided by an embodiment of the present application. As shown in Figure 3, the exposure control method provided by this embodiment is applied to a terminal device.
  • the terminal device is provided with a camera component and an optical sensor.
  • the exposure control method includes:
  • Step 301 The terminal device responds to the startup instruction of the camera component and obtains the brightness data collected by the optical sensor at the first moment.
  • the brightness data is used to characterize the ambient brightness of the camera component.
  • the first moment is the startup time of the camera component under the startup instruction.
  • In this application, when the user needs to use the camera function of the terminal device, a startup instruction for the camera component can be sent to the camera component.
  • the terminal device starts the camera component in response to the startup instruction of the camera component.
  • the terminal device can obtain the brightness data at the first moment collected by the optical sensor, that is, the brightness data at the startup moment of the camera component.
  • an optical sensor is a sensor component that can measure brightness based on optical principles.
  • By disposing the optical sensor on the terminal device, the ambient brightness around the camera component can be detected in real time, thereby assisting the processor in determining exposure data based on the measured brightness data.
  • the embodiments of the present application do not limit the model of the optical sensor. As an example, it may be a multispectral sensor.
  • the embodiments of the present application do not limit how to trigger the startup instruction of the camera component.
  • the startup instruction of the camera component can be triggered when the user turns on the camera function.
  • In certain applications, the user can jump to the camera interface by triggering a shooting-function control; correspondingly, during the jump to the camera interface, the terminal device can trigger the startup instruction of the camera component.
  • FIG. 4 is a schematic diagram of triggering a startup instruction provided by an embodiment of the present application.
  • the user can click the camera button to open the camera function and jump to the camera interface.
  • the terminal device can trigger a startup instruction of the camera component to instruct the camera component to turn on.
  • the brightness data at the first moment may be the brightness data collected by the optical sensor at the startup moment of the camera component under the startup instruction.
  • It should be understood that, if the terminal device is provided with an optical sensor, an automatic exposure control (AEC) statistics task can be created accordingly.
  • Subsequently, the optical sensor can be registered with the non-camera service (NCS Service), and the brightness data collected by the optical sensor can be stored in the NCS Service in real time.
  • The NCS Service also delivers the brightness data collected by the optical sensor to the AEC in real time, so that the brightness data at a given moment can be obtained when exposure control is performed.
  • the above brightness data may carry time information (for example, timestamp) during the saving process. Through the time information, the terminal device can quickly determine the brightness data at any time.
  • Step 302 The terminal device obtains the brightness data collected by the optical sensor at a second moment.
  • The second moment is the moment at which the camera component was last turned off before the first moment.
  • In this application, when the terminal device starts the camera component, it can obtain not only the brightness data at the first moment (that is, the brightness data at the moment the camera component is started this time) but also the brightness data at the second moment (that is, the brightness data at the moment the camera component was previously turned off).
  • the embodiment of the present application does not limit how to obtain the brightness data collected by the optical sensor at the second moment.
  • the terminal device can obtain the brightness data at the second moment from a preset file or from a preset storage location.
  • the brightness data at the second moment can be stored in a preset file.
  • the preset file may be a text (txt) file to facilitate reading and writing of the brightness data at the second moment.
  • the brightness data at the second moment can also be stored in a preset storage location.
  • the brightness data at the second moment can be obtained from the preset storage location based on the routing information of the preset storage location.
  • Correspondingly, after the terminal device receives the closing instruction of the camera component at the second moment, it can also obtain, in real time, the brightness data collected by the optical sensor at the second moment and the exposure data of the camera component at the second moment, and then store them. By storing the brightness data and exposure data at the second moment, they can be obtained quickly when the camera device is turned on again, to assist in exposure control.
  • Step 303 The terminal device determines the exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment.
  • In this application, after the terminal device obtains the brightness data at the first moment and the brightness data at the second moment, it can determine the exposure data of the camera component at the first moment based on them.
  • In some embodiments, the terminal device may first determine a brightness change value based on the brightness data at the first moment and the brightness data at the second moment. This brightness change value is used to characterize the degree of change in the ambient brightness of the camera component. Subsequently, the terminal device can determine the exposure data of the camera component at the first moment according to the brightness change value.
  • the brightness change value includes the ratio of the brightness data at the first moment and the brightness data at the second moment, or may also include the difference between the brightness data at the first moment and the brightness data at the second moment.
  • the terminal device may determine whether the brightness change value is within the first interval. If the brightness change value is within the first interval, it can be determined that the ambient brightness does not change significantly between the first moment and the second moment.
  • the terminal device obtains the exposure data of the camera component at the second moment, and determines the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
  • If the brightness change value is not within the first interval, it can be determined that the ambient brightness has changed significantly between the first moment and the second moment; in this case the terminal device re-determines the exposure data of the camera component at the first moment based on the brightness data at the first moment.
  • the above-mentioned first interval is used to measure the degree of change in ambient brightness and can be specifically set according to the actual situation. For example, if the brightness change value includes the ratio of the brightness data at the first moment and the brightness data at the second moment, the first interval can be set to 0.7-1.3.
  • The embodiments of the present application do not limit how the exposure data of the camera component at the first moment is determined from the brightness data at the first moment.
  • a mapping relationship can be preset between the brightness data and the exposure data, and the exposure data at the first moment can be determined from the mapping relationship through the brightness data at the first moment.
  • the terminal device can preset a conversion formula between brightness data and exposure data, and by inputting the brightness data at the first moment into the conversion formula, the exposure data at the first moment can be obtained.
  • the terminal device may determine whether the brightness data at the first moment is valid according to the second interval corresponding to the brightness data. If the brightness data at the first time is valid, the brightness change value is determined based on the brightness data at the first time and the brightness data at the second time, and a method for determining the exposure data at the first time is determined based on the brightness change value. If the brightness data at the first moment is invalid, the terminal device can directly determine the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
  • the above-mentioned second interval is used to measure whether the brightness data is valid, and it can be specifically set according to the actual situation. When the brightness data is within the second interval, it can be determined that the brightness data is valid. When the brightness data is not within the second interval, it can be determined that the brightness data is invalid. For example, the second interval may be set to be greater than 0.
  • Step 304 The terminal device performs exposure control on the camera component based on the exposure data of the camera component at the first moment.
  • In this application, after the terminal device determines the exposure data at the first moment, it can perform exposure control on the camera component based on that exposure data.
  • In some embodiments, the exposure data at the first moment can be used as the start-of-stream exposure data of the camera device.
  • Subsequently, the terminal device obtains the ambient brightness in real time from the optical sensor and performs automatic exposure control in real time through the AEC.
  • With the exposure control method provided by the embodiments of the present application, because the exposure data of the camera component at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, the exposure data at the first moment can be determined in different ways for different brightness changes; this avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly, and improves the user experience.
  • FIG. 5 is a schematic diagram of exposure control in different frame images provided by an embodiment of the present application. As shown in Figure 5, the camera component starts at frame 0, frames -4 to -1 are before the camera component is started, and frames 1 to 6 are after the camera component is started.
  • If the terminal device receives the startup instruction of the camera component at frame -1, the camera component starts at frame 0 and captures images or video in frames 1-6. Correspondingly, in frames -4 to -2, although the camera component is off, the optical sensor is still collecting brightness data and reporting it to the AEC for statistics.
  • When the terminal device receives the startup instruction of the camera component at frame -1, it can, while turning on the camera component, determine at frame -1 the exposure data to be used for this turn-on (the start-of-stream exposure value).
  • For example, the terminal device can compare the brightness data at this turn-on (the brightness data obtained at frame -1, or the brightness data obtained at frames -4 to -2) with the brightness data stored when the camera component was previously turned off, to determine the brightness change value. If the brightness change value is too large, the start-of-stream exposure value is determined from the brightness data at this turn-on; if the brightness change is not large, the exposure data stored at the previous shutdown is used directly as the start-of-stream exposure value.
  • Continuing with Figure 5, the above start-of-stream exposure data can be used for exposure control of the first few frames (for example, the first three frames).
  • Subsequently, the AEC algorithm can calculate an exposure value for each frame, and that exposure value is used to adjust the exposure of a later frame.
  • For example, the exposure value of the 1st frame can be used to adjust the 4th frame, the exposure value of the 2nd frame can be used to adjust the 5th frame, and the exposure value of the 3rd frame can be used to adjust the 6th frame.
  • FIG. 6 is a schematic flowchart of another exposure control method provided by an embodiment of the present application. As shown in Figure 6, the exposure control method provided by this embodiment is applied to a terminal device.
  • the terminal device is provided with a camera component and an optical sensor.
  • the exposure control method includes:
  • Step 601 Obtain the brightness data collected by the optical sensor at the first moment.
  • the brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup command.
  • Step 602 Obtain the brightness data at the second moment and the exposure data at the second moment collected by the optical sensor.
  • the second moment is the last time the camera component is turned off at the first moment.
  • Step 603 Determine whether the brightness data at the first moment is valid according to the second interval corresponding to the brightness data.
  • If yes, perform step 605; if not, perform step 604.
  • Step 604 Determine the exposure data of the camera component at the second time as the exposure data of the camera component at the first time.
  • After step 604, step 608 is performed.
  • Step 605 Determine the brightness change value based on the brightness data at the first time and the brightness data at the second time.
  • the brightness change value is used to represent the degree of change in the ambient brightness of the camera component, and the brightness change value includes the ratio of the brightness data at the first moment to the brightness data at the second moment.
  • Step 606 Determine whether the brightness change value is within the first interval.
  • If yes, perform step 604; if not, perform step 607.
  • Step 607 Determine the exposure data of the camera component at the first time based on the brightness data at the first time.
  • Step 608 Perform exposure control on the camera component according to the exposure data of the camera component at the first moment.
  • With the exposure control method provided by the embodiments of the present application, because the exposure data of the camera component at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, the exposure data at the first moment can be determined in different ways for different brightness changes; this avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly, and improves the user experience.
  • Those of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be completed by hardware related to program instructions, and the aforementioned program can be stored in a computer-readable storage medium.
  • When the program is executed, it performs the steps of the above method embodiments; the aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk or an optical disk.
  • FIG. 7 is a schematic structural diagram of an exposure control device provided by an embodiment of the present application.
  • the exposure control device can be implemented by software, hardware, or a combination of the two, and can be, for example, the terminal device or the processor of the terminal device in the above embodiment to execute the exposure control method in the above embodiment.
  • the exposure control device 700 includes: an acquisition module 701 , a processing module 702 and a control module 703 .
  • The acquisition module 701 is configured to obtain, in response to a startup instruction of the camera component, the brightness data collected by the optical sensor at the first moment, where the brightness data is used to characterize the ambient brightness of the camera component and the first moment is the startup moment of the camera component under the startup instruction; and to obtain the brightness data collected by the optical sensor at the second moment, where the second moment is the moment at which the camera component was last turned off before the first moment.
  • the processing module 702 is used to determine the exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment;
  • the control module 703 is used to perform automatic exposure control on the camera component according to the exposure data of the camera component at the first moment.
  • In an optional implementation, the processing module 702 is specifically configured to determine a brightness change value based on the brightness data at the first moment and the brightness data at the second moment, where the brightness change value is used to characterize the degree of change in the ambient brightness of the camera component; and to determine the exposure data of the camera component at the first moment based on the brightness change value.
  • the brightness change value includes a ratio of the brightness data at the first moment and the brightness data at the second moment.
  • the processing module 702 is specifically configured to determine whether the brightness change value is within the first interval; if the brightness change value is within the first interval, obtain the exposure data of the camera component at the second moment; The exposure data of the camera component at the second time is determined as the exposure data of the camera component at the first time.
  • the processing module 702 is specifically configured to determine the exposure data of the camera component at the first moment according to the brightness data at the first moment if the brightness change value is not within the first interval.
  • In an optional implementation, the processing module 702 is further configured to determine, according to the second interval corresponding to the brightness data, whether the brightness data at the first moment is valid; and, if the brightness data at the first moment is valid, to determine the brightness change value based on the brightness data at the first moment and the brightness data at the second moment.
  • the processing module 702 is also configured to determine the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment if the brightness data at the first moment is invalid.
  • the acquisition module 701 is also used to receive a shutdown instruction of the camera component; obtain the brightness data of the second moment collected by the optical sensor and the exposure data of the camera component at the second moment;
  • the processing module 702 is also used to store the brightness data at the second moment and the exposure data at the second moment.
  • the optical sensor includes a multispectral sensor.
  • FIG. 7 shows an exposure control device provided by an embodiment, which can be used to perform the method provided by any of the above embodiments.
  • the specific implementation methods and technical effects are similar, and will not be described again here.
  • Figure 8 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the present application.
  • The terminal device includes a processor 801, a communication line 804 and at least one communication interface (the communication interface 803 in Figure 8 is used as an example for description).
  • The processor 801 can be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits used to control the execution of the program of the present application.
  • Communication lines 804 may include circuitry that communicates information between the above-described components.
  • the communication interface 803 uses any device such as a transceiver to communicate with other devices or communication networks, such as Ethernet, wireless local area networks (WLAN), etc.
  • the terminal device may also include a memory 802.
  • The memory 802 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, and the like), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, without limitation.
  • The memory may exist independently and be connected to the processor through the communication line 804. The memory may also be integrated with the processor.
  • the memory 802 is used to store computer execution instructions for executing the solution of the present application, and the processor 801 controls the execution.
  • the processor 801 is used to execute computer execution instructions stored in the memory 802, thereby implementing the exposure control method provided by the embodiment of the present application.
  • the computer execution instructions in the embodiments of the present application may also be called application codes, which are not specifically limited in the embodiments of the present application.
  • the processor 801 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 8 .
  • the terminal device may include multiple processors, such as the processor 801 and the processor 805 in Figure 8 .
  • processors may be a single-CPU processor or a multi-CPU processor.
  • a processor here may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • FIG. 9 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • the chip 90 includes one or more (including two) processors 910 and a communication interface 930 .
  • memory 940 stores the following elements: executable modules or data structures, or subsets thereof, or extensions thereof.
  • the memory 940 may include a read-only memory and a random access memory, and provide instructions and data to the processor 910 .
  • a portion of memory 940 may also include non-volatile random access memory (NVRAM).
  • The processor 910, the communication interface 930 and the memory 940 are coupled together through the bus system 920.
  • the bus system 920 may also include a power bus, a control bus, a status signal bus, etc.
  • various buses are labeled as bus system 920 in FIG. 9 .
  • the methods described in the above embodiments of the present application can be applied to the processor 910 or implemented by the processor 910 .
  • the processor 910 may be an integrated circuit chip with signal processing capabilities. During the implementation process, each step of the above method can be completed by instructions in the form of hardware integrated logic circuits or software in the processor 910 .
  • The above-mentioned processor 910 can be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or a field-programmable gate array (FPGA).
  • The processor 910 can implement or execute the methods, steps and logical block diagrams disclosed in the embodiments of this application.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a storage medium mature in this field such as random access memory, read-only memory, programmable read-only memory, or electrically erasable programmable read only memory (EEPROM).
  • The storage medium is located in the memory 940. The processor 910 reads the information in the memory 940 and completes the steps of the above method in combination with its hardware.
  • the instructions stored in the memory for execution by the processor may be implemented in the form of a computer program product.
  • the computer program product may be written in the memory in advance, or may be downloaded and installed in the memory in the form of software.
  • a computer program product includes one or more computer instructions. When computer program instructions are loaded and executed on a computer, processes or functions according to embodiments of the present application are generated in whole or in part.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center in a wired manner (for example, coaxial cable, optical fiber or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio or microwave).
  • The computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media.
  • Available media may include magnetic media (for example, floppy disks, hard disks or magnetic tapes), optical media (for example, a digital versatile disc (DVD)) or semiconductor media (for example, a solid state disk (SSD)), and the like.
  • Computer-readable media may include computer storage media and communication media and may include any medium that can transfer a computer program from one place to another.
  • the storage media can be any target media that can be accessed by the computer.
  • the computer-readable medium may include compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM or other optical disk storage; the computer-readable medium may include a magnetic disk memory or other disk storage device.
  • any connection line is also properly termed a computer-readable medium.
  • For example, if coaxial cable, fiber optic cable, twisted pair, DSL or wireless technologies such as infrared, radio and microwave are used, then the coaxial cable, fiber optic cable, twisted pair, DSL or wireless technologies such as infrared, radio and microwave are included in the definition of medium.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)

Abstract

This application provides an exposure control method, an exposure control apparatus and a terminal device, relating to the field of imaging technology. In the exposure control method, the terminal device first responds to a startup instruction of a camera component and obtains brightness data collected by an optical sensor at a first moment, where the first moment is the startup moment of the camera component under the startup instruction. Next, the terminal device obtains brightness data collected by the optical sensor at a second moment, where the second moment is the moment at which the camera component was last turned off before the first moment. Then, the terminal device determines exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment. Finally, the terminal device performs exposure control on the camera component based on the exposure data of the camera component at the first moment. In this way, because the exposure parameters are determined from the ambient brightness when the camera component is turned on and the ambient brightness at the last shutdown, the screen-flash overexposure caused by an excessive change in ambient brightness is mitigated.

Description

Exposure control method, apparatus and terminal device
This application claims priority to Chinese patent application No. 202210751140.7, filed with the China National Intellectual Property Administration on June 29, 2022 and entitled "Exposure control method, apparatus and terminal device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of imaging technology, and in particular, to an exposure control method, an exposure control apparatus and a terminal device.
Background
At present, with the development of terminal technology, terminal devices have become a part of people's work and life. A terminal device is usually provided with a camera component, and functions such as photographing and video recording can be implemented through the camera component on the terminal device, which greatly facilitates people's work and life.
Automatic exposure control (AEC) is an essential part of controlling a camera component. In the related art, the exposure amount and the ambient brightness value at the moment the camera component is turned off are usually stored. When the camera component is turned on again, the exposure amount and ambient brightness value from the last shutdown can be used directly for AEC.
However, because the terminal device may be used in different environments, directly using the exposure amount and ambient brightness value from the last shutdown for AEC may cause screen-flash overexposure when the camera component is turned on.
Summary
This application provides an exposure control method, an exposure control apparatus and a terminal device, to solve the prior-art problem of screen-flash overexposure when a camera component is turned on.
In a first aspect, this application provides an exposure control method applied to a terminal device, where the terminal device is provided with a camera component and an optical sensor. The exposure control method provided by this application includes:
The terminal device first responds to a startup instruction of the camera component and obtains brightness data collected by the optical sensor at a first moment, where the brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup instruction. Next, the terminal device obtains brightness data collected by the optical sensor at a second moment, where the second moment is the moment at which the camera component was last turned off before the first moment. Then, the terminal device determines exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment. Finally, the terminal device performs exposure control on the camera component based on the exposure data of the camera component at the first moment.
With the exposure control method provided by this application, because the exposure data of the camera component at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, the exposure data at the first moment can be determined in different ways for different brightness changes, which avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly.
In a possible implementation, the terminal device may determine a brightness change value based on the brightness data at the first moment and the brightness data at the second moment, where the brightness change value is used to characterize the degree of change in the ambient brightness of the camera component. The terminal device then determines the exposure data of the camera component at the first moment based on the brightness change value.
In this way, the terminal device can determine the exposure data at the first moment in different ways for different brightness change values, so that the exposure data at the first moment better matches the ambient brightness.
In a possible implementation, the brightness change value includes a ratio of the brightness data at the first moment to the brightness data at the second moment.
In a possible implementation, the terminal device determining the exposure data of the camera component at the first moment based on the brightness change value includes: the terminal device determines whether the brightness change value is within a first interval; if the brightness change value is within the first interval, the terminal device obtains the exposure data of the camera component at the second moment and determines the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
In this way, when the brightness change value is within the first interval, the terminal device can determine that the ambient brightness has not changed much, and can directly use the exposure data at the second moment (that is, at the last shutdown) as the exposure data at the first moment, so that exposure control can be completed quickly.
In a possible implementation, after the terminal device determines whether the brightness change value is within the first interval, the method further includes: if the brightness change value is not within the first interval, the terminal device may determine the exposure data of the camera component at the first moment based on the brightness data at the first moment.
In this way, when the brightness change value is not within the first interval, the terminal device can determine that the ambient brightness has changed significantly. In this case, the terminal device can recalculate the exposure data of the camera component at the first moment using the brightness data at the first moment, which ensures that the exposure data at the first moment matches the ambient brightness and avoids screen-flash overexposure.
In a possible implementation, before the terminal device determines the exposure data of the camera component at the first moment based on the brightness change value, the method further includes: the terminal device first determines, based on a second interval corresponding to the brightness data, whether the brightness data at the first moment is valid; then, if the brightness data at the first moment is valid, the brightness change value is determined based on the brightness data at the first moment and the brightness data at the second moment.
In a possible implementation, after the terminal device determines, based on the second interval corresponding to the brightness data, whether the brightness data at the first moment is valid, if the brightness data at the first moment is invalid, the exposure data of the camera component at the second moment is determined as the exposure data of the camera component at the first moment.
In this way, the second interval is used to flag brightness data that is clearly abnormal. When the brightness data at the first moment is valid, the brightness change between the first moment and the second moment can be further determined. When the brightness data at the first moment is invalid, it can be determined that the change in ambient brightness between the first moment and the second moment cannot be further determined from the brightness data at the first moment; in this case, the exposure data at the second moment can be used directly to perform exposure control at the first moment.
In a possible implementation, before the terminal device obtains the brightness data collected by the optical sensor at the first moment, the method further includes: the terminal device first receives a closing instruction for the camera component; the terminal device then obtains the brightness data collected by the optical sensor at the second moment and the exposure data of the camera component at the second moment; finally, the terminal device stores the brightness data at the second moment and the exposure data at the second moment.
In this way, by storing the brightness data and exposure data at the second moment in advance, the terminal device can, the next time the camera component is started, compare the brightness data at the second moment with the brightness data at the first moment to determine how the ambient brightness has changed; moreover, when the ambient brightness has not changed significantly, the exposure data at the second moment can be used directly for exposure control.
In a possible implementation, the optical sensor includes a multispectral sensor.
In a second aspect, this application further provides an exposure control apparatus, which includes:
an acquisition module, configured to obtain, in response to a startup instruction of the camera component, brightness data collected by the optical sensor at a first moment, where the brightness data is used to characterize the ambient brightness of the camera component and the first moment is the startup moment of the camera component under the startup instruction; and to obtain brightness data collected by the optical sensor at a second moment, where the second moment is the moment at which the camera component was last turned off before the first moment;
a processing module, configured to determine exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment; and
a control module, configured to perform automatic exposure control on the camera component based on the exposure data of the camera component at the first moment.
In a third aspect, this application further provides a terminal device including a processor and a memory, where the memory is configured to store code instructions and the processor is configured to run the code instructions, so that the electronic device executes the exposure control method described in the first aspect or any implementation of the first aspect.
In a fourth aspect, this application further provides a computer-readable storage medium storing instructions that, when executed, cause a computer to execute the exposure control method described in the first aspect or any implementation of the first aspect.
In a fifth aspect, this application further provides a computer program product including a computer program that, when run, causes a computer to execute the exposure control method described in the first aspect or any implementation of the first aspect.
It should be understood that the second to fifth aspects of this application correspond to the technical solution of the first aspect of this application, and the beneficial effects obtained by each aspect and its corresponding feasible implementations are similar and are not repeated here.
Brief Description of the Drawings
Figure 1a shows images captured in the first 4 frames after an existing camera component is turned on;
Figure 1b shows images captured at the 5th and 6th frames after an existing camera component is turned on;
Figure 1c shows images captured at the 7th to 10th frames after an existing camera component is turned on;
Figure 2 is a schematic structural diagram of a terminal device provided by an embodiment of this application;
Figure 3 is a schematic flowchart of an exposure control method provided by an embodiment of this application;
Figure 4 is a schematic diagram of triggering a startup instruction provided by an embodiment of this application;
Figure 5 is a schematic diagram of exposure control in different image frames provided by an embodiment of this application;
Figure 6 is a schematic flowchart of another exposure control method provided by an embodiment of this application;
Figure 7 is a schematic structural diagram of an exposure control apparatus provided by an embodiment of this application;
Figure 8 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of this application;
Figure 9 is a schematic structural diagram of a chip provided by an embodiment of this application.
Detailed Description
To describe the technical solutions of the embodiments of this application clearly, in the embodiments of this application, words such as "first" and "second" are used to distinguish identical or similar items whose functions and effects are basically the same. For example, a first value and a second value are only used to distinguish different values, and their order is not limited. Those skilled in the art can understand that words such as "first" and "second" do not limit the number or execution order, and that "first" and "second" objects are not necessarily different.
It should be noted that, in this application, words such as "exemplary" or "for example" are used to indicate an example, illustration or explanation. Any embodiment or design described as "exemplary" or "for example" in this application should not be construed as being more preferred or advantageous than other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present the relevant concepts in a concrete manner.
In this application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or a similar expression refers to any combination of these items, including any combination of a single item or a plurality of items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b and c may each be single or multiple.
Automatic exposure control (AEC) is an essential part of controlling a camera component. In the related art, the exposure amount and the ambient brightness value at the moment the camera component is turned off are usually stored. When the camera component is turned on again, the exposure amount and ambient brightness value from the last shutdown can be used directly for AEC. However, because the terminal device may be used in different environments, directly using the exposure amount and ambient brightness value from the last shutdown for AEC may cause screen-flash overexposure when the camera component is turned on.
For example, when shooting switches from a low-brightness environment to a high-brightness environment, or from a high-brightness environment to a low-brightness environment, the exposure amount and ambient brightness value from the last shutdown are used directly for AEC in both cases, so the initial exposure amount used for AEC control is severely mismatched with the actual ambient brightness, and the first few frames captured after the camera component is turned on flash and are overexposed. Figure 1a shows images captured in the first 4 frames after an existing camera component is turned on, Figure 1b shows images captured at the 5th and 6th frames after the camera component is turned on, and Figure 1c shows images captured at the 7th to 10th frames. As shown in Figure 1a, because AEC control uses the exposure amount from the last shutdown and the ambient brightness has changed greatly, the images captured by the camera component in the first 4 frames are severely overexposed and no objects can be seen in them. As shown in Figure 1b, the exposure amount is adjusted automatically through AEC control, and from the 5th and 6th frames the overexposure is alleviated and object outlines can be seen in the images. As shown in Figure 1c, the overexposure does not converge until the 10th frame, at which point the objects in the image can be seen clearly.
It should be noted that how quickly the above image overexposure is alleviated and converges can be determined by the AEC control algorithm and the amount of change in ambient brightness. For example, as shown in Figures 1a to 1c, it may be alleviated at the 5th frame and converge at the 10th frame; if the change in ambient brightness is small or the exposure adjustment controlled by the AEC control algorithm is fast, it may instead be alleviated at the 3rd frame and converge at the 6th frame. The embodiments of this application do not limit this.
In view of this, this application provides an exposure control method, an exposure control apparatus and a terminal device. When the terminal device turns on the camera component, it can obtain the ambient brightness at the turn-on moment and, based on the ambient brightness at the turn-on moment and the ambient brightness at the previous shutdown moment, obtain the exposure data of the camera component at turn-on. In this way, the change in ambient brightness can be determined from the ambient brightness at the turn-on moment and the ambient brightness at the previous shutdown moment, so that different methods can be selected to determine the exposure data of the camera component at turn-on for different changes in ambient brightness, which avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly.
The application scenarios of the exposure control method involved in this application are described below.
In some embodiments, the exposure control method provided by this embodiment can be applied to an image shooting scenario. When the user opens the camera component of the terminal device to capture an image, the exposure data of the camera component can be determined through the above exposure control method, and exposure control can then be performed on the camera component based on the exposure data.
It should be noted that the above application scenario does not constitute a limitation on the present disclosure; the exposure control method provided by the present disclosure can be applied to any scenario in which exposure control is performed on a camera component.
It can be understood that the above terminal device may be a smartphone, a laptop computer, a tablet computer or another device provided with a camera component and an optical sensor. The embodiments of this application do not limit the specific technology used by the terminal device or its specific device form.
To better understand the embodiments of this application, the structure of the terminal device in the embodiments of this application is introduced below. For example, Figure 2 is a schematic structural diagram of a terminal device provided by an embodiment of this application.
Referring to Figure 2, a terminal device may include multiple subsystems that cooperate to perform, coordinate or monitor one or more operations or functions of the terminal device. The terminal device includes a display 220, a processor 230, a camera component 240, an optical sensor 250, a speaker 260, a wireless communication module 270 and a memory 280.
The wireless communication function of the terminal device can be implemented through an antenna, the wireless communication module 270, a modem processor, a baseband processor and the like. The antenna is used to transmit and receive electromagnetic wave signals. The antenna in the terminal device can be used to cover a single communication band or multiple communication bands. Different antennas can also be multiplexed to improve antenna utilization.
The wireless communication module 270 can provide solutions for wireless communication applied to the terminal device, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM) and ultra wide band (UWB) connections.
The processor 230 may be implemented as any electronic device capable of processing, receiving or sending data or instructions. For example, the processor may be a microprocessor, a central processing unit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, an analog circuit, a digital circuit or a combination of these devices. The processor may be a single-threaded or multi-threaded processor. The processor may be a single-core or multi-core processor.
During use, the processor 230 may be configured to access a memory storing instructions. The instructions may be configured to cause the processor to perform, coordinate or monitor one or more operations or functions of the terminal device.
The display 220 may be located behind an input surface or may be integrated with it. The display 220 may be communicatively coupled to the processor 230. The processor 230 may use the display 220 to present information to the user. In many cases, the processor 230 uses the display 220 to present an interface with which the user can interact.
The camera component 240 is connected to the processor 230 of the terminal device, and the camera component 240 can capture images or video in response to instructions sent by the processor 230.
The optical sensor 250 is disposed on the terminal device and connected to the processor 230 of the terminal device. The optical sensor 250 is used to collect the ambient brightness around the terminal device and report the ambient brightness to the processor 230 in real time, so that the processor 230 can perform exposure control on the camera component based on the ambient brightness.
The speaker 260, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The terminal device can play music through the speaker 260, or emit ultrasonic waves. In the embodiments of this application, one speaker 260 may be provided on each of the two sides of the terminal device, so that the terminal device can determine the direction of a discovered device relative to the terminal device.
The memory 280 may be used to store computer-executable program code, where the executable program code includes instructions. For example, the memory may be implemented as a random access memory, a read-only memory, a flash memory, a removable memory, other types of storage elements, or a combination of such devices. The memory 280 may include a program storage area and a data storage area. The memory 280 may store data in the data storage area. For example, the memory 280 stores information such as the identity information of a device with which a short-range communication connection has been established and the time of the connection.
It can be understood that the above exposure control method can be implemented by the exposure control apparatus provided by the embodiments of the present disclosure, and the exposure control apparatus may be part or all of a device, for example the above terminal device. Taking the terminal device as an example, the technical solution of this application and how it solves the above technical problems are described in detail below with reference to specific embodiments. The following specific embodiments may be implemented independently or in combination with one another, and identical or similar concepts or processes may not be repeated in some embodiments.
Figure 3 is a schematic flowchart of an exposure control method provided by an embodiment of this application. As shown in Figure 3, the exposure control method provided by this embodiment is applied to a terminal device provided with a camera component and an optical sensor, and includes:
Step 301: The terminal device responds to a startup instruction of the camera component and obtains brightness data collected by the optical sensor at a first moment, where the brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup instruction.
In this application, when the user needs to use the camera function of the terminal device, a startup instruction for the camera component can be sent to the camera component. Correspondingly, the terminal device starts the camera component in response to the startup instruction. At the same time, the terminal device can obtain the brightness data collected by the optical sensor at the first moment, that is, the brightness data at the startup moment of the camera component.
The camera component is used to capture images or video and send the captured images or video to the processor or memory of the terminal device. The embodiments of this application do not limit the model of the camera component, which can be set according to the actual situation.
It should be understood that an optical sensor is a sensor component that can measure brightness based on optical principles. In this application, by disposing the optical sensor on the terminal device, the ambient brightness around the camera component can be detected in real time, thereby assisting the processor in determining exposure data based on the measured brightness data. It should be understood that the embodiments of this application do not limit the model of the optical sensor; as an example, it may be a multispectral sensor.
It should be understood that the embodiments of this application do not limit how the startup instruction of the camera component is triggered. In some embodiments, the startup instruction of the camera component can be triggered while the user opens the camera function. In some other embodiments, in certain applications the user can jump to the camera interface by triggering a shooting-function control; correspondingly, during the jump to the camera interface, the terminal device can trigger the startup instruction of the camera component.
For example, Figure 4 is a schematic diagram of triggering a startup instruction provided by an embodiment of this application. As shown in Figure 4, the user can open the camera function by tapping the camera button, causing the interface to jump to the camera interface. During the jump to the camera interface, the terminal device can trigger the startup instruction of the camera component to instruct the camera component to turn on.
In some embodiments, the brightness data at the first moment may be the brightness data collected by the optical sensor at the startup moment of the camera component under the startup instruction. It should be understood that if the terminal device is provided with an optical sensor, an automatic exposure control (AEC) statistics task can be created accordingly. Subsequently, the optical sensor can be registered with the non-camera service (NCS Service), and the brightness data collected by the optical sensor can be stored in the NCS Service in real time. At the same time, the NCS Service also delivers the brightness data collected by the optical sensor to the AEC in real time, so that the brightness data at a given moment can be obtained when exposure control is performed.
It should be noted that the above brightness data may carry time information (for example, a timestamp) when it is saved. Through the time information, the terminal device can quickly determine the brightness data at any moment.
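As an illustration of the timestamped storage described above, the following Python sketch keeps brightness samples in a small in-memory buffer and looks up the sample closest to a requested moment. It is only a sketch under assumed names: the BrightnessHistory class, the report/at methods and the in-memory lists are not part of the patent, which only requires that saved brightness data carry time information such as a timestamp.

```python
import bisect
import time

class BrightnessHistory:
    """Store timestamped brightness samples and look them up by moment."""

    def __init__(self):
        self._timestamps = []  # sorted timestamps, in seconds
        self._values = []      # brightness value for each timestamp

    def report(self, brightness, timestamp=None):
        """Record a brightness sample, e.g. as delivered by the sensor service."""
        ts = time.time() if timestamp is None else timestamp
        idx = bisect.bisect(self._timestamps, ts)
        self._timestamps.insert(idx, ts)
        self._values.insert(idx, brightness)

    def at(self, moment):
        """Return the brightness sample closest to the requested moment, or None."""
        if not self._timestamps:
            return None
        idx = bisect.bisect_left(self._timestamps, moment)
        candidates = [i for i in (idx - 1, idx) if 0 <= i < len(self._timestamps)]
        best = min(candidates, key=lambda i: abs(self._timestamps[i] - moment))
        return self._values[best]
```

In this reading, a sensor service would call report() each time a new reading arrives, and the exposure logic would call at(first_moment) or at(second_moment) to retrieve the brightness data for the corresponding moment.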
Step 302: The terminal device obtains brightness data collected by the optical sensor at a second moment, where the second moment is the moment at which the camera component was last turned off before the first moment.
In this application, when the terminal device starts the camera component, it can obtain not only the brightness data at the first moment (that is, the brightness data at the moment the camera component is started this time) but also the brightness data at the second moment (that is, the brightness data at the moment the camera component was previously turned off).
It should be understood that the embodiments of this application do not limit how the brightness data collected by the optical sensor at the second moment is obtained. For example, the terminal device may obtain the brightness data at the second moment from a preset file or from a preset storage location.
In some embodiments, the brightness data at the second moment may be stored in a preset file; when the terminal device turns the camera component on again, it can read the brightness data at the second moment from that preset file and perform exposure control accordingly. For example, the preset file may be a text (txt) file, so that the brightness data at the second moment is convenient to read and write.
In some other embodiments, the brightness data at the second moment may also be stored in a preset storage location. When the terminal device turns the camera component on again, it can obtain the brightness data at the second moment from the preset storage location based on the routing information of that storage location.
Correspondingly, after the terminal device receives the closing instruction of the camera component at the second moment, it can also obtain, in real time, the brightness data collected by the optical sensor at the second moment and the exposure data of the camera component at the second moment. The brightness data at the second moment and the exposure data at the second moment can then be stored. By storing the brightness data and exposure data at the second moment, they can be obtained quickly when the camera device is turned on again, to assist in exposure control.
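A minimal sketch of this store-on-close / read-on-open behaviour is given below, using a plain text file as the description suggests. The file name last_shutdown.txt, the JSON line format and the function names are illustrative assumptions rather than the patent's actual implementation.

```python
import json
import os
import time

STATE_FILE = "last_shutdown.txt"  # hypothetical preset file

def save_shutdown_state(brightness, exposure, path=STATE_FILE):
    """Store the brightness and exposure data captured at the closing (second) moment."""
    record = {"timestamp": time.time(), "brightness": brightness, "exposure": exposure}
    with open(path, "w", encoding="utf-8") as f:
        f.write(json.dumps(record))

def load_shutdown_state(path=STATE_FILE):
    """Read back the second-moment data at the next startup, or None if nothing was stored."""
    if not os.path.exists(path):
        return None
    with open(path, "r", encoding="utf-8") as f:
        return json.loads(f.read())
```

Under this assumption, save_shutdown_state() would be called when the closing instruction is received at the second moment, and load_shutdown_state() when the camera component is started again at the first moment.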
Step 303: The terminal device determines exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment.
In this application, after the terminal device obtains the brightness data at the first moment and the brightness data at the second moment, it can determine the exposure data of the camera component at the first moment based on them.
It should be understood that the embodiments of this application do not limit how the exposure data of the camera component at the first moment is determined. In some embodiments, the terminal device may first determine a brightness change value based on the brightness data at the first moment and the brightness data at the second moment, where the brightness change value is used to characterize the degree of change in the ambient brightness of the camera component. The terminal device can then determine the exposure data of the camera component at the first moment based on the brightness change value.
Since the screen-flash overexposure problem is caused by an excessive change in ambient brightness, by determining the brightness change value and then determining the exposure data at the first moment based on it, the exposure data at the second moment is no longer used directly when the ambient brightness has changed too much, which avoids screen-flash overexposure.
In some embodiments, the above brightness change value includes the ratio of the brightness data at the first moment to the brightness data at the second moment, or may alternatively include the difference between the brightness data at the first moment and the brightness data at the second moment.
For example, after the terminal device determines the brightness change value from the brightness data at the first moment and the brightness data at the second moment, it can determine whether the brightness change value is within a first interval. If the brightness change value is within the first interval, it can be determined that the ambient brightness has not changed much between the first moment and the second moment. Correspondingly, the terminal device obtains the exposure data of the camera component at the second moment and determines the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
If the brightness change value is not within the first interval, it can be determined that the ambient brightness has changed significantly between the first moment and the second moment. In this case, directly determining the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment might cause screen-flash overexposure. Therefore, the terminal device can re-determine the exposure data of the camera component at the first moment based on the brightness data at the first moment.
It should be noted that the above first interval is used to measure the degree of change in ambient brightness and can be set according to the actual situation. For example, if the brightness change value includes the ratio of the brightness data at the first moment to the brightness data at the second moment, the first interval can be set to 0.7-1.3.
It should be understood that the embodiments of this application do not limit how the exposure data of the camera component at the first moment is determined from the brightness data at the first moment. In some embodiments, a mapping relationship between brightness data and exposure data can be preset, and the exposure data at the first moment can be determined from the mapping relationship using the brightness data at the first moment.
In some other embodiments, the terminal device can preset a conversion formula between brightness data and exposure data, and the exposure data at the first moment can be obtained by inputting the brightness data at the first moment into the conversion formula.
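The description leaves the mapping and conversion formula themselves open. Purely as an illustration of the idea, the sketch below derives an exposure time from the measured brightness by holding a nominal image-brightness target constant and clamping the result to a supported range; the target value, the clamping limits and the function name are all hypothetical.

```python
def exposure_from_brightness(lux, target_lux_seconds=0.05,
                             min_exposure_s=1e-4, max_exposure_s=1e-1):
    """Hypothetical brightness-to-exposure conversion: exposure ~ target / brightness."""
    if lux <= 0:
        return max_exposure_s  # darkest case: fall back to the longest supported exposure
    exposure = target_lux_seconds / lux
    # Clamp to the range the sensor is assumed to support.
    return max(min_exposure_s, min(exposure, max_exposure_s))

# Example: a bright scene (10000 lux) yields a much shorter exposure than a dim one (50 lux).
bright = exposure_from_brightness(10000.0)
dim = exposure_from_brightness(50.0)
```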
In the embodiments of this application, when the exposure data at the first moment is determined, in addition to checking whether the brightness change value is within the first interval, whether the brightness data at the first moment is valid can also be checked, so as to ensure that the brightness data collected by the brightness sensor is correct.
In some embodiments, the terminal device can determine, based on a second interval corresponding to the brightness data, whether the brightness data at the first moment is valid. If the brightness data at the first moment is valid, the brightness change value is determined based on the brightness data at the first moment and the brightness data at the second moment, and the way of determining the exposure data at the first moment is then determined based on the brightness change value. If the brightness data at the first moment is invalid, the terminal device can directly determine the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
It should be noted that the above second interval is used to measure whether the brightness data is valid and can be set according to the actual situation. When the brightness data is within the second interval, the brightness data can be determined to be valid; when the brightness data is not within the second interval, the brightness data can be determined to be invalid. For example, the second interval can be set to be greater than 0.
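Taken together, the validity check against the second interval and the ratio check against the first interval amount to a small decision routine for the start-up exposure. The following Python sketch is one possible reading of that logic, using the example values given above (a first interval of 0.7-1.3 and a second interval of greater than 0); the function name, parameter names and the recompute callback are assumptions for illustration, not the patent's literal implementation.

```python
def choose_startup_exposure(
    lux_now,            # brightness data at the first moment (camera start)
    lux_at_close,       # brightness data stored at the second moment (last shutdown)
    exposure_at_close,  # exposure data stored at the second moment
    recompute,          # callable: brightness -> exposure, used when brightness changed a lot
    ratio_low=0.7,      # first interval lower bound (example value from the description)
    ratio_high=1.3,     # first interval upper bound (example value from the description)
):
    """Return the exposure data to use at the first moment (camera startup)."""
    # Second interval check: the example above treats brightness data as valid only if > 0.
    # Both readings are guarded here so the ratio below is well defined.
    if lux_now is None or lux_now <= 0 or lux_at_close is None or lux_at_close <= 0:
        # The brightness change cannot be judged, so reuse the stored exposure directly.
        return exposure_at_close

    # Brightness change value: ratio of the first-moment to second-moment brightness.
    change = lux_now / lux_at_close

    if ratio_low <= change <= ratio_high:
        # Ambient brightness has not changed much: reuse the shutdown exposure.
        return exposure_at_close

    # Large brightness change: recompute exposure from the current brightness.
    return recompute(lux_now)
```

For example, choose_startup_exposure(lux_now=800.0, lux_at_close=750.0, exposure_at_close=0.01, recompute=exposure_from_brightness) would return the stored exposure, because 800/750 is about 1.07 and falls inside the first interval.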
Step 304: The terminal device performs exposure control on the camera component based on the exposure data of the camera component at the first moment.
In this application, after the terminal device determines the exposure data at the first moment, it can perform exposure control on the camera component based on the exposure data at the first moment.
In some embodiments, the exposure data at the first moment can be used as the start-of-stream exposure data of the camera device; subsequently, the terminal device obtains the ambient brightness in real time from the optical sensor and performs automatic exposure control in real time through AEC.
With the exposure control method provided by the embodiments of this application, because the exposure data of the camera component at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, the exposure data at the first moment can be determined in different ways for different brightness changes, which avoids the screen-flash overexposure caused by mismatched exposure data when the brightness changes greatly and improves the user experience.
The actions of the exposure control method of the embodiments of the present application at various moments are described below. FIG. 5 is a schematic diagram of exposure control in different image frames according to an embodiment of the present application. As shown in FIG. 5, the camera component is started at frame 0; frames -4 to -1 are before the camera component is started, and frames 1 to 6 are after the camera component is started.
Suppose the terminal device receives the startup instruction of the camera component at frame -1, starts the camera component at frame 0, and captures images or video in frames 1 to 6. Correspondingly, in frames -4 to -2, although the camera component is off, the optical sensor is still collecting brightness data and reporting it to the AEC statistics. When the terminal device receives the startup instruction at frame -1, it can, while turning on the camera component, determine at frame -1 the exposure data for this startup of the camera component (the stream-start exposure value).
For example, the terminal device may compare the brightness data at this startup of the camera component (the brightness data obtained at frame -1, or at frames -4 to -2) with the pre-stored brightness data saved when the camera component was last turned off, and determine the brightness change value. If the brightness change value is too large, the stream-start exposure value is determined from the brightness data at this startup; if the brightness change is small, the exposure data stored at the last shutdown is used directly as the stream-start exposure value.
Still referring to FIG. 5, the stream-start exposure data can be used for the exposure control of the first few frames (for example, the first 3 frames). Afterwards, the AEC algorithm can calculate an exposure value for each frame and use it to adjust the exposure of subsequent frames. For example, the exposure value of frame 1 can be used to adjust frame 4, the exposure value of frame 2 can be used to adjust frame 5, and the exposure value of frame 3 can be used to adjust frame 6.
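As a minimal sketch of this three-frame pipeline delay (the fixed delay of three frames follows the example above; the function names are illustrative assumptions):

    AEC_PIPELINE_DELAY = 3  # exposure computed on frame k takes effect on frame k + 3

    def exposure_plan(stream_start_exposure: int, per_frame_exposures: dict[int, int],
                      num_frames: int) -> dict[int, int]:
        """Return the exposure applied to each captured frame.

        Frames 1..AEC_PIPELINE_DELAY use the stream-start exposure; from then on,
        frame k is adjusted with the exposure the AEC computed on frame k - 3."""
        plan = {}
        for frame in range(1, num_frames + 1):
            if frame <= AEC_PIPELINE_DELAY:
                plan[frame] = stream_start_exposure
            else:
                plan[frame] = per_frame_exposures[frame - AEC_PIPELINE_DELAY]
        return plan

    # Example: frames 1-3 use the stream-start value; frame 4 uses frame 1's AEC result.
    print(exposure_plan(400, {1: 380, 2: 360, 3: 350}, num_frames=6))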
How to determine the exposure data of the camera component at the first moment is described in detail below.
FIG. 6 is a schematic flowchart of another exposure control method according to an embodiment of the present application. As shown in FIG. 6, the exposure control method provided in this embodiment is applied to a terminal device on which a camera component and an optical sensor are disposed, and the method includes the following steps.
Step 601: Obtain the brightness data collected by the optical sensor at the first moment.
The brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup instruction.
Step 602: Obtain the brightness data collected by the optical sensor at the second moment and the exposure data at the second moment.
The second moment is the moment at which the camera component was last turned off before the first moment.
Step 603: Determine, based on the second interval corresponding to the brightness data, whether the brightness data at the first moment is valid.
If yes, perform step 605; if no, perform step 604.
Step 604: Determine the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
After step 604, perform step 608.
Step 605: Determine the brightness change value based on the brightness data at the first moment and the brightness data at the second moment.
The brightness change value is used to characterize the degree of change of the ambient brightness of the camera component, and includes the ratio of the brightness data at the first moment to the brightness data at the second moment.
Step 606: Determine whether the brightness change value is within the first interval.
If yes, perform step 604; if no, perform step 607.
Step 607: Determine the exposure data of the camera component at the first moment based on the brightness data at the first moment.
Step 608: Perform exposure control on the camera component based on the exposure data of the camera component at the first moment.
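As a minimal end-to-end sketch of the flow of steps 601 to 608 (the helpers read_current_brightness, load_shutdown_state, exposure_from_brightness, and apply_exposure are assumptions for illustration; only the branching mirrors the flowchart):

    FIRST_INTERVAL = (0.7, 1.3)   # example bounds for the brightness ratio (step 606)

    def startup_exposure_flow(read_current_brightness, load_shutdown_state,
                              exposure_from_brightness, apply_exposure) -> None:
        # Steps 601-602: brightness at the first moment; brightness and exposure at the second moment.
        brightness_now = read_current_brightness()
        brightness_at_close, exposure_at_close = load_shutdown_state()

        # Step 603: validity check against the second interval (here: must be > 0).
        if brightness_now <= 0:
            exposure_now = exposure_at_close                              # step 604
        else:
            # Step 605: brightness change value (ratio); max() is only a
            # divide-by-zero safeguard, not a branch of the flowchart.
            ratio = brightness_now / max(brightness_at_close, 1e-6)
            if FIRST_INTERVAL[0] <= ratio <= FIRST_INTERVAL[1]:           # step 606
                exposure_now = exposure_at_close                          # step 604
            else:
                exposure_now = exposure_from_brightness(brightness_now)   # step 607

        apply_exposure(exposure_now)                                      # step 608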
In the exposure control method provided by the embodiments of the present application, since the exposure data of the camera component at the first moment is determined from the brightness data at the first moment and the brightness data at the second moment, different ways of determining the exposure data at the first moment can be adopted for different brightness changes. This avoids the flash screen overexposure caused by mismatched exposure data when the brightness changes significantly, and improves the user experience.
A person of ordinary skill in the art can understand that all or some of the steps of the foregoing method embodiments may be implemented by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the foregoing method embodiments. The foregoing storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
FIG. 7 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application. The exposure control apparatus may be implemented by software, hardware, or a combination of the two, for example, the terminal device or the processor of the terminal device in the foregoing embodiments, to perform the exposure control method in the foregoing embodiments. As shown in FIG. 7, the exposure control apparatus 700 includes an obtaining module 701, a processing module 702, and a control module 703.
The obtaining module 701 is configured to: in response to a startup instruction of the camera component, obtain brightness data collected by the optical sensor at a first moment, where the brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup instruction; and obtain brightness data collected by the optical sensor at a second moment, where the second moment is the moment at which the camera component was last turned off before the first moment.
The processing module 702 is configured to determine the exposure data of the camera component at the first moment based on the brightness data at the first moment and the brightness data at the second moment.
The control module 703 is configured to perform automatic exposure control on the camera component based on the exposure data of the camera component at the first moment.
In an optional implementation, the processing module 702 is specifically configured to: determine the brightness change value based on the brightness data at the first moment and the brightness data at the second moment, where the brightness change value is used to characterize the degree of change of the ambient brightness of the camera component; and determine the exposure data of the camera component at the first moment based on the brightness change value.
In an optional implementation, the brightness change value includes the ratio of the brightness data at the first moment to the brightness data at the second moment.
In an optional implementation, the processing module 702 is specifically configured to: determine whether the brightness change value is within the first interval; if the brightness change value is within the first interval, obtain the exposure data of the camera component at the second moment; and determine the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
In an optional implementation, the processing module 702 is specifically configured to: if the brightness change value is not within the first interval, determine the exposure data of the camera component at the first moment based on the brightness data at the first moment.
In an optional implementation, the processing module 702 is further configured to: determine, based on the second interval corresponding to the brightness data, whether the brightness data at the first moment is valid; and if the brightness data at the first moment is valid, determine the brightness change value based on the brightness data at the first moment and the brightness data at the second moment.
In an optional implementation, the processing module 702 is further configured to: if the brightness data at the first moment is invalid, determine the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
In an optional implementation, the obtaining module 701 is further configured to: receive a shutdown instruction of the camera component; and obtain the brightness data collected by the optical sensor at the second moment and the exposure data of the camera component at the second moment.
The processing module 702 is further configured to store the brightness data at the second moment and the exposure data at the second moment.
In an optional implementation, the optical sensor includes a multispectral sensor.
It should be noted that the exposure control apparatus provided in the embodiment shown in FIG. 7 may be used to perform the method provided in any of the foregoing embodiments; the specific implementation and technical effects are similar and are not described here again.
FIG. 8 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present application. As shown in FIG. 8, the terminal device includes a processor 801, a communication line 804, and at least one communication interface (the communication interface 803 is used as an example in FIG. 8 for description).
The processor 801 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of the present application.
The communication line 804 may include a circuit that transfers information between the foregoing components.
The communication interface 803 uses any transceiver-like apparatus to communicate with other devices or communication networks, such as Ethernet or wireless local area networks (WLAN).
Possibly, the terminal device may further include a memory 802.
The memory 802 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or another magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory may exist independently and be connected to the processor through the communication line 804, or the memory may be integrated with the processor.
The memory 802 is configured to store computer-executable instructions for executing the solutions of the present application, and the execution is controlled by the processor 801. The processor 801 is configured to execute the computer-executable instructions stored in the memory 802, thereby implementing the exposure control method provided by the embodiments of the present application.
Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
In a specific implementation, as an embodiment, the processor 801 may include one or more CPUs, for example, CPU0 and CPU1 in FIG. 8.
In a specific implementation, as an embodiment, the terminal device may include a plurality of processors, for example, the processor 801 and the processor 805 in FIG. 8. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor here may refer to one or more devices, circuits, and/or processing cores for processing data (for example, computer program instructions).
For example, FIG. 9 is a schematic structural diagram of a chip according to an embodiment of the present application. The chip 90 includes one or more (including two) processors 910 and a communication interface 930.
In some implementations, the memory 940 stores the following elements: executable modules or data structures, a subset thereof, or an extended set thereof.
In the embodiments of the present application, the memory 940 may include a read-only memory and a random access memory, and provides instructions and data to the processor 910. A part of the memory 940 may further include a non-volatile random access memory (NVRAM).
In the embodiments of the present application, the processor 910, the communication interface 930, and the memory 940 are coupled together through a bus system 920. In addition to a data bus, the bus system 920 may further include a power bus, a control bus, a status signal bus, and the like. For ease of description, the various buses are all labeled as the bus system 920 in FIG. 9.
The methods described in the foregoing embodiments of the present application may be applied to, or implemented by, the processor 910. The processor 910 may be an integrated circuit chip with signal processing capability. In an implementation process, the steps of the foregoing methods may be completed by integrated logic circuits of hardware in the processor 910 or by instructions in the form of software. The processor 910 may be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component. The processor 910 may implement or perform the methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
The steps of the methods disclosed with reference to the embodiments of the present application may be directly embodied as being performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 940; the processor 910 reads the information in the memory 940 and completes the steps of the foregoing methods in combination with its hardware.
In the foregoing embodiments, the instructions stored in the memory and executed by the processor may be implemented in the form of a computer program product. The computer program product may be written into the memory in advance, or may be downloaded and installed in the memory in the form of software.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated entirely or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device such as a server or data center integrating one or more usable media. For example, the usable medium may include a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a digital versatile disc (DVD)), or a semiconductor medium (for example, a solid state disk (SSD)).
The embodiments of the present application further provide a computer-readable storage medium. The methods described in the foregoing embodiments may be implemented entirely or partially by software, hardware, firmware, or any combination thereof. The computer-readable medium may include a computer storage medium and a communication medium, and may further include any medium that can transfer a computer program from one place to another. The storage medium may be any target medium accessible by a computer.
As a possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), a RAM, a ROM, an EEPROM, or other optical disc storage; the computer-readable medium may include a magnetic disk memory or another magnetic disk storage device. Moreover, any connection line may also be appropriately referred to as a computer-readable medium. For example, if software is transmitted from a website, server, or other remote source using a coaxial cable, an optical fiber cable, a twisted pair, a DSL, or wireless technologies (such as infrared, radio, and microwave), then the coaxial cable, optical fiber cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disks and discs as used herein include compact discs (CD), laser discs, optical discs, digital versatile discs (DVD), floppy disks, and Blu-ray discs, where disks usually reproduce data magnetically and discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing descriptions are merely specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

  1. An exposure control method, applied to a terminal device, wherein a camera component and an optical sensor are disposed on the terminal device, and the method comprises:
    in response to a startup instruction of the camera component, obtaining brightness data collected by the optical sensor at a first moment, wherein the brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup instruction;
    obtaining brightness data collected by the optical sensor at a second moment, wherein the second moment is the moment at which the camera component was last turned off before the first moment;
    determining exposure data of the camera component at the first moment according to the brightness data at the first moment and the brightness data at the second moment; and
    performing exposure control on the camera component according to the exposure data of the camera component at the first moment.
  2. The method according to claim 1, wherein the determining exposure data of the camera component at the first moment according to the brightness data at the first moment and the brightness data at the second moment comprises:
    determining a brightness change value according to the brightness data at the first moment and the brightness data at the second moment, wherein the brightness change value is used to characterize the degree of change of the ambient brightness of the camera component; and
    determining the exposure data of the camera component at the first moment according to the brightness change value.
  3. The method according to claim 2, wherein the brightness change value comprises a ratio of the brightness data at the first moment to the brightness data at the second moment.
  4. The method according to claim 3, wherein the determining the exposure data of the camera component at the first moment according to the brightness change value comprises:
    determining whether the brightness change value is within a first interval;
    if the brightness change value is within the first interval, obtaining exposure data of the camera component at the second moment; and
    determining the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
  5. The method according to claim 4, wherein after the determining whether the brightness change value is within the first interval, the method further comprises:
    if the brightness change value is not within the first interval, determining the exposure data of the camera component at the first moment according to the brightness data at the first moment.
  6. The method according to claim 2, wherein before the determining the exposure data of the camera component at the first moment according to the brightness change value, the method further comprises:
    determining, according to a second interval corresponding to the brightness data, whether the brightness data at the first moment is valid; and
    the determining the exposure data of the camera component at the first moment according to the brightness change value comprises:
    if the brightness data at the first moment is valid, determining the brightness change value according to the brightness data at the first moment and the brightness data at the second moment.
  7. The method according to claim 6, wherein after the determining, according to the second interval corresponding to the brightness data, whether the brightness data at the first moment is valid, the method further comprises:
    if the brightness data at the first moment is invalid, determining the exposure data of the camera component at the second moment as the exposure data of the camera component at the first moment.
  8. The method according to any one of claims 1 to 7, wherein before the obtaining brightness data collected by the optical sensor at the first moment, the method further comprises:
    receiving a shutdown instruction of the camera component;
    obtaining the brightness data collected by the optical sensor at the second moment and the exposure data of the camera component at the second moment; and
    storing the brightness data at the second moment and the exposure data at the second moment.
  9. The method according to any one of claims 1 to 7, wherein the optical sensor comprises a multispectral sensor.
  10. An exposure control apparatus, wherein the apparatus comprises:
    an obtaining module, configured to: in response to a startup instruction of a camera component, obtain brightness data collected by an optical sensor at a first moment, wherein the brightness data is used to characterize the ambient brightness of the camera component, and the first moment is the startup moment of the camera component under the startup instruction; and obtain brightness data collected by the optical sensor at a second moment, wherein the second moment is the moment at which the camera component was last turned off before the first moment;
    a processing module, configured to determine exposure data of the camera component at the first moment according to the brightness data at the first moment and the brightness data at the second moment; and
    a control module, configured to perform automatic exposure control on the camera component according to the exposure data of the camera component at the first moment.
  11. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the computer program, the terminal device is caused to perform the method according to any one of claims 1 to 9.
  12. A computer-readable storage medium storing a computer program, wherein when the computer program is executed by a processor, a computer is caused to perform the method according to any one of claims 1 to 9.
  13. A computer program product, comprising a computer program, wherein when the computer program is run, a computer is caused to perform the method according to any one of claims 1 to 9.
PCT/CN2023/094365 2022-06-29 2023-05-15 曝光控制方法、装置和终端设备 WO2024001579A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210751140.7A CN116095497B (zh) 2022-06-29 2022-06-29 曝光控制方法、装置和终端设备
CN202210751140.7 2022-06-29

Publications (2)

Publication Number Publication Date
WO2024001579A1 true WO2024001579A1 (zh) 2024-01-04
WO2024001579A9 WO2024001579A9 (zh) 2024-03-21

Family

ID=86205154

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/094365 WO2024001579A1 (zh) 2022-06-29 2023-05-15 曝光控制方法、装置和终端设备

Country Status (2)

Country Link
CN (1) CN116095497B (zh)
WO (1) WO2024001579A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116095497B (zh) * 2022-06-29 2023-10-20 荣耀终端有限公司 曝光控制方法、装置和终端设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611037A (zh) * 2015-07-22 2016-05-25 宇龙计算机通信科技(深圳)有限公司 终端的控制方法、终端的控制装置和终端
WO2019192320A1 (zh) * 2018-04-04 2019-10-10 杭州海康威视数字技术股份有限公司 曝光方法、装置及摄像设备
CN110519526A (zh) * 2019-09-09 2019-11-29 Oppo广东移动通信有限公司 曝光时长控制方法、装置、存储介质及电子设备
JP2020170985A (ja) * 2019-04-05 2020-10-15 キヤノン株式会社 撮像装置およびその制御方法
CN113556477A (zh) * 2021-09-23 2021-10-26 南昌龙旗信息技术有限公司 环境亮度确定方法、装置、介质、产品及摄像头
CN116095497A (zh) * 2022-06-29 2023-05-09 荣耀终端有限公司 曝光控制方法、装置和终端设备

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3615462B2 (ja) * 2000-05-12 2005-02-02 三洋電機株式会社 自動露光調節カメラ
CN105744178A (zh) * 2016-04-15 2016-07-06 惠州Tcl移动通信有限公司 一种摄像头启动时亮度的控制方法、系统及摄像终端
CN109474790B (zh) * 2018-11-05 2021-06-15 浙江大华技术股份有限公司 曝光调整方法、装置和摄像机及计算机存储介质
CN109729279B (zh) * 2018-12-20 2020-11-17 华为技术有限公司 一种图像拍摄方法和终端设备
CN110636230B (zh) * 2019-10-31 2021-03-05 Oppo广东移动通信有限公司 一种曝光调节方法、装置、设备及存储介质
WO2022067496A1 (zh) * 2020-09-29 2022-04-07 深圳市大疆创新科技有限公司 一种相机快速自动曝光的方法以及存储介质
CN114339060A (zh) * 2020-09-30 2022-04-12 宇龙计算机通信科技(深圳)有限公司 一种调节曝光的方法、装置、存储介质及电子设备
CN112153305A (zh) * 2020-10-22 2020-12-29 努比亚技术有限公司 一种相机启动方法、移动终端以及计算机存储介质
CN112738493B (zh) * 2020-12-28 2023-03-14 Oppo广东移动通信有限公司 图像处理方法、图像处理装置、电子设备及可读存储介质
CN113364993B (zh) * 2021-07-23 2023-04-18 北京字节跳动网络技术有限公司 曝光参数值处理方法、装置和电子设备
CN113810601B (zh) * 2021-08-12 2022-12-20 荣耀终端有限公司 终端的图像处理方法、装置和终端设备

Also Published As

Publication number Publication date
CN116095497B (zh) 2023-10-20
WO2024001579A9 (zh) 2024-03-21
CN116095497A (zh) 2023-05-09

Similar Documents

Publication Publication Date Title
WO2020244623A1 (zh) 一种空鼠模式实现方法及相关设备
WO2020057198A1 (zh) 图像处理方法、装置、电子设备及存储介质
WO2021213341A1 (zh) 视频拍摄方法及电子设备
WO2017096857A1 (zh) 相机拍摄参数调整方法及装置
JP2017123671A (ja) 撮像装置
WO2024001579A1 (zh) 曝光控制方法、装置和终端设备
KR102149448B1 (ko) 이미지를 처리하기 위한 전자 장치 및 방법
US8417293B2 (en) Electronic device, method of controlling the same, and program
CN105282372A (zh) 摄像机命令集主机命令转换
WO2017173585A1 (zh) 一种拍照方法及终端
US20200068097A1 (en) Frame Synchronization Method For Image Data, Image Signal Processing Apparatus, And Terminal
CN115526787B (zh) 视频处理方法和装置
CN114996168B (zh) 一种多设备协同测试方法、测试设备及可读存储介质
WO2019104633A1 (zh) 无人机系统和通信方法
US20140032551A1 (en) Communication apparatus, method of controlling the communication apparatus, and recording medium
CN114205336A (zh) 跨设备音频播放方法、移动终端、电子设备及存储介质
CN115665342A (zh) 图像处理方法、图像处理电路、电子设备和可读存储介质
US11120272B2 (en) Imaging apparatus, electronic device, and method of transmitting image data
US20150189151A1 (en) Information processing apparatus, imaging apparatus, information processing method, information processing program, and imaging system
WO2020248705A1 (zh) 摄像机及摄像机启动方法、装置
CN107872558B (zh) 一种智能电子设备、图像处理单元、图像采集装置和图像采集方法
WO2022170866A1 (zh) 数据传输方法、装置及存储介质
JP5975005B2 (ja) 画像処理装置、情報処理装置、及び画像転送方法
JP2007006125A (ja) 画像処理支援装置、電子カメラ、画像処理装置、現像処理システム、並びにこれらの画像処理支援装置および画像処理装置を実現するプログラム
JP2015111817A (ja) 撮像装置、操作端末及びそれらの制御方法、システム、並びにプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23829765

Country of ref document: EP

Kind code of ref document: A1