CN116723410B - Method and device for adjusting frame interval - Google Patents

Method and device for adjusting frame interval

Info

Publication number
CN116723410B
CN116723410B (application CN202211131516.0A)
Authority
CN
China
Prior art keywords: frame, module, interval, frames, frame interval
Prior art date
Legal status: Active
Application number
CN202211131516.0A
Other languages
Chinese (zh)
Other versions
CN116723410A (en)
Inventor
Bai Chunyu (白春玉)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211131516.0A
Publication of CN116723410A
Application granted
Publication of CN116723410B


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in communication networks in wireless communication networks

Abstract

The application relates to the field of terminal shooting and provides a method and a device for adjusting a frame interval. The method includes: an AE module obtains a first frame shot by a main camera and N second frames shot by N auxiliary cameras, where N is a positive integer; the AE module determines whether the frame interval of the first frame and the frame intervals of the N second frames are integer multiples of the ambient light brightness change period; when the frame interval of a target frame among the first frame and the N second frames is not an integer multiple of the ambient light brightness change period, the AE module determines whether the target frame is a synchronous frame; and when the target frame is an unsynchronized frame, the AE module adjusts the frame interval of the target frame based on an anti-flicker algorithm. The method can prevent drastic frame rate changes after the flicker elimination function is turned on.

Description

Method and device for adjusting frame interval
Technical Field
The application relates to the field of terminal shooting, in particular to a method and a device for adjusting a frame interval.
Background
Many terminal devices have a photographing function. Its basic principle is to record photons entering the lens with an image sensor and to display the recorded result (i.e., an image or video) on a screen. In some cases, the environment in which the terminal device is located contains a light source whose brightness changes (such as a lamp powered by alternating current). When the exposure time of the image sensor is not an integer multiple of the brightness change period of the light source, the signal intensities accumulated at different positions of the image differ, so that alternating bright and dark stripes (banding) appear on the image. When the screen displays the preview video, the stripes also appear to move. This phenomenon of stripes on an image or video is known as flicker.
The essence of flicker is that the phase of a pixel's exposure starting point relative to the alternating current changes continuously. One method of eliminating flicker is to turn on an anti-flicker mode and set the frame interval to an integer multiple of the brightness change period, so that each frame accumulates the same number of photons during its exposure time and the stripes on the video no longer move.
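As a minimal sketch of this idea (not taken from the patent text; the tolerance value, the rounding direction, and all function names are assumptions), the check and adjustment can be expressed as follows:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Sketch: test whether a frame interval is an integer multiple of the ambient
// light brightness change period and, if not, round it up to the next multiple.
constexpr double kToleranceMs = 0.05;  // assumed tolerance, not from the patent

bool IsIntegerMultiple(double frameIntervalMs, double brightnessPeriodMs) {
    double ratio = frameIntervalMs / brightnessPeriodMs;
    return std::fabs(ratio - std::round(ratio)) * brightnessPeriodMs < kToleranceMs;
}

double SnapToMultiple(double frameIntervalMs, double brightnessPeriodMs) {
    // Round up so the adjusted interval is never shorter than the current one.
    double n = std::max(1.0, std::ceil(frameIntervalMs / brightnessPeriodMs));
    return n * brightnessPeriodMs;
}

int main() {
    double brightnessPeriodMs = 1000.0 / (2 * 50);  // 50 Hz mains -> 10 ms period
    double frameIntervalMs = 1000.0 / 30;           // 30 fps -> 33.33 ms
    if (!IsIntegerMultiple(frameIntervalMs, brightnessPeriodMs)) {
        frameIntervalMs = SnapToMultiple(frameIntervalMs, brightnessPeriodMs);
    }
    std::printf("adjusted frame interval: %.2f ms\n", frameIntervalMs);  // 40.00 ms
    return 0;
}
```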
In the process of eliminating flicker, the terminal device needs to adjust the frame interval of the camera so that it is an integer multiple of the ambient light brightness change period. However, in some photographing modes (e.g., portrait mode and large aperture mode), after the user turns on the flicker elimination function, the frame rate of the camera may change drastically, seriously affecting the shooting effect.
Disclosure of Invention
The embodiments of the application provide a method and a device for adjusting a frame interval, which can prevent drastic frame rate changes after the flicker elimination function is turned on.
In a first aspect, a method for adjusting a frame interval is provided. The method is applied to a terminal device in which an anti-flicker mode is turned on, the terminal device includes an auto exposure (AE) module, and the method includes: the AE module obtains a first frame shot by a main camera and N second frames shot by N auxiliary cameras, where N is a positive integer; the AE module determines whether the frame interval of the first frame and the frame intervals of the N second frames are integer multiples of the ambient light brightness change period; when the frame interval of a target frame among the first frame and the N second frames is not an integer multiple of the ambient light brightness change period, the AE module determines whether the target frame is a synchronous frame; and when the target frame is an unsynchronized frame, the AE module adjusts the frame interval of the target frame based on an anti-flicker algorithm.
When multiple cameras work simultaneously, the frames acquired by the cameras need to be aligned by time stamp. A common practice is to increase the frame length lines of the main camera (main shot) or the auxiliary camera (auxiliary shot) at a synchronization frame so that the time stamps of the frames acquired by the main shot and the auxiliary shot can be aligned. Increasing the frame length lines changes the frame interval of the main shot or the auxiliary shot (e.g., lowers the frame rate), so that the frame interval deviates from an integer multiple of the ambient light brightness change period. When the AE module finds that the frame interval of the main shot or the auxiliary shot is not an integer multiple of the ambient light brightness change period, it adjusts the frame interval back to an integer multiple of that period; as a result, however, the main shot and the auxiliary shot become unsynchronized again, the cycle repeats, and the frame rate of the main shot or the auxiliary shot ultimately changes drastically.
In the above method provided by the application, with the anti-flicker mode turned on, the AE module first identifies whether the frame intervals of the current frames (the first frame and the N second frames) are integer multiples of the ambient light brightness change period; if so, the frame intervals do not need to be adjusted, and if not, they do. When a frame interval needs to be adjusted, the AE module judges whether the target frame (the frame among the first frame and the N second frames whose frame interval needs to be adjusted) is a synchronization frame; only if the target frame is not a synchronization frame does the anti-flicker algorithm adjust the frame interval. This avoids the situation in which two algorithms adjust the frame interval at the same time and cause the frame rate of the main shot or the auxiliary shot to change drastically, as illustrated in the sketch below.
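A minimal sketch of this decision flow, with hypothetical types and names (the patent provides no code, and the tolerance used in the multiple check is an assumption):

```cpp
#include <cmath>
#include <vector>

// Hypothetical types; the names are illustrative and do not come from the patent.
struct Frame {
    double frameIntervalMs;
    bool   isSyncFrame;   // identification indicating a synchronization frame
};

class AeModule {
public:
    // First frame (main camera) plus N second frames (auxiliary cameras).
    void OnFrames(Frame& first, std::vector<Frame>& seconds, double periodMs) {
        CheckAndAdjust(first, periodMs);
        for (Frame& f : seconds) {
            CheckAndAdjust(f, periodMs);
        }
    }

private:
    void CheckAndAdjust(Frame& target, double periodMs) {
        if (IsIntegerMultiple(target.frameIntervalMs, periodMs)) {
            return;  // already an integer multiple: no adjustment needed
        }
        if (target.isSyncFrame) {
            return;  // leave synchronization frames to the frame sync algorithm
        }
        // Unsynchronized frame: the anti-flicker algorithm adjusts the interval.
        double n = std::ceil(target.frameIntervalMs / periodMs);
        target.frameIntervalMs = n * periodMs;
    }

    static bool IsIntegerMultiple(double intervalMs, double periodMs) {
        double ratio = intervalMs / periodMs;
        return std::fabs(ratio - std::round(ratio)) * periodMs < 0.05;  // assumed tolerance
    }
};
```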
Optionally, the terminal device further comprises a sensor module, and the method further comprises: the sensor module determines whether the AE module adjusts a frame interval of a current frame, the current frame being the first frame and/or the N second frames; when the AE module adjusts the frame interval of the current frame, the sensor module determines not to adjust the frame intervals of the main camera and the N auxiliary cameras; when the AE module does not adjust the frame interval of the current frame, the sensor module adjusts the frame intervals of the primary camera and the N secondary cameras.
In some cases, to reduce power consumption, the sensor module may lower the output frame rate of the cameras; for example, when the frame rates of the main shot and the auxiliary shot are both 30 fps, the sensor module may reduce them by a factor of 1.25 to 24 fps. However, the sub-module that updates the main-shot frame rate and the sub-module that updates the auxiliary-shot frame rate are two different sub-modules within the sensor module: the input of the former is the output of the AE module, while the input of the latter is the output of the image sensor. Therefore, if the AE module has already adjusted the frame rate and the sensor module adjusts it again, the frame rates of the main shot and the auxiliary shot become different and the two go out of synchronization. In this embodiment, the sensor module adjusts the frame rate only when the AE module has not adjusted it, so that the inputs of the two sub-modules are the same (both being the output of the image sensor); this reduces power consumption while preventing the main shot and the auxiliary shot from becoming unsynchronized.
Optionally, the sensor module determines whether the AE module adjusts a frame interval of the current frame, including: the sensor module receives a first frame interval of the current frame from an image sensor; the sensor module receives a second frame interval of the current frame from the AE module; when the first frame interval is the same as the second frame interval, the sensor module determines that the AE module does not adjust the frame interval of the current frame; when the first frame interval is different from the second frame interval, the sensor module determines that the AE module adjusts the frame interval of the current frame.
The sensor module decides whether to adjust the frame rate by comparing the frame intervals obtained from the AE module and from the image sensor. If the two frame intervals are the same, the AE module has not adjusted the frame rate, and the sensor module may adjust it. If the two frame intervals are different, the AE module has already adjusted the frame rate, and the sensor module does not adjust it again.
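A minimal sketch of this comparison, assuming a simple tolerance-based equality check and reusing the 1.25x power-saving factor from the example above (all names are hypothetical):

```cpp
#include <cmath>

// Hypothetical sketch of the sensor module decision: the frame interval
// reported by the image sensor is compared with the one output by the AE
// module; only when they match (the AE module made no change) does the sensor
// module apply its own power-saving adjustment.
struct FrameIntervals {
    double fromImageSensorMs;  // first frame interval, received from the image sensor
    double fromAeModuleMs;     // second frame interval, received from the AE module
};

bool AeModuleAdjusted(const FrameIntervals& in) {
    return std::fabs(in.fromImageSensorMs - in.fromAeModuleMs) > 1e-3;
}

double SensorModuleInterval(const FrameIntervals& in) {
    if (AeModuleAdjusted(in)) {
        return in.fromAeModuleMs;  // keep the AE module's adjustment untouched
    }
    // Power-saving path from the example above: reduce 30 fps by a factor of
    // 1.25 to 24 fps, i.e. lengthen the interval from 33.33 ms to 41.67 ms.
    constexpr double kReductionFactor = 1.25;
    return in.fromImageSensorMs * kReductionFactor;
}
```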
Optionally, before the AE module determines whether the target frame is a sync frame, the method further includes: the AE module determines the frequency of ambient light; when the frequency of the ambient light is a preset frequency, the AE module determines whether the target frame is a synchronization frame.
If the ambient light frequency is not one of the preset frequencies, the AE module does not execute the anti-flicker algorithm to adjust the frame interval of the target frame, so the anti-flicker algorithm and the frame synchronization algorithm cannot conflict, and there is no need to judge whether the target frame is a synchronous frame.
Optionally, the AE module adjusts a frame interval of the target frame based on an anti-flicker algorithm, including: the AE module adjusts the frame interval of the target frame by adjusting the frame length line of the camera corresponding to the target frame, wherein the adjusted frame interval of the main camera is the same as the adjusted frame intervals of the N auxiliary cameras, and the adjusted frame interval of the main camera and the adjusted frame interval of the N auxiliary cameras are integer multiples of the brightness change period of the ambient light, which is the inverse of the frequency of the ambient light.
Adjusting the frame interval of the unsynchronized frame so that it becomes an integer multiple of the ambient light brightness change period achieves the anti-flicker effect while avoiding drastic frame rate changes.
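A minimal sketch of this adjustment, assuming a rolling-shutter sensor whose frame interval equals the frame length lines multiplied by the line readout time (the names, the line-time parameter, and the upward rounding are assumptions, not from the patent):

```cpp
#include <cmath>
#include <cstdint>

// Hypothetical sketch: the frame interval is frame length lines x line readout
// time, so adjusting the frame length lines adjusts the frame interval. The
// target interval is the smallest integer multiple of the ambient light
// brightness change period (given in the text as the inverse of the ambient
// light frequency) that is not shorter than the current interval.
uint32_t AdjustedFrameLengthLines(double currentIntervalMs,
                                  double lineTimeMs,
                                  double ambientLightFreqHz) {
    double brightnessPeriodMs = 1000.0 / ambientLightFreqHz;
    double multiples = std::ceil(currentIntervalMs / brightnessPeriodMs);
    double targetIntervalMs = multiples * brightnessPeriodMs;
    return static_cast<uint32_t>(std::ceil(targetIntervalMs / lineTimeMs));
}
```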
Optionally, the method further comprises: when the frequency of the ambient light is not a preset frequency, the AE module determines the brightness of the ambient light; the AE module determines exposure according to the brightness of the ambient light; and the AE module adjusts the frame interval of the main camera and the auxiliary camera according to the exposure.
If the ambient light frequency is not one of the preset frequencies, the AE module does not execute the anti-flicker algorithm to adjust the frame interval of the first frame or the frame intervals of the N second frames; instead, it can directly adjust the frame intervals of the main camera and the auxiliary cameras according to the exposure.
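A minimal sketch of this fallback path, assuming a placeholder brightness-to-exposure mapping and the common constraint that the frame interval must at least contain the exposure time (neither detail is specified by the patent):

```cpp
#include <algorithm>

// Hypothetical fallback when the ambient light frequency is not a preset one:
// exposure is derived from the ambient light brightness, and the frame interval
// is lengthened if needed so it can contain the exposure time.
double ExposureMsFromBrightness(double ambientBrightnessLux) {
    // Placeholder assumption: darker scenes get longer exposure, capped at 50 ms.
    return std::clamp(500.0 / std::max(ambientBrightnessLux, 1.0), 1.0, 50.0);
}

double IntervalForExposure(double currentIntervalMs, double exposureMs) {
    // The frame interval must be at least long enough to hold the exposure.
    return std::max(currentIntervalMs, exposureMs);
}
```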
Optionally, the method further comprises: when the first frame and the N second frames are synchronous frames, the AE module adjusts a frame interval of the first frame or the N second frames based on a frame synchronization algorithm.
Optionally, the AE module determines whether the first frame and the N second frames are synchronous frames, including: the AE module determines whether the first frame and the N second frames are synchronous frames according to the identification of the first frame and the identifications of the N second frames, where the identification of a frame is used to indicate whether that frame is a synchronous frame.
Optionally, before the AE module obtains the first frame captured by the main camera and the N second frames captured by the N auxiliary cameras, the method further includes: a camera APP of the terminal device receives a user operation, where the user operation is used to select a multi-shot mode, the multi-shot mode being a mode in which the main camera and the N auxiliary cameras shoot jointly; and the camera APP, in response to the user operation, controls the main camera to shoot the first frame and controls the N auxiliary cameras to shoot the N second frames.
In a second aspect, there is provided an apparatus for adjusting a frame interval, comprising means for performing any of the methods of the first aspect. The device can be a terminal device or a chip in the terminal device. The apparatus may include an input unit and a processing unit.
When the apparatus is a terminal device, the processing unit may be a processor, and the input unit may be a communication interface; the terminal device may further comprise a memory for storing computer program code which, when executed by the processor, causes the terminal device to perform any of the methods of the first aspect.
When the device is a chip in the terminal equipment, the processing unit may be a logic processing unit inside the chip, and the input unit may be an output interface, a pin, a circuit, or the like; the chip may also include memory, which may be memory within the chip (e.g., registers, caches, etc.), or memory external to the chip (e.g., read-only memory, random access memory, etc.); the memory is for storing computer program code which, when executed by the processor, causes the chip to perform any of the methods of the first aspect.
In a third aspect, there is provided a computer readable storage medium storing computer program code which, when run by an apparatus for adjusting a frame interval, causes the apparatus to perform any one of the methods of the first aspect.
In a fourth aspect, there is provided a computer program product comprising: computer program code which, when run by an apparatus adjusting a frame interval, causes the apparatus to perform any one of the methods of the first aspect.
Drawings
FIG. 1 is a schematic illustration of a change in brightness of an ambient light source;
FIG. 2 is a schematic view of an exposure;
FIG. 3 is a schematic diagram of a flicker phenomenon;
fig. 4 is a schematic diagram of a hardware architecture of a terminal device suitable for use in the present application;
FIG. 5 is a schematic diagram of a software architecture suitable for use with the terminal device of the present application;
FIG. 6 is a schematic diagram of a primary and secondary camera module provided herein;
fig. 7 is a schematic diagram of a photographing scene provided in the present application;
fig. 8 is a schematic diagram of a data flow corresponding to a photographing scene provided in the present application;
FIG. 9 is a schematic diagram of a method of adjusting frame spacing provided herein;
FIG. 10 is a schematic diagram of a frame synchronization method provided herein;
fig. 11 is a schematic diagram of another method for adjusting a frame interval provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a change in brightness of an ambient light source.
The upper half of fig. 1 shows the waveform of the alternating current, whose voltage direction and magnitude vary periodically with time in the form of a sine wave. When the ambient light source is a lamp powered by alternating current, the brightness of the lamp also changes periodically, and, as shown in the lower part of fig. 1, the brightness change period is half the period of the alternating current. For example, if the period of the alternating current is 1/60 s, the period of the brightness change of the lamp is 1/120 s.
When the voltage of the alternating current is 0, the brightness of the lamp is also 0; such a moment is a dark moment. When a lamp powered by alternating current is present in the photographing environment, the exposure amount of the terminal device at a dark moment is 0 while the exposure amount at other moments is not, and therefore stripes may appear on the image photographed by the terminal device.
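Restating the relationship from the example above as a formula, with f_ac denoting the mains frequency:

```latex
T_{\text{light}} = \frac{T_{\text{ac}}}{2} = \frac{1}{2 f_{\text{ac}}},
\qquad
f_{\text{ac}} = 60\,\mathrm{Hz} \;\Rightarrow\; T_{\text{light}} = \tfrac{1}{120}\,\mathrm{s}.
```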
Fig. 2 is a schematic view of exposure.
The environment in which the terminal device is located contains a light source whose brightness changes (such as a lamp powered by alternating current), and the exposure mode of the image sensor is line-by-line exposure. When the exposure time of the image sensor is not an integer multiple of the brightness change period of the light source, the signal intensities accumulated at different positions of the image differ, producing bright and dark stripes (banding) on the image. When the screen displays the preview video, the stripes also appear to move. This phenomenon of stripes on an image or video is called flicker, as shown in fig. 3.
The essence of flicker is that the phase of a pixel's exposure starting point relative to the alternating current changes continuously. One method of eliminating flicker is to set the exposure time to an integer multiple of the ambient light brightness change period, so that each row of pixels accumulates the same number of photons during the exposure time and the stripes on the video no longer move.
The above method requires adjusting the frame interval so that it becomes an integer multiple of the ambient light brightness change period. In a dual-camera scene (scenes with three or more cameras have the same problem), in order to synthesize images shot by different cameras, the terminal device also needs to adjust the frame interval to synchronize the frames acquired by the different cameras. Two mechanisms for adjusting the frame interval therefore exist, and when both take effect at the same time, the frame interval changes drastically.
First, a hardware architecture and a software architecture suitable for the terminal device of the present application will be described.
As shown in fig. 4, the terminal device may be a mobile phone, a foldable electronic device, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, an artificial intelligence (artificial intelligence, AI) device, a wearable device, a vehicle-mounted device, a smart home device, or a smart city device. The embodiment of the application does not particularly limit the specific type of the terminal device.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera module 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and the like.
The configuration shown in fig. 4 does not constitute a specific limitation on the terminal device. In other embodiments of the present application, the terminal device may include more or fewer components than those shown in fig. 4, or the terminal device may include a combination of some of the components shown in fig. 4, or the terminal device may include sub-components of some of the components shown in fig. 4. The components shown in fig. 4 may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: application processors (application processor, AP), modem processors, graphics processors (graphics processing unit, GPU), image signal processors (image signal processor, ISP), controllers, video codecs, digital signal processors (digital signal processor, DSP), baseband processors, neural-Network Processors (NPU). The different processing units may be separate devices or integrated devices.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: inter-integrated circuit (I2C) interfaces, inter-integrated circuit sound (I2S) interfaces, pulse code modulation (PCM) interfaces, universal asynchronous receiver/transmitter (UART) interfaces, mobile industry processor interfaces (MIPI), general-purpose input/output (GPIO) interfaces, SIM interfaces, and USB interfaces.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera module 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the terminal device.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display 194 and camera module 193. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera module 193 communicate through a CSI interface to implement camera functionality of the terminal device. The processor 110 and the display 194 communicate via a DSI interface to implement the display function of the terminal device.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal interface as well as a data signal interface. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera module 193, the display 194, the wireless communication module 160, the audio module 170, and the sensor module 180. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, or a MIPI interface.
The connection relationship between the modules shown in fig. 4 is only schematically illustrated, and does not constitute a limitation on the connection relationship between the modules of the terminal device. Alternatively, each module of the terminal device may use an interface connection manner different from the connection manner in the above embodiment, or each module of the terminal device may use a combination of multiple connection manners in the above embodiment.
The USB interface 130 is an interface conforming to the USB standard specification, and may be, for example, a Mini (Mini) USB interface, a Micro (Micro) USB interface, or a C-type USB (USB Type C) interface. The USB interface 130 may be used to connect a charger to charge a terminal device, to transfer data between the terminal device and a peripheral device, and to connect a headset to play audio through the headset. The USB interface 130 may also be used to connect other devices, such as AR equipment.
The charge management module 140 is used to receive power from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive the current of the wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive electromagnetic waves (current path shown in dashed lines) through a wireless charging coil of the terminal device. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera module 193, and the wireless communication module 160. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle times, and battery state of health (e.g., leakage, impedance). Alternatively, the power management module 141 may be provided in the processor 110, or the power management module 141 and the charge management module 140 may be provided in the same device.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and other devices.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the terminal device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied on the terminal device, for example at least one of the following: a second generation (2G) mobile communication solution, a third generation (3G) mobile communication solution, a fourth generation (4G) mobile communication solution, and a fifth generation (5G) mobile communication solution. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and then transmit them to the modem processor for demodulation. The mobile communication module 150 may further amplify the signal modulated by the modem processor, and the amplified signal is converted into electromagnetic waves by the antenna 1 and radiated. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through audio devices (e.g., speaker 170A and receiver 170B) or displays images or video through a display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
Similar to the mobile communication module 150, the wireless communication module 160 may also provide wireless communication solutions applied on the terminal device, such as at least one of the following: wireless local area network (wireless local area networks, WLAN), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (IR). The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert the signal into electromagnetic waves to radiate via the antenna 2.
In some embodiments, antenna 1 of the terminal device is coupled to mobile communication module 150 and antenna 2 of the terminal device is coupled to wireless communication module 160.
The terminal device may implement display functions through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 may be used to display images or video. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini light-emitting diode (Mini LED), a Micro light-emitting diode (Micro LED), a Micro OLED (Micro OLED), or a quantum dot LED (quantum dot light emitting diodes, QLED). In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device may implement a photographing function through a camera module 193, an ISP, a DSP, a video codec, a GPU, a display screen 194, an application processor, and the like.
The camera module 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor, wherein the CCD and CMOS may be referred to as an image sensor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard Red Green Blue (RGB), YUV, etc. format image signal. In some embodiments, the terminal device may include 1 or N camera modules 193, N being a positive integer greater than 1.
The camera module 193 further includes a flicker sensor for detecting a brightness variation period of the ambient light so that the terminal device adjusts an exposure time of the image sensor based on the brightness variation period of the ambient light.
The ISP is used to process the data fed back by the camera module 193. For example, when shooting, a shutter is opened, light is transmitted to a camera photosensitive element through a lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to an ISP for processing and is converted into an image visible to naked eyes. The ISP can carry out algorithm optimization on noise, brightness and color of the image, and can optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be disposed in the camera module 193.
The DSP is used to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the terminal device selects a frequency point, the DSP is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device may support one or more video codecs. In this way, the terminal device may play or record video in multiple encoding formats, for example: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a processor that draws on the structure of biological neural networks, for example the transmission mode between human brain neurons, to rapidly process input information, and it can also learn continuously. Functions such as intelligent cognition of the terminal device can be realized through the NPU, for example: image recognition, face recognition, speech recognition, and text understanding.
In some embodiments, the camera module 193 may be composed of a color camera module and a 3D sensing module.
In some embodiments, the photosensitive element of the camera of the color camera module may be a CCD or CMOS phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.
In some embodiments, the 3D sensing module may be a time of flight (TOF) 3D sensing module or a structured light 3D sensing module. Structured light 3D sensing is an active depth sensing technology, and the basic components of a structured light 3D sensing module may include an infrared (IR) emitter, an IR camera module, and the like. The working principle of the structured light 3D sensing module is to project a light spot with a specific pattern onto the photographed object, receive the coded light spot pattern (light coding) on the surface of the object, compare it with the originally projected light spot, and calculate the three-dimensional coordinates of the object using the triangulation principle. The three-dimensional coordinates include the distance from the terminal device to the photographed object. TOF 3D sensing is also an active depth sensing technology, and the basic components of a TOF 3D sensing module may include an IR emitter, an IR camera module, and the like. The working principle of the TOF 3D sensing module is to calculate the distance (i.e., depth) between the module and the photographed object from the round-trip time of the infrared light, so as to obtain a 3D depth map.
The structured light 3D sensing module can also be applied to the fields of face recognition, somatosensory game machines, industrial machine vision detection and the like. The TOF 3D sensing module can also be applied to the fields of game machines, AR/VR and the like.
In other embodiments, camera module 193 may also be comprised of two or more cameras. The two or more cameras may include a color camera that may be used to capture color image data of the object being photographed. The two or more cameras may employ stereoscopic vision (stereo) technology to acquire depth data of the photographed object. The stereoscopic vision technology is based on the principle of parallax of human eyes, and obtains distance information, namely depth information, between terminal equipment and a photographed object by shooting images of the same object from different angles through two or more cameras under a natural light source, and then performing operations such as a triangulation method.
In some embodiments, the terminal device may include 1 or more camera modules 193. For example, the terminal device may include 1 front camera module 193 and 1 rear camera module 193. The front camera module 193 can be used to collect color image data and depth data of a photographer facing the display screen 194, and the rear camera module can be used to collect color image data and depth data of a photographed object (such as a person, a landscape, etc.) facing the photographer.
In some embodiments, a CPU, GPU or NPU in the processor 110 may process color image data and depth data acquired by the camera module 193. In some embodiments, the NPU may identify color image data acquired by the camera module 193 by a neural network algorithm, such as a convolutional neural network algorithm, based on which bone point identification techniques are based, to determine bone points of the captured person. The CPU or GPU may also be operable to run a neural network algorithm to effect determination of skeletal points of the captured person from the color image data. In some embodiments, the CPU, GPU or NPU may be further configured to confirm the stature (such as body proportion, weight of the body part between the skeletal points) of the photographed person based on the depth data collected by the camera module 193 (which may be a 3D sensing module) and the identified skeletal points, and further determine body beautification parameters for the photographed person, and finally process the photographed image of the photographed person according to the body beautification parameters, so that the body form of the photographed person in the photographed image is beautified.
The external memory interface 120 may be used to connect an external memory card, such as a Secure Digital (SD) card, to enable expansion of the memory capabilities of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. Files such as music and video are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. Wherein the storage program area may store application programs required for at least one function (e.g., a sound playing function and an image playing function) of the operating system. The storage data area may store data (e.g., audio data and phonebooks) created during use of the terminal device. Further, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory such as: at least one disk storage device, a flash memory device, and a universal flash memory (universal flash storage, UFS), etc. The processor 110 performs various processing methods of the terminal device by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device may implement audio functions, such as music playing and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal output, and may also be used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a loudspeaker, is used to convert audio electrical signals into sound signals. The terminal device may be used to listen to music or conduct a hands-free call through the speaker 170A.
A receiver 170B, also referred to as an earpiece, converts the audio electrical signal into a sound signal. When a user uses the terminal device to answer a call or voice message, the voice can be answered by bringing the receiver 170B close to the ear.
The microphone 170C, also known as a mic or mouthpiece, is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking close to it. The terminal device may be provided with at least one microphone 170C. In other embodiments, the terminal device may be provided with two microphones 170C to implement a noise reduction function. In other embodiments, the terminal device may be provided with three, four, or more microphones 170C to implement sound signal acquisition, noise reduction, sound source identification, directional recording, and other functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A may be of various types, such as a resistive pressure sensor, an inductive pressure sensor, or a capacitive pressure sensor. The capacitive pressure sensor may be a device comprising at least two parallel plates with conductive material, the capacitance between the electrodes changing when a force is applied to the pressure sensor 180A, the terminal device determining the strength of the pressure based on the change in capacitance. When a touch operation acts on the display screen 194, the terminal device detects the touch operation according to the pressure sensor 180A. The terminal device may also calculate the position of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions. For example: executing an instruction for checking the short message when the touch operation with the touch operation intensity smaller than the first pressure threshold acts on the short message application icon; and executing the instruction of newly creating the short message when the touch operation with the touch operation intensity being larger than or equal to the first pressure threshold acts on the short message application icon.
The gyro sensor 180B may be used to determine a motion gesture of the terminal device. In some embodiments, the angular velocity of the terminal device about three axes (i.e., x-axis, y-axis, and z-axis) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the angle of the shake of the terminal device, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the terminal device by the reverse movement, thereby realizing anti-shake. The gyro sensor 180B can also be used for scenes such as navigation and motion sensing games.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The terminal device may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the terminal device is a flip phone, the terminal device may detect the opening and closing of the flip according to the magnetic sensor 180D. The terminal device can then set features such as automatic unlocking when the flip is opened according to the detected opening and closing state of the leather case or of the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device in various directions (typically, x-axis, y-axis, and z-axis). The magnitude and direction of gravity can be detected when the terminal device is stationary. The acceleration sensor 180E may also be used to identify the gesture of the terminal device, as an input parameter for applications such as landscape switching and pedometer.
The distance sensor 180F is used to measure a distance. The terminal device may measure the distance by infrared or laser. In some embodiments, for example in a shooting scene, the terminal device may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, for example, a photodiode. The LED may be an infrared LED. The terminal device emits infrared light outwards through the LED. The terminal device detects infrared reflected light from nearby objects using a photodiode. When the reflected light is detected, the terminal device may determine that an object is present nearby. When no reflected light is detected, the terminal device may determine that there is no object nearby. The terminal device can use the proximity light sensor 180G to detect whether the user holds the terminal device close to the ear for communication, so as to automatically extinguish the screen to achieve the purpose of saving electricity. The proximity light sensor 180G may also be used for automatic unlocking and automatic screen locking in holster mode or pocket mode.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal equipment can utilize the collected fingerprint characteristics to realize the functions of unlocking, accessing an application lock, shooting, answering an incoming call and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the terminal device reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, the terminal device heats the battery 142 when the temperature is below another threshold to prevent an abnormal shutdown caused by low temperature. In other embodiments, when the temperature is below a further threshold, the terminal device boosts the output voltage of the battery 142 to prevent an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a touch panel. The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor 180K may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the terminal device at a position different from that of the display screen 194.
The keys 190 include a power key and a volume key. The keys 190 may be mechanical keys or touch keys. The terminal device may receive a key input signal and implement a function associated with the key input signal.
The motor 191 may generate vibration. The motor 191 may be used for incoming call alerting as well as for touch feedback. The motor 191 may generate different vibration feedback effects for touch operations acting on different applications. The motor 191 may also produce different vibration feedback effects for touch operations acting on different areas of the display screen 194. Different application scenarios (e.g., time alert, receipt message, alarm clock, and game) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, which may be used to indicate the charging state and changes in battery level, or to indicate messages, missed calls, and notifications.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be inserted into the SIM card interface 195 to make contact with the terminal device, or can be pulled out from the SIM card interface 195 to be separated from the terminal device. The terminal device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The same SIM card interface 195 may simultaneously insert multiple cards, which may be of the same type or of different types. The SIM card interface 195 may also be compatible with external memory cards. The terminal equipment interacts with the network through the SIM card to realize the functions of communication, data communication and the like. In some embodiments, the terminal device employs an embedded SIM (eSIM) card, which may be embedded in the terminal device and not separable from the terminal device.
The hardware system of the terminal device is described in detail above, and the software system of the terminal device is described below. The software system may adopt a layered architecture, an event driven architecture, a microkernel architecture, a micro service architecture or a cloud architecture, and the embodiment of the present application exemplarily describes the software system of the terminal device by taking the layered architecture as an example.
As shown in fig. 5, the software system using the hierarchical architecture is divided into several layers, each of which has a clear role and division. The layers communicate with each other through a software interface. In some embodiments, the software system may be divided into five layers, from top to bottom, an application layer, an application framework layer, a native (native) layer, a hardware abstraction layer (hardware abstract layer, HAL), and a kernel layer, respectively.
The application layer may include a series of application packages.
Application packages may include camera, calendar, call, map, WLAN, music, text message, gallery, navigation, Bluetooth, and video applications (APPs).
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer may include some predefined functions.
For example, the application framework layer includes a content provider, a view system, and a manager, where the manager may include an activity manager, a notification manager, a window manager, an input manager, and a resource manager.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, and phonebooks.
The view system includes visual controls, such as controls to display text and controls to display pictures. The view system may be used to build applications. The display interface may be composed of one or more views, for example, a display interface including a text notification icon may include a view displaying text and a view displaying a picture.
The activity manager may provide an activity management service (activity manager service, AMS) that may be used for the initiation, switching, scheduling of system components (e.g., activities, services, content providers, and broadcast receivers), and management and scheduling of application processes.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction. For example, the notification manager is used for download-completion notifications and message alerts. The notification manager may also manage notifications that appear in the system top status bar in the form of charts or scroll-bar text, such as notifications of applications running in the background, as well as notifications that appear on the screen in the form of dialog windows, for example prompting a text message in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.
The window manager provides window management services (window manager service, WMS) that may be used for window management, window animation management, surface management, and as a transfer station to the input system.
The input manager may provide input management services (input manager service, IMS), which may be used to manage inputs to the system, e.g., touch screen inputs, key inputs, sensor inputs, etc. The IMS retrieves events from the input device node and distributes the events to the appropriate windows through interactions with the WMS.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, and video files.
The native layer is also called the system runtime layer, and includes native C/C++ libraries and the Android runtime (ART).
The native C/C++ libraries may include a number of functional modules, such as: a surface manager, a media framework, a 3D graphics processing library (e.g., the open graphics library for embedded systems (open graphics library for embedded system, OpenGL ES)), a 2D graphics engine (e.g., the skia graphics library (SGL)), and the C standard library (libc).
The surface manager is used to manage the display subsystem and provides a fusion of the 2D and 3D layers for the plurality of applications.
The media framework supports playback and recording in a variety of audio and video formats, as well as still image files. The media framework may support a variety of audio and video coding formats, such as MPEG4, H.264, MPEG audio layer III (moving picture experts group audio layer III, MP3), advanced audio coding (advanced audio coding, AAC), adaptive multi-rate (AMR), joint photographic experts group (joint photographic experts group, JPEG), and portable network graphics (portable network graphics, PNG).
SGL is the drawing engine for 2D drawing.
OpenGL ES may be used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
libc is used to provide the basic C language function.
ART is primarily responsible for converting bytecode into machine code. ART also provides functions such as memory management and garbage collection.
ART includes core libraries and virtual machines.
The core libraries mainly provide basic Java class libraries, such as basic data structures, math, input/output (I/O), tools, and networking. The core libraries also provide system APIs for developers.
The virtual machine may be a Dalvik virtual machine, which converts bytecode into machine code using a just-in-time (JIT) compilation strategy, or an ART virtual machine, which converts bytecode into machine code using an ahead-of-time (AOT) compilation strategy.
The application layer and the application framework layer run in the virtual machine. The virtual machine converts the Java files of the application layer and the application framework layer into binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The hardware abstraction layer runs in user space, encapsulates the kernel-layer drivers, and provides call interfaces to the upper layers. The hardware abstraction layer contains, for example, the Chi-cdk, an auto exposure (AE) module, an image sensor module, and a flicker sensor module.
Chi-cdk is primarily used to process data for devices that assist the image sensor in outputting images (e.g., the flicker sensor and the gyro sensor 180B).
The AE module is mainly used to control exposure times of the flicker sensor and the image sensor (e.g., CMOS in the camera module 193).
The image sensor module is mainly used to process the data acquired by the image sensor and to make the control data of the image sensor (such as gain and shutter) take effect.
The image sensor module comprises a main-shot module and an auxiliary-shot module. The main-shot module is used to process data of the main camera, and the auxiliary-shot module is used to process data of the auxiliary camera.
As shown in fig. 6, the main-shot module includes a pre-sensor update module PrepareSensorUpdate, which updates parameters such as the exposure and frame rate of the main camera. The auxiliary-shot module includes a pre-sensor update module PrepareSensorUpdate and a frame-rate-matching exposure information adjustment module AdjustExposureInfoForFPSMatch: PrepareSensorUpdate updates parameters such as the exposure and frame rate of the auxiliary camera, and AdjustExposureInfoForFPSMatch adjusts the frame rate of the auxiliary camera so that the frames of the auxiliary camera can be aligned with those of the main camera (i.e., so their timestamps are aligned).
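To make this division of labor concrete, the following C++ sketch shows one possible shape of these two sub-modules. Only the names PrepareSensorUpdate and AdjustExposureInfoForFPSMatch come from the description above; the types, signatures, and matching logic are illustrative assumptions, not the actual Chi-cdk interfaces.

```cpp
// Illustrative sketch only: of the names below, only PrepareSensorUpdate and
// AdjustExposureInfoForFPSMatch appear in the description; the structs,
// signatures, and logic are assumptions, not the actual Chi-cdk code.
struct SensorUpdate {
    double exposure_ms;        // exposure time to program into the sensor
    double frame_interval_ms;  // frame interval, i.e. 1000 / frame rate
};

class MainShotModule {
public:
    // Updates exposure and frame rate parameters of the main camera.
    void PrepareSensorUpdate(SensorUpdate& u, double exposure_ms,
                             double frame_interval_ms) {
        u.exposure_ms = exposure_ms;
        u.frame_interval_ms = frame_interval_ms;
    }
};

class AuxShotModule {
public:
    // Updates exposure and frame rate parameters of the auxiliary camera.
    void PrepareSensorUpdate(SensorUpdate& u, double exposure_ms,
                             double frame_interval_ms) {
        u.exposure_ms = exposure_ms;
        u.frame_interval_ms = frame_interval_ms;
    }
    // Pulls the auxiliary frame interval toward the main camera's interval so
    // that the frame timestamps of the two cameras stay aligned.
    void AdjustExposureInfoForFPSMatch(SensorUpdate& aux,
                                       const SensorUpdate& main_update) {
        aux.frame_interval_ms = main_update.frame_interval_ms;
    }
};
```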
The flicker sensor module is mainly used to process the data collected by the flicker sensor and to make the control data of the flicker sensor (such as gain and shutter) take effect.
The kernel layer is a layer between hardware and software. The kernel layer contains, for example, a flicker sensor driver and an image sensor driver.
The image sensor driver is mainly used to process the data collected by the image sensor and to make the control data of the image sensor (such as gain and shutter) take effect.
The flicker sensor driver is mainly used to process the data collected by the flicker sensor and to make the control data of the flicker sensor (such as gain and shutter) take effect.
It should be understood that the hardware structures and software architectures shown in fig. 4-6 are merely exemplary illustrations of terminal devices, and do not constitute limitations on the hardware and software of the terminal devices, and that the terminal devices may have other types of hardware structures and software architectures.
The workflow of the software system and the hardware system of the terminal device is exemplarily described below in connection with a shooting scenario.
As shown in fig. 7, when the user performs a touch operation on the display screen 194, the touch sensor 180K senses the touch operation and sends a corresponding hardware interrupt to the kernel layer. The kernel layer processes the touch operation into a raw input event, which includes, for example, the coordinates and timestamp of the touch operation, and stores the raw input event. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the event. When the touch operation is a click operation and the corresponding control is a control of the camera APP, the camera APP is started by invoking an interface of the application framework layer; the kernel layer is then invoked to start the image sensor driver and the flicker sensor driver, and an image or video is captured by the camera module 193.
Fig. 8 is an information flow diagram of a shooting scene.
After the user clicks the icon of the camera APP, the camera APP generates a control instruction for starting image capture. As indicated by the solid-line arrows, the control instruction is transmitted in turn to hardware such as the flicker sensor and the image sensor, and the hardware starts to operate. The flicker sensor collects ambient light data and transmits it to the flicker sensor driver, which in turn passes the ambient light data to the AE module. The AE module can also obtain, through the image sensor driver, the frame rate of the current frame acquired by the image sensor.
The following describes an embodiment of a method for adjusting a frame interval of a camera provided in the present application.
As shown in fig. 9, the method includes the following steps.
S901, the AE module acquires a current frame captured by the main camera and current frames captured by N auxiliary cameras (i.e., a first frame and N second frames).
In some shooting modes, the terminal device needs to use multiple cameras to shoot, synthesize the frames captured by the cameras, and finally display the synthesized frame to the user. For example, in portrait mode and large aperture mode the terminal device needs two cameras to participate in shooting; the method of the present application is also suitable for scenes shot jointly by three or more cameras.
S902, the AE module determines whether the frame interval of the current frame is an integer multiple of the ambient light brightness change period.
When the terminal device has the anti-flicker mode turned on, the AE module first needs to identify whether the frame interval of the current frame (the first frame and the N second frames) is an integer multiple of the ambient light brightness change period. If it is, the frame interval does not need to be adjusted; the method can return to S901, obtain the frames collected by each camera at the next moment, and determine whether the frame interval of those frames is an integer multiple of the ambient light brightness change period. If it is not, the frame interval needs to be adjusted. When the frame interval needs to be adjusted, it is further determined whether the target frame (the frame among the first frame and the N second frames whose frame interval needs to be adjusted) is a synchronous frame, i.e., the following steps are performed.
S903, the AE module determines whether the target frame in the current frame is a sync frame.
If the current frame shot by the main camera is not synchronized with the current frames shot by the N auxiliary cameras, the AE module needs to perform synchronization processing. However, synchronization processing adjusts the frame interval, and when the terminal device has the anti-flicker mode turned on, the AE module also needs to adjust the frame interval so that it becomes an integer multiple of the ambient light brightness change period. Two mechanisms for adjusting the frame interval therefore coexist, and if both take effect at the same time the frame interval changes drastically. Consequently, when the AE module needs to adjust the frame interval to execute the anti-flicker algorithm, it cannot adjust the frame interval directly; it must first perform S903 to determine whether the frame interval of the current frame can be adjusted.
Optionally, the AE module determines whether the current frame is a synchronous frame according to an identification of the current frame; for example, an identification of 0 indicates that the current frame is an asynchronous frame, and an identification of 1 indicates that the current frame is a synchronous frame.
If the target frame is a sync frame, the AE module performs S904; if the target frame is not a sync frame, the AE module performs S905.
S904, the AE module adjusts the frame interval of the current frame based on the frame synchronization algorithm.
Due to device process factors, the actual frame rate of different cameras differs from the set frame rate. For example, the sensor driver sets a frame rate of 24 fps (i.e., a frame interval of 1000/24 ≈ 41.66 ms), but the frame rates of the two cameras (e.g., main and auxiliary) may not be exactly 24 fps; the frame rate of one camera may be 23.98 fps (a frame interval of 41.70 ms) while the frame rate of the other is 24.02 fps (a frame interval of 41.63 ms). The frame-out time of each frame then differs between the two cameras, the difference accumulates over time, and imaging quality is eventually seriously affected.
The cumulative effect of this difference is shown in fig. 10.
Assuming the 1st frames of the main shot and the auxiliary shot are synchronized, the frame-out times of each subsequent frame differ by 0.07 ms. By the 5th frame, the frame-out times of the main shot and the auxiliary shot already differ by 0.35 ms, and by the 100th frame they differ by 7 ms. In general, to ensure imaging quality, the frame-out time difference between the two cameras needs to be kept within 1 ms; therefore, synchronization processing needs to be performed on the main shot and the auxiliary shot.
During synchronization processing, the frame interval of the camera whose frames come out faster can be lengthened. For example, if the frame rate of the main shot is 24.02 fps (a frame interval of 41.63 ms) and the frame rate of the auxiliary shot is 23.98 fps (a frame interval of 41.70 ms), the AE module may perform synchronization processing once every 5 frames: it adjusts the 6th frame of the main shot so that its frame interval is lengthened to 41.98 ms (41.63 ms + 0.35 ms). The frame-out times of the 6th frames of the main shot and the auxiliary shot then become the same again, i.e., the main shot and the auxiliary shot complete synchronization.
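The compensation in this example can be written out as a short calculation. The C++ sketch below is illustrative only: the 5-frame synchronization period and the function name are assumptions drawn from the example above, not the claimed implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Illustrative sketch of the synchronization idea in the example above: every
// sync_period frames, the camera whose frames come out faster has its next
// frame interval lengthened by the drift accumulated since the last sync.
double AccumulatedDriftMs(double main_interval_ms, double aux_interval_ms,
                          int sync_period_frames) {
    double drift_per_frame = std::fabs(main_interval_ms - aux_interval_ms);
    return drift_per_frame * sync_period_frames;
}

int main() {
    const double main_ms = 1000.0 / 24.02;  // ~41.63 ms, the faster camera
    const double aux_ms  = 1000.0 / 23.98;  // ~41.70 ms, the slower camera
    const double drift = AccumulatedDriftMs(main_ms, aux_ms, 5);  // ~0.35 ms
    // Lengthen the 6th frame of the faster (main) camera by the drift:
    // 41.63 ms + 0.35 ms ≈ 41.98 ms, so both cameras line up again.
    const double stretched = std::min(main_ms, aux_ms) + drift;
    std::printf("drift = %.2f ms, stretched 6th-frame interval = %.2f ms\n",
                drift, stretched);
    return 0;
}
```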
S905, the AE module adjusts the frame interval of the target frame based on the anti-flicker algorithm.
The AE module has a frame-interval monitoring mechanism: if the difference between the frame interval and the nearest integer multiple of the ambient light brightness change period is greater than 0.1 ms, the AE module considers that the frame interval is not an integer multiple of the period and adjusts the frame interval upward to an integer multiple of the period. The 0.1 ms here is only an example of the threshold and is not limiting.
For example, assume that the ambient light (mains) frequency is 50 Hz, so the brightness changes at 100 Hz and the ambient light brightness change period is 10 ms; the frame interval of the main shot is 50.05 ms and the frame interval of the auxiliary shot is 50.45 ms. Since 50.05 − 50 = 0.05 ms < 0.1 ms, the AE module treats the frame interval of the main shot as 50 ms, which is an integer multiple of the brightness change period, and does not adjust the frame rate of the main shot. Since 50.45 − 50 = 0.45 ms > 0.1 ms, the AE module determines that the frame interval of the auxiliary shot is not 50 ms and is not an integer multiple of the brightness change period, so the AE module adjusts the frame interval of the auxiliary shot (i.e., the frame interval of the target frame) up to the next integer multiple of 10 ms, namely 60 ms.
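As a concrete illustration of this monitoring mechanism, the following C++ sketch rounds a frame interval up to the next integer multiple of the brightness change period when the deviation exceeds the tolerance. The 0.1 ms tolerance and the 50.05 ms / 50.45 ms values come from the text; the 10 ms period assumes a 100 Hz brightness change under 50 Hz mains, and the function itself is only a sketch, not the patented algorithm.

```cpp
#include <cmath>
#include <cstdio>

// Sketch of the frame-interval monitoring mechanism: if the frame interval is
// further than tolerance_ms from the nearest integer multiple of the ambient
// light brightness change period, round it up to the next integer multiple.
double AdjustForAntiFlicker(double frame_interval_ms, double period_ms,
                            double tolerance_ms = 0.1) {
    double nearest = std::round(frame_interval_ms / period_ms) * period_ms;
    if (std::fabs(frame_interval_ms - nearest) <= tolerance_ms) {
        return nearest;  // already treated as an integer multiple of the period
    }
    return std::ceil(frame_interval_ms / period_ms) * period_ms;  // round up
}

int main() {
    const double period_ms = 10.0;  // assumed: 50 Hz mains -> 100 Hz flicker
    std::printf("main: %.2f ms\n", AdjustForAntiFlicker(50.05, period_ms));  // 50.00, kept
    std::printf("aux:  %.2f ms\n", AdjustForAntiFlicker(50.45, period_ms));  // 60.00, adjusted
    return 0;
}
```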
In the above method provided by the present application, when the terminal device has the anti-flicker mode turned on, the AE module first identifies whether the frame intervals of the current frames (the first frame output by the main shot and the N second frames output by the N auxiliary shots) are integer multiples of the ambient light brightness change period. If they are, the frame intervals do not need to be adjusted; if not, the frame interval needs to be adjusted. When the frame interval needs to be adjusted, the AE module determines whether the target frame (the frame among the first frame and the N second frames whose frame interval needs to be adjusted) is a synchronous frame; if the target frame is not a synchronous frame, the anti-flicker algorithm is used to adjust the frame interval. This avoids the situation where two algorithms adjust the frame interval at the same time and cause the frame rate of the main shot or the auxiliary shot to change drastically.
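The branch logic of S901–S905 can be condensed into a single decision function. The sketch below is illustrative: the type and function names are assumptions, and the tolerance-based multiple check mirrors the monitoring mechanism described above.

```cpp
#include <cmath>

// Condensed sketch of the decision flow of fig. 9; names are illustrative,
// not the actual AE module interfaces.
struct Frame {
    double interval_ms;   // frame interval of this frame
    bool is_sync_frame;   // identification: 1 = synchronous frame, 0 = asynchronous
};

enum class Action { kKeep, kFrameSync, kAntiFlicker };

Action DecideAdjustment(const Frame& f, double period_ms,
                        double tolerance_ms = 0.1) {
    double nearest = std::round(f.interval_ms / period_ms) * period_ms;
    bool is_multiple = std::fabs(f.interval_ms - nearest) <= tolerance_ms;
    if (is_multiple) return Action::kKeep;           // S902: no adjustment needed
    if (f.is_sync_frame) return Action::kFrameSync;  // S904: frame-sync algorithm wins
    return Action::kAntiFlicker;                     // S905: run the anti-flicker algorithm
}

int main() {
    Frame aux{50.45, /*is_sync_frame=*/false};
    return DecideAdjustment(aux, 10.0) == Action::kAntiFlicker ? 0 : 1;
}
```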
How the method shown in fig. 9 achieves the above-described advantageous effects is described in detail below in connection with examples.
As shown in fig. 10, the AE module detects through the flicker sensor that the ambient light is at 50 Hz, and the AE module can adjust the frame rates of the main shot and the auxiliary shot to 20 fps. However, due to errors introduced by device processing, the frame rates of the main shot and the auxiliary shot may not be exactly 20 fps; the frame rate of the main shot may be 19.98 fps (a frame interval of 50.05 ms) and the frame rate of the auxiliary shot 20.02 fps (a frame interval of 49.95 ms), so the frame-out times of the main shot and the auxiliary shot differ by 0.5 ms after 5 frames. At the 6th frame, for synchronization, the frame interval of the image sensor whose frames come out faster (the auxiliary shot's image sensor) becomes 50.45 ms after time compensation, while that of the image sensor whose frames come out slower (the main shot's image sensor) is 50.05 ms.
If the AE module executes the anti-flicker algorithm at the synchronization frame (the 6th frame), the AE module may behave as follows. Based on the frame-interval monitoring mechanism, the AE module determines that the frame interval of the auxiliary shot is not an integer multiple of the brightness change period (50.45 − 50 = 0.45 ms > 0.1 ms) and adjusts the frame interval of the auxiliary shot's image sensor to 60 ms. Based on the same mechanism, the AE module determines that the frame interval of the main shot is an integer multiple of the brightness change period (50.05 − 50 = 0.05 ms < 0.1 ms) and keeps the frame interval of the main shot at 50 ms. The frames of the main shot and the auxiliary shot are thus no longer synchronized and the synchronization mechanism fails: as time goes on, the frame-out times of the auxiliary shot fall further and further behind those of the main shot, whose frame interval is unchanged, and finally the frames of the main shot and the auxiliary shot cannot be synthesized.
If the AE module does not execute the anti-flicker algorithm at the synchronization frame (the 6th frame), the AE module may behave as follows. Starting from the 7th frame, the main shot and the auxiliary shot return to their original frame rates (19.98 fps and 20.02 fps). If the ambient light frequency is 50 Hz, the AE module adjusts the frame rates of the main shot and the auxiliary shot to 20 fps (a frame interval of 50 ms); due to device process factors the actual frame rate of the main shot is 19.98 fps (50.05 ms) and that of the auxiliary shot is 20.02 fps (49.95 ms), but within 5 frames the difference between the frame-out times of the main shot and the auxiliary shot is still within 1 ms, so the main shot and the auxiliary shot remain synchronized. Here 1 ms is an example of the frame-synchronization threshold; beyond this threshold the AE module considers the main shot and the auxiliary shot to be out of sync.
It can be seen that, after the method of fig. 9 is applied, the AE module executes the anti-flicker algorithm only at asynchronous frames, so flicker can be suppressed while drastic frame rate changes are avoided.
Optionally, the method of fig. 9 further includes: the sensor module determines whether the AE module has adjusted the frame interval of the current frame; when the AE module has adjusted the frame interval of the current frame, the sensor module determines not to adjust the frame intervals of the main camera and the N auxiliary cameras; when the AE module has not adjusted the frame interval of the current frame, the sensor module adjusts the frame intervals of the main camera and the N auxiliary cameras.
In some cases, to reduce power consumption, the sensor module may lower the frame rate of the camera output. For example, when the frame rates of the main shot and the auxiliary shot are both 30 fps, the sensor module may reduce them by a factor of 1.25 to 24 fps. However, the sub-module that updates the main-shot frame rate and the sub-module that updates the auxiliary-shot frame rate are two different sub-modules of the sensor module (as shown in fig. 6): the input of the sub-module that updates the main-shot frame rate is the output of the AE module, while the input of the sub-module that updates the auxiliary-shot frame rate is the output of the image sensor. Therefore, if the AE module has already adjusted the frame rate and the sensor module adjusts the frame rate again, the frame rates of the main shot and the auxiliary shot become different and the main shot and the auxiliary shot fall out of sync.
Alternatively, the sensor module may receive a first frame interval of the current frame from the image sensor and a second frame interval of the current frame from the AE module; when the first frame interval is the same as the second frame interval, the sensor module determines that the AE module does not adjust the frame interval of the current frame; when the first frame interval is different from the second frame interval, the sensor module determines that the AE module adjusts the frame interval of the current frame.
In this embodiment, the sensor module adjusts the frame rate only when the AE module has not adjusted it, so the inputs of the sub-module that updates the main-shot frame rate and the sub-module that updates the auxiliary-shot frame rate are the same (both are the output of the image sensor). This reduces power consumption while preventing the main shot and the auxiliary shot from falling out of sync.
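A minimal sketch of this decision is given below, assuming the frame-interval comparison described above is how the sensor module detects an AE adjustment; the 1.25× factor follows the text, and everything else (names, tolerance) is illustrative.

```cpp
#include <cmath>

// Sketch: the sensor module lowers the output frame rate for power saving only
// when the AE module has not already adjusted the frame interval of the
// current frame.
bool AeAdjustedInterval(double interval_from_image_sensor_ms,
                        double interval_from_ae_ms) {
    // If the interval received from the AE module differs from the interval
    // reported by the image sensor, the AE module adjusted the current frame.
    return std::fabs(interval_from_image_sensor_ms - interval_from_ae_ms) > 1e-6;
}

double MaybeReduceForPowerSaving(double current_interval_ms, bool ae_adjusted) {
    if (ae_adjusted) {
        return current_interval_ms;     // leave main and auxiliary intervals alone
    }
    return current_interval_ms * 1.25;  // e.g. 30 fps (33.33 ms) -> 24 fps (41.67 ms)
}
```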
Fig. 11 is another method for adjusting a frame interval provided herein. The method includes the following.
S1101, the AE module obtains a current frame shot by the main camera and current frames shot by the N auxiliary cameras.
S1102, the AE module determines whether the frame interval of the current frame is an integer multiple of the ambient light brightness change period.
S1101 is the same as S901, S1102 is the same as S902, and specific details are not described again.
If the frame interval of the current frame is not an integer multiple of the ambient light brightness change period, the AE module may determine the frequency of the ambient light and perform the following steps.
S1103, the AE module determines whether the frequency of the ambient light is a preset frequency.
The anti-flicker algorithm generally adjusts the frame interval of the target frame based on a preset frequency (such as the common mains AC frequencies of 50 Hz and 60 Hz). If the ambient light frequency is not a preset frequency, the AE module cannot execute the anti-flicker algorithm to adjust the frame interval of the target frame, so the anti-flicker algorithm does not conflict with the frame synchronization algorithm and there is no need to determine whether the target frame is a synchronous frame.
If the frequency of the ambient light is the preset frequency, the AE module executes S1104; if the frequency of the ambient light is not the preset frequency, the AE module performs S1107.
S1104, the AE module determines whether the target frame in the current frame is a sync frame.
The target frame is a frame in the current frame for which the anti-flicker algorithm needs to be performed to adjust the frame interval. If the target frame is a sync frame, the AE module executes S1105; if the target frame is not a sync frame, the AE module performs S1106.
S1105, the AE module adjusts the frame interval of the current frame based on the frame synchronization algorithm.
S1106, the AE module adjusts the frame interval of the target frame based on the anti-flicker algorithm.
S1104 is the same as S903, S1105 is the same as S904, S1106 is the same as S905, and the detailed description is omitted.
S1107, the AE module determines the brightness of the ambient light.
S1108, the AE module determines an exposure value according to the brightness of the ambient light.
S1109, the AE module adjusts the frame interval of the current frame according to the exposure value.
If the ambient light frequency is not a preset frequency, the AE module cannot execute the anti-flicker algorithm to adjust the frame interval of the current frame; it can instead adjust the frame intervals of the main camera and the auxiliary cameras directly according to the exposure.
For example, when the brightness of the ambient light is high, the AE module may select 30 or 24 fps as the frame rate of the current frame depending on the shooting mode: night view and professional modes may use 30 fps, while portrait and large aperture modes may use 24 fps. When the brightness of the ambient light is low, the AE module may select 17 fps as the frame rate of the current frame.
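The fallback of S1107–S1109 can be illustrated with a simple mapping from brightness and shooting mode to a frame rate. In the sketch below, the brightness threshold is an assumed placeholder and the mode strings are illustrative; only the 30/24/17 fps values come from the text.

```cpp
#include <string>

// Sketch of the exposure-driven fallback: when the ambient light frequency is
// not a preset frequency, pick a frame rate from the brightness and the
// shooting mode instead of running the anti-flicker algorithm.
int SelectFrameRate(double ambient_brightness, const std::string& mode,
                    double bright_threshold = 50.0 /* assumed threshold */) {
    if (ambient_brightness >= bright_threshold) {
        // Bright scenes: night view / professional use 30 fps,
        // portrait / large aperture use 24 fps.
        return (mode == "night" || mode == "professional") ? 30 : 24;
    }
    return 17;  // dim scenes: a lower frame rate leaves room for longer exposure
}
```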
The present application also provides a computer program product which, when executed by a processor, implements the method of any of the method embodiments of the present application.
The computer program product may be stored in a memory and eventually converted to an executable object file that can be executed by a processor through preprocessing, compiling, assembling, and linking.
The code of the computer program product may also be solidified in a chip. The present application does not limit the specific form of the computer program product.
The present application also provides a computer readable storage medium having stored thereon a computer program which, when executed by a computer, implements a method according to any of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
The computer readable storage medium may be volatile memory or nonvolatile memory, or may include both volatile memory and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working processes and technical effects of the apparatus and device described above may refer to corresponding processes and technical effects in the foregoing method embodiments, which are not described in detail herein.
In several embodiments provided in the present application, the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described apparatus embodiments are merely illustrative, the division of units is merely a logical function division, and there may be additional divisions in actual implementation, and multiple units or components may be combined or integrated into another system. In addition, the coupling between the elements or the coupling between the elements may be direct or indirect, including electrical, mechanical, or other forms of connection.
It should be understood that, in the various embodiments of the present application, the size of the sequence numbers of the processes does not imply an order of execution. The execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation process of the embodiments of the present application.
In addition, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is merely one association relationship describing the associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
In summary, the foregoing description is only a preferred embodiment of the technical solution of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.

Claims (11)

1. A method for adjusting a frame interval, applied to a terminal device, the terminal device opening an anti-flicker mode, the terminal device including an AE module, the method comprising:
the AE module acquires a first frame shot by a main camera and N second frames shot by N auxiliary cameras, wherein N is a positive integer;
the AE module determines whether the frame interval of the first frame and the frame intervals of the N second frames are integer multiples of an ambient light brightness change period;
When the frame interval between the first frame and a target frame in the N second frames is not an integral multiple of the ambient light brightness change period, the AE module determines whether the target frame is a synchronous frame or not, and the target frame is a frame needing to adjust the frame interval in the first frame and the N second frames;
when the target frame is an unsynchronized frame, the AE module adjusts a frame interval of the target frame based on an anti-flicker algorithm.
2. The method of claim 1, wherein the terminal device further comprises a sensor module, the method further comprising:
the sensor module determines whether the AE module adjusts a frame interval of a current frame, the current frame being the first frame and/or the N second frames;
when the AE module adjusts the frame interval of the current frame, the sensor module determines not to adjust the frame intervals of the main camera and the N auxiliary cameras;
when the AE module does not adjust the frame interval of the current frame, the sensor module adjusts the frame intervals of the primary camera and the N secondary cameras.
3. The method of claim 2, wherein the sensor module determining whether the AE module has adjusted a frame interval of the current frame comprises:
The sensor module receives a first frame interval of the current frame from an image sensor;
the sensor module receives a second frame interval of the current frame from the AE module;
when the first frame interval is the same as the second frame interval, the sensor module determines that the AE module does not adjust the frame interval of the current frame;
when the first frame interval is different from the second frame interval, the sensor module determines that the AE module adjusts the frame interval of the current frame.
4. The method of claim 1, wherein before the AE module determines whether the target frame is a sync frame, the method further comprises:
the AE module determines the frequency of ambient light;
when the frequency of the ambient light is a preset frequency, the AE module determines whether the target frame is a synchronization frame.
5. The method of claim 4, wherein the AE module adjusts a frame interval of the target frame based on an anti-flicker algorithm, comprising:
the AE module adjusts the frame interval of the target frame by adjusting the frame length line of the camera corresponding to the target frame, wherein the adjusted frame interval of the main camera is the same as the adjusted frame intervals of the N auxiliary cameras, and the adjusted frame interval of the main camera and the adjusted frame interval of the N auxiliary cameras are integer multiples of the brightness change period of the ambient light, which is the inverse of the frequency of the ambient light.
6. The method according to claim 4, wherein the method further comprises:
when the frequency of the ambient light is not a preset frequency, the AE module determines the brightness of the ambient light;
the AE module determines exposure according to the brightness of the ambient light;
and the AE module adjusts the frame interval of the main camera and the auxiliary camera according to the exposure.
7. The method according to any one of claims 1 to 6, further comprising:
when the first frame and the N second frames are synchronous frames, the AE module adjusts a frame interval of the first frame or the N second frames based on a frame synchronization algorithm.
8. The method of any one of claims 1 to 6, wherein the AE module determining whether the first frame and the N second frames are synchronous frames comprises:
the AE module determines whether the first frame and the N second frames are synchronous frames according to the identification of the first frame and the identification of the N second frames, wherein the identification is used for indicating whether the first frame is the synchronous frame.
9. The method of any one of claims 1 to 6, wherein before the AE module obtains the first frame captured by the primary camera and the N second frames captured by the N secondary cameras, the method further comprises:
The camera APP of the terminal equipment receives user operation, wherein the user operation is used for selecting a multi-shooting mode, and the multi-shooting mode is a mode of jointly shooting by the main camera and the N auxiliary cameras;
the camera APP responds to the user operation, controls the main camera to shoot the first frames, and controls the N auxiliary cameras to shoot the N second frames.
10. An apparatus for adjusting a frame interval, comprising a processor and a memory, the processor and the memory coupled, the memory for storing a computer program that, when executed by the processor, causes the apparatus to perform the method of any of claims 1-9.
11. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which when executed by a processor causes an apparatus comprising the processor to perform the method of any one of claims 1 to 9.
CN202211131516.0A 2022-09-16 2022-09-16 Method and device for adjusting frame interval Active CN116723410B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211131516.0A CN116723410B (en) 2022-09-16 2022-09-16 Method and device for adjusting frame interval


Publications (2)

Publication Number Publication Date
CN116723410A CN116723410A (en) 2023-09-08
CN116723410B true CN116723410B (en) 2024-03-22

Family

ID=87870303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211131516.0A Active CN116723410B (en) 2022-09-16 2022-09-16 Method and device for adjusting frame interval

Country Status (1)

Country Link
CN (1) CN116723410B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104754200A (en) * 2013-12-31 2015-07-01 联芯科技有限公司 Method and system for adjusting automatic exposure
CN105323499A (en) * 2014-07-28 2016-02-10 佳能株式会社 Image pickup apparatus and method of controlling same
CN105739936A (en) * 2016-01-26 2016-07-06 广东欧珀移动通信有限公司 User terminal control method and user terminal
CN109151255A (en) * 2018-08-31 2019-01-04 惠州华阳通用电子有限公司 A kind of camera flashing removing method and device based on Photoelectric Detection
JP2020088669A (en) * 2018-11-28 2020-06-04 キヤノン株式会社 Imaging apparatus
CN111355864A (en) * 2020-04-16 2020-06-30 浙江大华技术股份有限公司 Image flicker elimination method and device
CN114070993A (en) * 2020-07-29 2022-02-18 华为技术有限公司 Image pickup method, image pickup apparatus, and readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201418619D0 (en) * 2014-10-20 2014-12-03 Apical Ltd Method

Also Published As

Publication number Publication date
CN116723410A (en) 2023-09-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant