CN115526787A - Video processing method and device - Google Patents

Video processing method and device

Info

Publication number
CN115526787A
Authority
CN
China
Prior art keywords
shooting
image
terminal equipment
terminal device
frame rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210193591.3A
Other languages
Chinese (zh)
Other versions
CN115526787B (en)
Inventor
崔瀚涛
王宁
刘虎
蒋明欣
唐智伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311673982.6A priority Critical patent/CN117911299A/en
Priority to CN202210193591.3A priority patent/CN115526787B/en
Publication of CN115526787A publication Critical patent/CN115526787A/en
Priority to PCT/CN2023/071381 priority patent/WO2023160285A1/en
Application granted granted Critical
Publication of CN115526787B publication Critical patent/CN115526787B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/90 - Dynamic range modification of images or parts thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20172 - Image enhancement details
    • G06T2207/20208 - High dynamic range [HDR] image processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the present application provide a video processing method and a video processing device, relate to the technical field of terminals, and are applied to a terminal device. The method includes: the terminal device receives an operation for starting video recording; in response to the operation for starting video recording, the terminal device starts video recording; the terminal device acquires a first image sequence of a shooting scene using first shooting parameters; the terminal device adjusts the shooting parameters according to the shooting scene to obtain second shooting parameters; the terminal device acquires a second image sequence of the shooting scene using the second shooting parameters; and the terminal device obtains a video processing result based on the first image sequence and the second image sequence. In this way, the terminal device can match suitable shooting parameters to the shooting scene and dynamically adjust them as the shooting scene changes, so that the terminal device can obtain a video with a better shooting effect based on the different shooting parameters.

Description

Video processing method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a video processing method and apparatus.
Background
With the popularization and development of the internet, the functional requirements of people for terminal devices are becoming more diversified, for example, users can use the terminal devices to capture High Dynamic Range (HDR) videos.
In general, due to the limitation of the HDR capability of a terminal device, the frame rate that HDR video can support is fixed at 30 frames per second (fps), and the shooting effect of HDR video obtained at this frame rate is often unsatisfactory.
Disclosure of Invention
The embodiment of the application provides a video processing method and device, and terminal equipment can match appropriate shooting parameters for a shooting scene and dynamically adjust the shooting parameters according to the change of the shooting scene, so that the terminal equipment can obtain a video with a better shooting effect based on different shooting parameters.
In a first aspect, an embodiment of the present application provides a video processing method, which is applied to a terminal device. The method includes: the terminal device receives an operation for starting video recording; in response to the operation for starting video recording, the terminal device starts video recording; the terminal device acquires a first image sequence of a shooting scene using first shooting parameters, where the first shooting parameters indicate the shooting parameters adopted by the terminal device when performing video recording based on dual conversion gain (DCG); the terminal device adjusts the shooting parameters according to the shooting scene to obtain second shooting parameters; the terminal device acquires a second image sequence of the shooting scene using the second shooting parameters; and the terminal device obtains a video processing result based on the first image sequence and the second image sequence. In this way, the terminal device can match suitable shooting parameters to the shooting scene and dynamically adjust them as the shooting scene changes, so that the terminal device can obtain a video with a better shooting effect based on the different shooting parameters.
The first shooting parameter may be DCG setting information in the embodiment of the present application; the second shooting parameter may be binning setting information in the embodiment of the present application.
In one possible implementation manner, before the terminal device receives the operation for starting video recording, the method further includes: the terminal device receives an operation for viewing a setting item corresponding to video recording; in response to the operation for viewing the setting item corresponding to video recording, the terminal device displays a first interface, where the first interface includes a control for setting a video frame rate; the terminal device receives an operation on the control for setting the video frame rate; in response to the operation on the control for setting the video frame rate, the terminal device displays a second interface, where the second interface includes a control for setting the video frame rate to automatic; and the receiving, by the terminal device, of the operation for starting video recording includes: in the case where the video frame rate is automatic, the terminal device receives the operation for starting video recording. In this way, a user can flexibly set the video frame rate according to shooting requirements, for example, set the video frame rate to automatic, which further improves the user's experience of the video shooting function.
In a possible implementation manner, the second shooting parameter is used to indicate the shooting parameter adopted by the terminal device when performing video recording based on binning; the first shooting parameters may include: a parameter used to instruct the image sensor to acquire image data at a first frame rate.
In a possible implementation manner, the adjusting, by the terminal device, the shooting parameter according to the shooting scene to obtain a second shooting parameter includes: when the terminal equipment determines that the state of the terminal equipment meets a first preset state and the brightness of a shooting scene is greater than a brightness threshold value, the terminal equipment adjusts shooting parameters to obtain second shooting parameters; the second shooting parameters may include: parameters for instructing the image sensor to acquire images at a second frame rate; the second frame rate is greater than the first frame rate; or when the terminal device determines that the state of the terminal device meets a first preset state and the brightness of the shooting scene is less than or equal to a brightness threshold, the terminal device adjusts the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: and the parameters are used for indicating the image sensor to acquire the images at the first frame rate. In this way, the terminal device can flexibly adjust the shooting parameters based on the state of the terminal device, the brightness condition of the shooting scene and the like.
The first frame rate may be 30fps in the embodiment of the present application, and the second frame rate may be 60fps in the embodiment of the present application.
In a possible implementation manner, the adjusting, by the terminal device, the shooting parameter according to the shooting scene to obtain a second shooting parameter includes: when the terminal equipment determines that the state of the terminal equipment meets a second preset state and detects a preset pattern of the marquee in a shooting scene, the terminal equipment adjusts shooting parameters to obtain second shooting parameters; the second photographing parameters may include: and the parameters are used for indicating the image sensor to acquire the images at the second frame rate. In this way, the terminal device can flexibly adjust the shooting parameters based on the state of the terminal device, the brightness condition of the shooting scene and the like.
In one possible implementation, the method further includes: when the terminal device determines that the state of the terminal device meets the first preset state and the shooting scene is a high dynamic range (HDR) scene, or when the terminal device determines that the state of the terminal device meets the second preset state and the preset marquee pattern is not detected in the shooting scene, the terminal device reduces the second frame rate in the second shooting parameters to the first frame rate; and the terminal device then adjusts the shooting parameters to obtain the first shooting parameters. Thus, when switching from binning 60 to DCG 30, the terminal device first lowers binning 60 to binning 30 and then switches to DCG, which prevents a DCG 60 configuration from occurring and improves the stability of the image sensor.
In a possible implementation manner, the adjusting, by the terminal device, of the shooting parameters according to the shooting scene to obtain second shooting parameters includes: when the terminal device determines that the temperature of the terminal device is greater than a temperature threshold, the terminal device adjusts the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: a parameter used to instruct the image sensor to acquire images at a third frame rate, where the third frame rate is less than the first frame rate. In this way, the terminal device can flexibly adjust the shooting parameters based on its temperature and prevent an excessively high temperature from affecting its normal operation.
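The shooting-parameter adjustments described in the implementations above can be read as a small decision table. The following Python sketch is illustrative only and is not part of the claimed method: the state names "handheld" and "tripod" stand in for the first and second preset states (a reading suggested by the later description of the gyroscope and acceleration sensors), and the thresholds and the 24/30/60 fps values are assumptions taken from this description.

```python
def select_shooting_params(state: str, brightness: float, hdr_scene: bool,
                           marquee_detected: bool, temperature: float,
                           brightness_threshold: float = 500.0,
                           temp_threshold: float = 45.0) -> dict:
    """Return a hypothetical shooting-parameter dict following the rules above.
    'handheld'/'tripod' stand in for the first/second preset states; all
    threshold values are placeholders for illustration."""
    if temperature > temp_threshold:
        # Overheating: drop to a third, lower frame rate (e.g. binning at 24 fps).
        return {"mode": "binning", "frame_rate": 24}
    if state == "handheld":                      # first preset state
        if hdr_scene:
            return {"mode": "DCG", "frame_rate": 30}
        if brightness > brightness_threshold:
            return {"mode": "binning", "frame_rate": 60}
        return {"mode": "binning", "frame_rate": 30}
    if state == "tripod":                        # second preset state
        if marquee_detected:
            return {"mode": "binning", "frame_rate": 60}
        return {"mode": "DCG", "frame_rate": 30}
    return {"mode": "DCG", "frame_rate": 30}     # default

print(select_shooting_params("handheld", 800.0, False, False, 30.0))
```

Note that this sketch omits the sequencing step described above: when switching from binning at 60 fps back to DCG, the frame rate is first lowered to 30 fps before the mode switch, so that a DCG 60 configuration never occurs.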
In a possible implementation manner, the second shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported; the first shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported.
In a possible implementation manner, the obtaining, by the terminal device, of a video processing result based on the first image sequence and the second image sequence includes: the terminal device receives an operation for ending video recording; and in response to the operation for ending video recording, the terminal device obtains the video processing result based on the first image sequence and the second image sequence. In this way, the terminal device can end the video processing process upon receiving the user's operation for ending video recording.
In a possible implementation manner, the obtaining, by the terminal device, a video processing result based on the first image sequence and the second image sequence includes: the terminal equipment respectively carries out image preprocessing on the first image sequence and the second image sequence to obtain a first image sequence after the image preprocessing and a second image sequence after the image preprocessing; the terminal equipment respectively carries out image post-processing on the first image sequence after the image pre-processing and the second image sequence after the image pre-processing to obtain a first image sequence after the image post-processing and a second image sequence after the image post-processing; and the terminal equipment obtains a video processing result based on the first image sequence after the image post-processing and the second image sequence after the image post-processing. Therefore, the terminal equipment can adjust the pictures of the image sequence based on the processes of image preprocessing, image postprocessing and the like, so that the picture effect of the video processing result is better.
In one possible implementation, the image post-processing includes one or more of: image correction and adjustment processing, local tone mapping processing, or gamma correction processing.
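To make the two-stage pipeline concrete, here is an illustrative sketch (the patent does not specify concrete algorithms; the normalization used as preprocessing and the gamma value are assumptions, with gamma correction chosen as one of the post-processing options listed above).

```python
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    # Placeholder image preprocessing: normalise the RAW frame to [0, 1].
    return frame.astype(np.float64) / frame.max()

def postprocess(frame: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    # One of the listed post-processing options: gamma correction.
    return np.power(frame, 1.0 / gamma)

def process_sequences(seq1, seq2):
    """Apply preprocessing and then post-processing to both image sequences,
    and concatenate them as a stand-in for the video processing result."""
    out1 = [postprocess(preprocess(f)) for f in seq1]
    out2 = [postprocess(preprocess(f)) for f in seq2]
    return out1 + out2

seq_dcg = [np.random.randint(1, 4096, (4, 4)) for _ in range(3)]
seq_binning = [np.random.randint(1, 4096, (4, 4)) for _ in range(3)]
result = process_sequences(seq_dcg, seq_binning)
print(len(result), result[0].shape)
```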
In a second aspect, an embodiment of the present application provides a video processing apparatus including a processing unit. The processing unit is configured to receive an operation for starting video recording; in response to the operation for starting video recording, the processing unit is further configured to start video recording; the processing unit is further configured to acquire a first image sequence of a shooting scene using first shooting parameters, where the first shooting parameters indicate the shooting parameters adopted by the terminal device when performing video recording based on dual conversion gain (DCG); the processing unit is further configured to adjust the shooting parameters according to the shooting scene to obtain second shooting parameters; the processing unit is further configured to acquire a second image sequence of the shooting scene using the second shooting parameters; and the processing unit is further configured to obtain a video processing result based on the first image sequence and the second image sequence.
In a possible implementation manner, the processing unit is further configured to receive an operation for viewing a setting item corresponding to the video recording; the display unit is used for displaying a first interface in response to the operation of viewing the setting item corresponding to the video recording; the first interface includes: a control for setting a video frame rate; the processing unit is also used for receiving the operation of the control for setting the video frame rate; the display unit is used for responding to the operation of the control for setting the video frame rate and displaying a second interface; wherein, the second interface includes: a control for setting the video frame rate to automatic; and the processing unit is also used for receiving the operation for starting video recording by the terminal equipment under the condition that the video frame rate is automatic.
In a possible implementation manner, the second shooting parameter is used to indicate the shooting parameter adopted by the terminal device when performing video recording based on binning; the first shooting parameters may include: a parameter used to instruct the image sensor to acquire image data at a first frame rate.
In a possible implementation manner, when the terminal device determines that the state of the terminal device satisfies a first preset state and the brightness of the shooting scene is greater than a brightness threshold, the processing unit is specifically configured to adjust the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: parameters for instructing the image sensor to acquire images at a second frame rate; the second frame rate is greater than the first frame rate; or when the terminal device determines that the state of the terminal device meets a first preset state and the brightness of the shooting scene is less than or equal to a brightness threshold, the processing unit is specifically configured to adjust the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: and the parameter is used for indicating the image sensor to acquire the image at the first frame rate.
In a possible implementation manner, when the terminal device determines that the state of the terminal device satisfies a second preset state and detects a preset marquee pattern in the shooting scene, the processing unit is specifically configured to adjust the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: a parameter used to instruct the image sensor to acquire images at the second frame rate.
In a possible implementation manner, when the terminal device determines that the state of the terminal device satisfies the first preset state and the shooting scene is a high dynamic range (HDR) scene, or when the terminal device determines that the state of the terminal device satisfies the second preset state and the preset marquee pattern is not detected in the shooting scene, the processing unit is further configured to reduce the second frame rate in the second shooting parameters to the first frame rate; and the processing unit is further configured to adjust the shooting parameters to obtain the first shooting parameters.
In a possible implementation manner, when the terminal device determines that the temperature of the terminal device is greater than the temperature threshold, the processing unit is specifically configured to adjust the shooting parameter to obtain a second shooting parameter; the second photographing parameters may include: parameters for instructing the image sensor to acquire images at a third frame rate; the third frame rate is less than the first frame rate.
In a possible implementation manner, the second shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported; the first shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported.
In a possible implementation manner, the processing unit is specifically configured to receive an operation for ending video recording; in response to the operation of ending the video recording, the processing unit is further specifically configured to obtain a video processing result based on the first image sequence and the second image sequence.
In a possible implementation manner, the processing unit is specifically configured to perform image preprocessing on the first image sequence and the second image sequence, respectively, to obtain a first image sequence after the image preprocessing and a second image sequence after the image preprocessing; the processing unit is further specifically configured to perform image post-processing on the first image sequence after the image pre-processing and the second image sequence after the image pre-processing, respectively, to obtain a first image sequence after the image post-processing and a second image sequence after the image post-processing; the processing unit is further specifically configured to obtain a video processing result based on the first image sequence after image post-processing and the second image sequence after image post-processing.
In one possible implementation, the image post-processing includes one or more of: image correction and adjustment processing, local tone mapping processing, or gamma correction processing.
In a third aspect, an embodiment of the present application provides a video processing apparatus, including a processor and a memory, where the memory is used to store code instructions; the processor is configured to execute the code instructions to cause the electronic device to perform the video processing method as described in the first aspect or any implementation manner of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing instructions that, when executed, cause a computer to perform a video processing method as described in the first aspect or any implementation manner of the first aspect.
In a fifth aspect, a computer program product comprises a computer program which, when executed, causes a computer to perform a video processing method as described in the first aspect or any of the implementations of the first aspect.
It should be understood that the third aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects obtained by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1 is a schematic diagram of binning and DCG provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a software architecture of a terminal device according to an embodiment of the present application;
FIG. 4 is a graph illustrating image sensitivity (ISO value) and dynamic range gain according to an embodiment of the present disclosure;
fig. 5 is a schematic interface diagram for setting a video frame rate according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 7 is a schematic view of an interface for starting video recording according to an embodiment of the present disclosure;
fig. 8 is a schematic flowchart of another video processing method according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application;
fig. 10 is a schematic hardware structure diagram of another terminal device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a chip according to an embodiment of the present disclosure.
Detailed Description
The present application relates to the field of photography, and some terms of the field of photography are described below to facilitate understanding of the methods provided by the present application.
1. Binning
binning is an image readout mode in which charges induced in adjacent picture elements are added together and read out in a one-pixel mode. For example, in the process of shooting an image by the electronic device, light reflected by a target object is collected by a camera, so that the reflected light is transmitted to an image sensor. The image sensor includes a plurality of photosensitive elements, each of which collects a charge of one pixel and performs a binning operation on pixel information. Specifically, binning may combine n × n pixels into one pixel. For example, binning may synthesize adjacent 2 × 2 pixels into one pixel, that is, the color of the adjacent 2 × 2 pixels is presented in the form of one pixel.
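As an illustrative sketch (not part of the patent text), the following Python/NumPy snippet shows how 2 × 2 binning combines adjacent pixels into a single output pixel; the array shapes and the simple summation of charge values are assumptions for demonstration only.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Combine each adjacent 2x2 block of pixels into one pixel by summing
    their charge values, turning an H x W frame into an (H/2) x (W/2) frame."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "frame dimensions must be even"
    # Group the pixels into 2x2 blocks and sum over each block.
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Example: a 4x4 sensor readout becomes a 2x2 binned image, as described for fig. 1.
frame = np.arange(16, dtype=np.uint16).reshape(4, 4)
print(bin_2x2(frame))  # shape (2, 2)
```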
For example, fig. 1 is a schematic diagram of binning and DCG provided in an embodiment of the present application. As shown in fig. 1, when an image contains 4 × 4 pixels, binning can combine each group of adjacent 2 × 2 pixels into one pixel, so that the image sensor merges the 4 × 4 image into a 2 × 2 image and outputs the 2 × 2 image as the image sensor's binning-based output.
2. Dual Conversion Gain (DCG)
An image sensor with double conversion gain DCG capability is provided with two potential wells in one pixel, wherein the two potential wells correspond to different full well capacities and different conversion gains CG, the large full well capacity corresponds to Low Conversion Gain (LCG) and low photosensitivity, and the small full well capacity corresponds to High Conversion Gain (HCG) and high photosensitivity. In this way, the sensor can acquire two images in one exposure using two potential wells (two sensitivities) and two conversion gains in the same scene: an image in a high-sensing mode and an image in a low-sensing mode. The two acquired images are then combined into one image by the electronic device, i.e., the HDR technique.
For example, as shown in fig. 1, after combining adjacent n × n pixels into one pixel, the image sensor may further use two conversion gains, for example, obtaining image data at the two conversion gains based on the HCG and the LCG, respectively, fusing the image data based on the HCG and the image data based on the LCG to obtain a fused image, and outputting the fused image as an image of the image sensor based on the DCG.
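A minimal sketch of this idea follows (assumed details only, not the patent's actual fusion algorithm): the sensor produces one high-conversion-gain frame and one low-conversion-gain frame from the same exposure, and the two are blended per pixel so that dark regions come mainly from the HCG frame while bright, near-saturated regions come from the LCG frame. The gain ratio and weighting function below are placeholder choices.

```python
import numpy as np

def fuse_dcg(hcg: np.ndarray, lcg: np.ndarray,
             gain_ratio: float = 4.0, sat_level: float = 4095.0) -> np.ndarray:
    """Fuse an HCG frame and an LCG frame from the same exposure into one
    HDR frame. Pixels close to saturation in the HCG frame are taken from
    the LCG frame, scaled back up by the conversion-gain ratio."""
    # Weight favours the HCG frame except where it approaches saturation.
    w_hcg = np.clip((sat_level - hcg) / (0.2 * sat_level), 0.0, 1.0)
    return w_hcg * hcg + (1.0 - w_hcg) * lcg * gain_ratio

hcg = np.random.randint(0, 4096, (2, 2)).astype(np.float64)
lcg = hcg / 4.0  # in reality read out separately; assumed here for the demo
print(fuse_dcg(hcg, lcg))
```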
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and effect. For example, the first value and the second value are only used to distinguish different values, and the order of the values is not limited. Those skilled in the art will appreciate that the terms "first," "second," and the like do not limit the quantity or the execution order.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
With the development of technology, users have increasingly high requirements on the shooting effect of videos shot by terminal devices such as mobile phones; therefore, more terminal devices can support shooting HDR video. However, due to the limitation of the HDR capability of the terminal device, the frame rate that HDR video can support is usually 30 frames per second (fps), which cannot meet the different frame-rate requirements of shooting scenes such as bright scenes, moving scenes, and night scenes, and thus affects the video shooting effect. The HDR capability may include: imaging methods based on staggered (stagger) HDR, and imaging methods based on DCG.
For example, for moving scenes and highlight scenes, both the stagger HDR imaging method and the DCG imaging method can only support a frame rate of 30 fps, which affects highlight scenes, moving scenes, or other shooting scenes that need to present more picture information. For example, when the terminal device shoots a moving scene at a frame rate of 30 fps, the shot video may stutter because the frame rate is low, which affects the smoothness of the video.
For example, for dim-light scenes, the DCG-based imaging method causes problems such as noise. It can be understood that, since the DCG-based imaging method is determined by both the HCG and the LCG, the image data output by the DCG-based imaging method is a fusion of the image data obtained based on the HCG and the image data obtained based on the LCG amplified by a factor of 4. When shooting a dark scene, the image data obtained based on the LCG has larger noise than the image data obtained based on the HCG, so the LCG noise is amplified in the dark scene, and the shooting effect of the video is therefore poor when shooting a dark scene based on the DCG method.
Therefore, in a moving scene or a highlight scene, the smoothness of the video is affected by the low frame rate, and in a dim-light scene, the large noise caused by the DCG-based imaging method affects the final video shooting effect.
In view of this, an embodiment of the present application provides a video processing method, where a terminal device may match a proper frame rate for a shooting scene, and dynamically adjust a frame rate according to a change of the shooting scene, so that the terminal device may obtain a video with a better shooting effect based on different frame rates.
It is understood that the terminal device may also be referred to as a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), and so on. The terminal device may be a mobile phone supporting a video recording function, a smart television, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and the like. The embodiments of the present application do not limit the specific technology and the specific device form adopted by the terminal device.
Therefore, in order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application. Exemplarily, fig. 2 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, an indicator 192, a camera 193, a display screen 194, and the like. The sensor module 180 may include a gyroscope sensor 180A, an acceleration sensor 180B, and a temperature sensor 180C.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the terminal device. In other embodiments of the present application, a terminal device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. The different processing units may be separate devices or may be integrated into one or more processors. A memory may also be provided in processor 110 for storing instructions and data.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device, and may also be used to transmit data between the terminal device and a peripheral device. It can also be used to connect earphones and play audio through the earphones. The interface may also be used to connect other terminal devices, such as AR devices.
The charging management module 140 is configured to receive charging input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charging management module 140 and the processor 110.
The wireless communication function of the terminal device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in terminal devices may be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation.
The wireless communication module 160 may provide a solution for wireless communication applied to a terminal device, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), and the like.
The terminal device realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device can implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats.
The digital signal processor is used to process digital signals, and can process digital image signals as well as other digital signals. For example, when the terminal device selects a frequency point, the digital signal processor is used to perform a Fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device may support one or more video codecs. Thus, the terminal device can play or record videos in various encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area.
The terminal device can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The terminal device can listen to music through the speaker 170A, or listen to a handsfree call. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear. The earphone interface 170D is used to connect a wired earphone.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The sensor module 180 may include a gyro sensor 180A, an acceleration sensor 180B, and a temperature sensor 180C.
The gyroscope sensor is used for determining the motion attitude of the terminal equipment. In some embodiments, the angular velocity of the terminal device about three axes (i.e., x, y, and z axes) may be determined by a gyroscope sensor. The gyro sensor may be used for photographing anti-shake.
The acceleration sensor can detect the magnitude of acceleration of the terminal device in various directions (generally three axes). When the terminal equipment is static, the size and the direction of gravity can be detected. The method can also be used for identifying the attitude of the terminal equipment, and is applied to application programs such as horizontal and vertical screen switching, pedometers and the like.
In the embodiment of the application, the gyroscope sensor and the acceleration sensor can be jointly used for detecting the scene where the terminal device is located, for example, whether the terminal device is held by a user or placed in a tripod, and then the terminal device can be matched with a proper frame rate based on different located scenes.
The temperature sensor is used for detecting the temperature condition of the terminal equipment.
In a possible implementation manner, the sensor module may further include one or more of the following sensors, for example: a pressure sensor, a barometric pressure sensor, a magnetic sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, or a bone conduction sensor, etc. (not shown in fig. 2).
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The terminal device may receive a key input, and generate a key signal input related to user setting and function control of the terminal device. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, which is not described herein again.
Exemplarily, fig. 3 is a schematic diagram of a software architecture of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into five layers, which are an Application (APP) layer, an application framework (framework) layer, a system library (library), a Hardware Abstraction Layer (HAL), a kernel (kernel) layer, and the like from top to bottom.
As shown in fig. 3, a camera or the like may be included in the application layer.
In a possible implementation manner, the application layer may further include: gallery, settings, maps, music, etc. (not shown in fig. 3).
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3, the application framework layer may include a camera application programming interface, a media recorder (media recorder), a surface view (surface view), and the like.
In the embodiment of the present application, the media recorder is used to record video or acquire picture data, and the data can be accessed by applications. The surface view is used to display a preview picture.
In a possible implementation manner, the application framework layer may further include: a notification manager, a content manager, a window manager, etc. (not shown in fig. 3), which are not limited in the embodiments of the present application.
As shown in fig. 3, a camera service (camera service) may be included in the system library.
In a possible implementation manner, the system library may further include a plurality of functional modules, for example: a surface manager, a media library, a three-dimensional graphics processing library, and a 2D graphics engine, etc. (not shown in fig. 3).
The purpose of the hardware abstraction layer is to abstract hardware, which can provide a uniform interface for querying hardware devices for applications at an upper layer, such as an interface conforming to the hardware abstraction layer interface description language (HIDL) protocol.
In this embodiment, the hardware abstraction layer may include: a camera pipeline (camera pipeline), a sensing module, a decision module, and an Automatic Exposure Control (AEC).
In the embodiment of the present application, the camera pipeline can be called by the camera service in the system library.
The perception module is used for identifying a shooting scene based on data such as the brightness condition of a preview picture, a shooting object, and state data (such as gyroscope data and acceleration data) of the terminal equipment, and sending the identified scene to the decision module. The sensing module can count the gray level histogram of the preview picture and the gray level condition of the pixel points in the preview picture based on the automatic exposure control module.
The decision module is used for matching proper frame rates for different shooting scenes based on the corresponding relation between the shooting scenes and the frame rates, and instructing corresponding sensors in the camera to output image sequences corresponding to the frame rates.
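As a hedged sketch of how the perception and decision modules might cooperate (the module names follow fig. 3, while the histogram-based brightness estimate, the gyroscope stillness threshold, and the scene-to-mode table are assumptions added here for illustration):

```python
import numpy as np

def perceive_scene(preview: np.ndarray, gyro_magnitude: float) -> str:
    """Perception module: classify the shooting scene from the preview
    frame's grey-level histogram and the device's motion data."""
    hist, _ = np.histogram(preview, bins=256, range=(0, 255))
    mean_luma = float((hist * np.arange(256)).sum() / hist.sum())
    on_tripod = gyro_magnitude < 0.02            # assumed stillness threshold
    if on_tripod:
        return "tripod"
    return "handheld_bright" if mean_luma > 100 else "handheld_dark"

# Decision module: map each recognised scene to a sensor working mode.
SCENE_TO_MODE = {
    "handheld_bright": ("binning", 60),
    "handheld_dark": ("binning", 30),
    "tripod": ("DCG", 30),
}

preview = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
scene = perceive_scene(preview, gyro_magnitude=0.5)
print(scene, SCENE_TO_MODE[scene])
```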
In a possible implementation manner, the positions of the sensing module, the decision module, and the automatic exposure module may be set in other layers according to requirements, which is not limited in this embodiment of the application.
In a possible implementation manner, the hardware abstraction layer may further include: an audio interface, a video interface, a telephony interface, and a Global Positioning System (GPS) interface, etc. (not shown in fig. 3).
The kernel layer is a layer between hardware and software. The inner core layer may include: display driving, camera driving, and the like.
In the embodiment of the application, different image sensor setting (sensor setting) strategies can be set in the camera drive, so that the terminal device can realize the switching of the sensor setting at different frame rates. Exemplarily, table 1 is a schematic diagram of a sensor setting provided in an embodiment of the present application.
TABLE 1 sensor setting schematic table
Sensor setting              | Frame rate | Data storage | Output format | Phase focusing
DCG setting information     | 30 fps     | 12-bit       | RAW12         | PDAF supported
binning setting information | 1-60 fps   | 12-bit       | RAW12         | PDAF supported
As shown in table 1, the sensor settings may include: DCG setting information and binning setting information. The image output data that the DCG setting information can support may include: a frame rate of 30 fps, 12-bit data storage, an output format of RAW12, and support for phase detection autofocus (PDAF); the image output data that the binning setting information can support may include: a frame rate of 1-60 fps, 12-bit data storage, an output format of RAW12, and support for PDAF.
It is understood that the binning setting information can support multiple sensor working modes at the same time: a frame rate of 24 fps corresponding to binning 24, a frame rate of 30 fps corresponding to binning 30, and a frame rate of 60 fps corresponding to binning 60.
It is understood that binning only has valid data in the upper 10 bits, so the binning output needs the lower two bits to be padded to ensure 12-bit data storage.
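A one-line illustration of that padding (illustrative only; the shift-based padding is an assumption about how the low bits are filled): the 10-bit binning sample is shifted left by two bits so that it occupies the same 12-bit RAW12 container as the DCG output.

```python
import numpy as np

raw10 = np.array([0, 512, 1023], dtype=np.uint16)  # 10-bit binning samples
raw12 = raw10 << 2                                  # pad the two low bits -> 12-bit range
print(raw12)                                        # [   0 2048 4092]
```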
In a possible implementation manner, the image output data that any set of sensor settings can support may further include: output resolution, output frame rate, data transfer rate, and the like.
It can be understood that, because the DCG setting information and the binning setting information in the sensor settings have the same data transfer rate and the same number of data storage bits, the image sensor can switch seamlessly between the sensor working mode corresponding to DCG and the sensor working mode corresponding to binning. When switching between these two working modes, a frame may be lengthened by about 10 ms to 20 ms, and a frame shift may be generated at the anti-shake stage of image post-processing.
In a possible implementation manner, table 1 is only used as an example of the sensor setting, and in the possible implementation manner, the sensor setting may further include more setting information, which is not limited in this embodiment of the present application.
It is understood that binning and DCG may correspond to different dynamic ranges (DR). For example, fig. 4 is a diagram illustrating image sensitivity (ISO value) and dynamic range gain according to an embodiment of the present disclosure. In the embodiment corresponding to fig. 4, the horizontal axis is the ISO value and the vertical axis is DR (dB).
The lower the ISO value, the brighter the shooting scene, and the higher the DR of the images obtained based on binning and DCG respectively; in brighter scenes the DR of DCG is higher than that of binning, so DCG is more suitable for scenes with higher brightness.
The higher the ISO value, the darker the shooting scene, the shorter the exposure time, and the lower the DR of the images obtained based on binning and DCG respectively. Since both binning and DCG produce more noise in darker scenes, a frame rate of 30 fps is usually used to obtain image data of a dark scene; and because the power consumption of binning is lower than that of DCG, the terminal device may tend to use binning to obtain image data in dark scenes.
The following describes an exemplary workflow of the software and hardware of the terminal device with reference to a video generation scenario and the embodiment corresponding to fig. 3.
When the touch sensor receives a user's operation for starting the video recording mode in the camera application, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as the touch coordinates and the timestamp of the touch operation) and stores the original input event in the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. The camera application then calls an interface of the application framework layer and starts the video recording mode in the camera application. Through the camera API in the application framework layer and the camera service in the system library, the camera application instructs the perception module in the hardware abstraction layer to perform scene recognition on the shot picture and send the recognized scene to the decision module; the decision module determines a suitable sensor working mode for the current shooting scene based on the correspondence between shooting scenes and sensor working modes, and sends the sensor working mode to the camera driver in the kernel layer, so that the camera driver can acquire an image sequence based on the sensor setting corresponding to that sensor working mode; the camera driver divides the acquired image sequence into a preview stream and a video stream, the preview stream is sent to the surface view, and the video stream is sent to the media recorder for encoding and storage on the terminal device.
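To make the stream split at the end of that workflow concrete, here is a hedged sketch (the function and queue names are invented for illustration and do not correspond to any real driver API): every captured frame is forwarded to both a preview path and a recording path.

```python
from queue import Queue

preview_stream: Queue = Queue()   # consumed by the surface view for display
video_stream: Queue = Queue()     # consumed by the media recorder for encoding

def dispatch_frames(frames):
    """Camera-driver-side split: each captured frame is forwarded to both
    the preview stream and the video (recording) stream."""
    for frame in frames:
        preview_stream.put(frame)
        video_stream.put(frame)

dispatch_frames(["frame0", "frame1", "frame2"])
print(preview_stream.qsize(), video_stream.qsize())  # 3 3
```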
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following embodiments may be implemented independently or in combination, and details of the same or similar concepts or processes may not be repeated in some embodiments.
It can be understood that, in order to ensure that the terminal device can automatically adjust the frame rate in the video recording process according to the shooting scene, the terminal device can set an automatic video frame rate.
For example, fig. 5 is a schematic interface diagram for setting a video frame rate according to an embodiment of the present application. In the embodiment corresponding to fig. 5, a terminal device is taken as an example for description, and this example does not limit the embodiment of the present application.
When the terminal device receives an operation of opening the camera application by a user, the terminal device may display an interface shown by a in fig. 5, where the interface may be a main interface (or an interface understood to correspond to a photographing mode) of the camera application. As shown in fig. 5 a, one or more of the following may be included in the interface, for example: a photographing control corresponding to the photographing mode, a preview image, a control for starting an Artificial Intelligence (AI) photographing function, a control for turning on or off a flash lamp, a setting control 501 for setting a camera application, a control for adjusting a photographing multiple, a control for turning over a camera, a control for opening a gallery, and the like. The interface shown in a in fig. 5 may further include a plurality of function controls in a level one menu of the camera application, for example: a control for starting a night scene mode, a control for starting a portrait mode, a control for starting a photographing mode, a control 502 for starting a video recording mode, a control for starting a movie mode, and the like. The control for opening the gallery can be used for opening the gallery application program. The gallery application program is an application program for managing pictures on electronic devices such as a smart phone and a tablet computer, and may also be referred to as an "album", and the name of the application program is not limited in this embodiment. The gallery application may support various operations, such as browsing, editing, deleting, selecting, etc., by the user on videos stored on the terminal device.
In the interface shown as a in fig. 5, when the terminal device receives an operation of the setting control 501 by the user, the terminal device may display the interface shown as b in fig. 5. As shown in b in fig. 5, the interface may be a setting interface corresponding to the camera application, and may include function controls corresponding to the photographing function, for example: an aspect-ratio control (e.g., supporting a 4:3 photo aspect ratio), a voice-controlled photographing control, a gesture photographing control, a smile-capture control, and the like, where the gesture photographing function supports only the front camera and is triggered by a gesture toward the phone, and the smile-capture function automatically takes a photo when a smile is detected. The interface may further include function controls corresponding to the video recording function, for example: a video resolution control, a video frame rate control 503, a high-efficiency video format control that can save 35% of storage space (video in this format may not be playable on other devices), and an AI movie tone control that can intelligently identify the shot content to match LUT tones and is only supported in non-4K HDR. The video resolution may be 4K or 1080P, and the aspect ratio of the video may be 21:9.
In a possible implementation manner, the terminal device may also enter the setting interface shown as b in fig. 5 based on an interface other than the interface shown as a in fig. 5, which is not limited in this embodiment of the present application.
In the interface shown in b in fig. 5, when the terminal device receives the operation of the video frame rate function control 503 by the user, the terminal device may display the interface shown in c in fig. 5. As shown in c in fig. 5, a video frame rate prompt box 504 may be included in the interface, where the video frame rate prompt box 504 may include: an option for setting the frame rate to 50fps, an option for setting the frame rate to 60fps, an option for setting the frame rate to automatic 505, and the like. In the interface shown in c in fig. 5, the option for setting the frame rate to 50fps may be in a selected state, and other contents displayed in the interface may be similar to the interface shown in b in fig. 5, and are not described again.
In the interface shown as c in fig. 5, when the terminal device receives a user operation for the option 505 for setting the frame rate to automatic, the terminal device may display the interface shown as d in fig. 5. As shown in d in fig. 5, the option 505 for setting the frame rate to automatic may be in a selected state in the interface, and other contents displayed in the interface may be similar to the interface shown in c in fig. 5, and are not described again here.
In a possible implementation manner, when the frame rate is set to automatic, the zoom range supported by the video recording mode in the camera application may be 1x-10x; functions such as beautification and filters may not be supported when the video resolution is 4K; and the video recording mode may support only the main camera and may not support switching among multiple cameras.
It can be understood that, in the case that the terminal device sets the frame rate to be automatic based on the embodiment corresponding to fig. 5, the terminal device may match a suitable frame rate for a shooting scene when starting video recording based on the following video processing method corresponding to fig. 6, so that the terminal device may record a video with a better video picture effect.
Fig. 6 is a schematic flowchart of a video processing method according to an embodiment of the present disclosure.
As shown in fig. 6, the video processing method may include the steps of:
s601, when the terminal device receives the operation that the user starts video recording in the camera application, the terminal device determines a shooting scene by using the sensing module.
In the embodiment of the application, the camera application may be an application supported by a system of the terminal device, or the camera application may also be an application having a video recording function, and the like; the operation of starting video recording may be a voice operation, or may also be a click operation or a slide operation for a control used for starting shooting in a video recording mode; the position and the function of the sensing module can refer to the description in the corresponding embodiment of fig. 2, and are not described herein again.
Fig. 7 is a schematic view of an interface for starting video recording according to an embodiment of the present application. In the interface shown in a in fig. 5, when the terminal device receives an operation of the control 502 for turning on the video recording mode by the user, the terminal device may display the interface shown in a in fig. 7. The interface shown in a in fig. 7 may include: a control 701 for starting video recording, and the like; the other contents displayed in the interface may be similar to the interface shown in a in fig. 5, and details are not repeated here.
In the interface shown as a in fig. 7, when the terminal device receives an operation of the user for the control 701 for starting video recording, the terminal device may display the interface shown as b in fig. 7. The interface shown in b in fig. 7 may include: a control 702 for ending video recording, a control for pausing video recording, a control for shooting during video recording, digital information for indicating the video shooting time, a control for turning on or off a flash, a control for adjusting lens magnification during shooting, and the like.
Further, in a case where the terminal device starts video recording based on the embodiment corresponding to fig. 7, the terminal device may determine a shooting scene by using the sensing module.
For example, the sensing module of the terminal device may sense the state of the terminal device based on the gyroscope sensor and the acceleration sensor, for example, whether the terminal device is in a handheld state or in a tripod state. Further, in the handheld state or the tripod state, the sensing module of the terminal device may also count the brightness of the preview picture based on the automatic exposure control module, and determine whether the current scene meets an HDR scene, a highlight scene, a dim light scene, or the like; or, the sensing module of the terminal device may further determine whether a preset ticker (marquee) scene is currently met based on whether a preset ticker pattern exists in the shooting scene. The ticker scene may be a scene in which the preset ticker pattern appears in the shooting scene; the preset ticker pattern is a cyclically displayed pattern used to test the video frame rate, such as a cyclically lit bulb pattern.
Specifically, one possible implementation in which the sensing module of the terminal device senses the state of the terminal device based on the gyroscope sensor and the acceleration sensor may be: the sensing module of the terminal device may acquire angular acceleration data detected by the gyroscope sensor and acceleration data detected by the acceleration sensor; because hand shake is more obvious when the terminal device is handheld than when it is on a tripod, the detected angular acceleration data and acceleration data are larger in the handheld state. Therefore, when the sensing module determines that the angular acceleration data is greater than an angular acceleration threshold and/or the acceleration data is greater than an acceleration threshold, the terminal device may determine that it is currently in the handheld state; alternatively, when the sensing module determines that the angular acceleration data is less than or equal to the angular acceleration threshold and/or the acceleration data is less than or equal to the acceleration threshold, the terminal device may determine that it is currently in the tripod state.
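For illustration only, the threshold comparison described above may be sketched as follows. This is a minimal sketch, not the actual implementation; the threshold values and function names are assumptions introduced for this example.

```python
# Hypothetical sketch: classifying handheld vs. tripod state from motion sensor data.
# The threshold values below are illustrative assumptions, not values from this application.

ANGULAR_ACCEL_THRESHOLD = 0.05   # rad/s^2, assumed
ACCEL_THRESHOLD = 0.10           # m/s^2 deviation from gravity, assumed

def classify_device_state(angular_accel_samples, accel_samples):
    """Return 'handheld' or 'tripod' based on gyroscope/accelerometer statistics."""
    # Use the mean magnitude of recent samples; hand shake produces larger values
    # in the handheld state than in the tripod state, as described above.
    mean_angular = sum(abs(a) for a in angular_accel_samples) / len(angular_accel_samples)
    mean_accel = sum(abs(a) for a in accel_samples) / len(accel_samples)

    if mean_angular > ANGULAR_ACCEL_THRESHOLD or mean_accel > ACCEL_THRESHOLD:
        return "handheld"
    return "tripod"
```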
Specifically, the sensing module of the terminal device may count the brightness of the preview image, and determine whether the current HDR scene, the highlight scene, or the dim scene is satisfied, which may be implemented as follows: the terminal device may perform down-sampling on the preview picture by 4 times to obtain a preview small picture, and determine whether the ratio of the number of highlight pixels in the preview small picture to all pixels in the preview small picture is greater than a first pixel threshold corresponding to the HDR scene, or whether the ratio of the number of highlight pixels in the preview small picture to all pixels in the preview small picture is greater than a second pixel threshold corresponding to the highlight scene, or whether the ratio of the number of dim pixels in the preview small picture to all pixels in the preview small picture is greater than a third pixel threshold corresponding to the dim scene. Or, the terminal device may set a grayscale histogram corresponding to each of a typical HDR scene, a highlight scene, a dim light scene, and the like, so the terminal device may obtain the grayscale histogram corresponding to the preview picture, determine a similarity between the grayscale histogram corresponding to the preview picture and the grayscale histograms corresponding to each of the typical HDR scene, the highlight scene, the dim light scene, and the like, and further determine the current shooting scene. Both the highlight scene and the dim scene may be non-HDR scenes. The highlight scene may also be determined based on the brightness of the captured scene, for example, when the captured scene does not satisfy the HDR scene and the brightness of the captured scene is greater than (or equal to or greater than) the brightness threshold, the captured scene is a highlight scene; or, when the shooting scene does not satisfy the HDR scene and the brightness of the shooting scene is less than (or equal to or less than) the brightness threshold, the shooting scene is a dim light scene.
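For illustration only, the pixel-ratio statistics described above may be sketched as follows. The downsampling factor follows the text; the gray-level cutoffs, the ratio thresholds, and the ordering of the checks are assumptions introduced for this example.

```python
# Hypothetical sketch of the pixel-ratio based brightness scene classification.
# All numeric values and the priority of the checks are illustrative assumptions.

HIGHLIGHT_PIXEL_LEVEL = 200       # gray level above which a pixel counts as "highlight" (assumed)
DIM_PIXEL_LEVEL = 30              # gray level below which a pixel counts as "dim" (assumed)
HDR_RATIO_THRESHOLD = 0.20        # "first pixel threshold" (assumed value)
HIGHLIGHT_RATIO_THRESHOLD = 0.50  # "second pixel threshold" (assumed value)
DIM_RATIO_THRESHOLD = 0.50        # "third pixel threshold" (assumed value)

def downsample_4x(gray_image):
    """Keep every 4th pixel in each direction to obtain a small preview picture."""
    return [row[::4] for row in gray_image[::4]]

def classify_brightness_scene(gray_image):
    """Classify a 2D 8-bit preview picture as 'highlight', 'hdr', 'dim' or 'normal'."""
    small = downsample_4x(gray_image)
    pixels = [p for row in small for p in row]
    total = len(pixels)
    highlight_ratio = sum(1 for p in pixels if p >= HIGHLIGHT_PIXEL_LEVEL) / total
    dim_ratio = sum(1 for p in pixels if p <= DIM_PIXEL_LEVEL) / total

    # Assumed ordering: the highlight threshold is checked before the HDR threshold.
    if highlight_ratio > HIGHLIGHT_RATIO_THRESHOLD:
        return "highlight"
    if highlight_ratio > HDR_RATIO_THRESHOLD:
        return "hdr"
    if dim_ratio > DIM_RATIO_THRESHOLD:
        return "dim"
    return "normal"
```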
Specifically, one possible implementation in which the sensing module of the terminal device determines whether the preset ticker scene is currently met based on whether the preset ticker pattern exists in the shooting scene may be: the terminal device may identify objects in the current shooting scene based on a preset AI model, and when the terminal device identifies, based on the preset AI model, an object containing the preset ticker pattern in the shooting scene, such as a billboard, it may determine that the ticker scene is currently met.
In a possible implementation manner, besides scenes in which a video is actually shot, the video recording process of the electronic device may also be used in some test processes. Such a test process uses a specific preset pattern for testing, and in order to better match the specific test process, the sensor operating mode may be switched based on the preset pattern in combination with the scene and the highlight condition.
It can be understood that the sensing module of the terminal device may also identify the state of the terminal device, the current brightness scene, the ticker scene, and the like based on other methods, which is not specifically limited in this embodiment of the application.
And S602, the terminal equipment determines a sensor working mode corresponding to the shooting scene by using a decision module.
In the embodiment of the present application, the decision module may store a corresponding relationship between the shooting scene and the sensor operating mode.
In a possible implementation manner, the decision module may further match the corresponding sensor operating mode based on the temperature of the terminal device, for example, a high temperature state corresponding to the case where the temperature of the terminal device is greater than the temperature threshold. Therefore, the terminal equipment can avoid the influence of high temperature on the terminal equipment by reducing the frame rate.
Exemplarily, table 2 is a schematic table of a correspondence between a shooting scene and a sensor operating mode provided in an embodiment of the present application.
Table 2 schematic table of correspondence between shooting scene and sensor operation mode
Shooting scene | Sensor operating mode
Handheld state, HDR scene | DCG30
Handheld state, highlight scene | binning60
Handheld state, dim light scene | binning30
Tripod state, ticker scene | binning60
Tripod state, non-ticker scene | DCG30
High temperature state (temperature greater than the temperature threshold) | binning24
It is understood that the relationship between the dynamic range gain (DR) of DCG (or binning) and the image sensitivity (ISO value) may be: the higher the ISO value (or, put differently, the dimmer the scene), the smaller the DR of DCG (or binning), so an output mode with a frame rate of 30fps can be adopted; also, since DCG produces larger noise in a dim light scene, binning30 may be employed in the dim light scene.
The following describes an example of switching between sensor operating modes in different scenes, with reference to the correspondence between the shooting scenes and the sensor operating modes corresponding to table 2.
In the embodiment of the present application, when the image sensor is started, the image sensor may start the DCG30 by default.
Further, in the case where the image sensor starts up the DCG30 by default, the terminal device may determine different operation modes by recognizing the hand-held state and the tripod state. The method for determining the hand-held state and the tripod state may refer to the description in the step shown in S601, and is not described herein again.
In one implementation, when the terminal device determines that the terminal device is currently in a handheld state and meets the HDR scene, the terminal device may determine that DCG30 can be adopted for the current scene, and further instruct the image sensor, so that the image sensor may continue to maintain the operating mode of the DCG30.
In another implementation, when the terminal device determines that it is currently in the handheld state and the highlight scene is satisfied, the terminal device may determine that the current scene may adopt binning60, and further instruct the image sensor, so that the image sensor may be switched from the DCG30 to binning60.
In a possible implementation manner, in the case that the image sensor is in binning60, when the terminal device determines that it is currently in the handheld state and the HDR scene is satisfied, the terminal device may determine that DCG30 may be adopted for the current scene, and further instruct the image sensor. When the image sensor receives the instruction to switch to DCG30, it may first drop from binning60 to binning30, and then switch from binning30 to DCG30. It can be understood that switching the image sensor from binning60 to binning30 first, and then from binning30 to DCG30, avoids the image sensor entering DCG60 and improves the stability of the image output of the image sensor.
In another implementation, when the terminal device determines that the current scene is the handheld state and meets the dark scene, the terminal device may determine that the current scene may adopt the binning30, and further instruct the image sensor, so that the image sensor may be switched from the DCG30 to the binning30.
In yet another implementation, when the terminal device determines that the current state is a tripod state and the ticker scene is satisfied, the terminal device may determine that the current scene may adopt binning60, and then instruct the image sensor so that the image sensor may be switched from the DCG30 to binning60.
In yet another implementation, when the terminal device determines that the current state is a tripod state and does not satisfy the ticker scene (or is understood to satisfy a non-ticker scene), the terminal device may determine that the current scene may employ the DCG30, and then instruct the image sensor so that the image sensor may continue to maintain the operating mode of the DCG30.
In yet another implementation, when the terminal device determines that the temperature of the current terminal device is greater than the temperature threshold, the terminal device may determine that the current scene may adopt binning24, and further instruct the image sensor so that the image sensor may be switched from the DCG30 to binning24.
It is understood that when the terminal device determines that the current temperature of the terminal device is greater than the temperature threshold, the terminal device may determine to use the operating mode of binning24 without recognizing other scenes.
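For illustration only, the switching logic of the above implementations may be summarized by the following sketch. The mode names follow the text; the function structure and the default behavior are assumptions introduced for this example.

```python
# Hypothetical sketch of the decision module described above.
# Mode names (DCG30, binning30, binning60, binning24) follow the text; the
# function structure is an illustrative assumption.

def decide_sensor_mode(state, scene, temperature, temperature_threshold):
    """Map the sensed shooting scene to a target sensor operating mode."""
    # High temperature overrides all other scene recognition (see the text above).
    if temperature > temperature_threshold:
        return "binning24"
    if state == "handheld":
        if scene == "hdr":
            return "DCG30"
        if scene == "highlight":
            return "binning60"
        if scene == "dim":
            return "binning30"
    elif state == "tripod":
        if scene == "ticker":
            return "binning60"
        return "DCG30"
    return "DCG30"  # default mode when the image sensor starts up

def switch_sensor_mode(current_mode, target_mode):
    """Return the sequence of modes to apply, inserting binning30 when going
    from binning60 to DCG30 so that DCG60 is never entered."""
    if current_mode == "binning60" and target_mode == "DCG30":
        return ["binning30", "DCG30"]
    if current_mode == target_mode:
        return []
    return [target_mode]
```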
Further, the decision module may send the sensor operation mode to the image sensor.
And S603, the terminal equipment obtains an image sequence by shooting with a sensor setting matched with the sensor working mode.
In this embodiment of the application, the camera of the terminal device may determine the sensor setting corresponding to the working mode based on table 1, and acquire image data based on that sensor setting to obtain an image sequence.
It can be understood that, when the decision module of the terminal device determines the sensor working mode, the working mode corresponding to the DCG is switched to the working mode corresponding to the binning, or the working mode corresponding to the binning is switched to the working mode corresponding to the DCG, the terminal device may obtain the image sequence based on the sensor setting corresponding to the switched working mode; further, parameters involved in the processing processes such as the automatic exposure module, the pre-processing of the image (or called front-end processing of the image signal processor), and the post-processing of the image (or called back-end processing of the image signal processor) can be adjusted according to the switched sensor operating mode.
As shown by the dotted line frame corresponding to the camera in fig. 6, the camera is used for collecting an image: reflected light from the object passes through and is refracted by the lens and converges on the image sensor. The image sensor may convert the optical signal into an analog electrical signal. The analog electrical signal is output from the front end of the sensor and then converted by an analog-to-digital converter. It will be appreciated that the analog-to-digital converter outputs a RAW digital image, i.e., an image in RAW format, which is the image acquired by the camera.
And S604, the terminal equipment performs image preprocessing on the image sequence to obtain the image sequence after the image preprocessing.
In the embodiment of the present application, the image preprocessing is used to process an image in RAW (or referred to as RAW image data) format acquired by a camera into an image in YUV (or understood as luminance and chrominance) format.
It will be appreciated that the image preprocessing process may include one or more of the following, for example: a dead pixel removal correction process, a RAW domain noise reduction process, a black level correction process, an optical shading correction process, an automatic white balance process, a color interpolation process, a color correction process, a global tone mapping process, or an image conversion process. The image preprocessing process is not limited in the embodiment of the present application.
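Purely for illustration, the preprocessing may be organized as an ordered chain of stages that turns RAW data into a YUV image, as in the following sketch. The stage names follow the list above; the stage bodies are stubs and the exact ordering is an assumption, since real ISP stages typically run in hardware.

```python
# Hypothetical sketch: organizing image preprocessing as an ordered chain of stages.
# Every stage here is a stub that passes data through; only the chaining is shown.

def dead_pixel_correction(data):  return data  # stub: dead pixel removal correction
def raw_domain_denoise(data):     return data  # stub: RAW domain noise reduction
def black_level_correction(data): return data  # stub: black level correction
def shading_correction(data):     return data  # stub: optical shading correction
def auto_white_balance(data):     return data  # stub: automatic white balance
def color_interpolation(data):    return data  # stub: demosaicing, RAW -> RGB
def color_correction(data):       return data  # stub: color correction
def global_tone_mapping(data):    return data  # stub: global tone mapping
def format_conversion(data):      return data  # stub: RGB -> YUV conversion

PREPROCESS_STAGES = [
    dead_pixel_correction, raw_domain_denoise, black_level_correction,
    shading_correction, auto_white_balance, color_interpolation,
    color_correction, global_tone_mapping, format_conversion,
]

def preprocess(raw_frame):
    """Run a RAW frame through every preprocessing stage in order."""
    data = raw_frame
    for stage in PREPROCESS_STAGES:
        data = stage(data)
    return data
```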
As shown by the dashed line box corresponding to the image signal processor in fig. 6, the image signal processor is configured to perform the relevant processing on the image in RAW format from the camera and generate an image to be displayed. Further, the image signal processor can send the image to be displayed to the display screen for displaying. The image signal processor may include: image preprocessing corresponding to the front end of the image processor, image post-processing corresponding to the back end of the image processor, and the like.
And S605, the terminal equipment respectively takes the image sequence after the image preprocessing as a preview stream and a video stream, and performs image post-processing to obtain a first image sequence after the image post-processing corresponding to the preview stream and a second image sequence after the image post-processing corresponding to the video stream.
In the embodiment of the present application, the image post-processing may include one or more of the following, for example: image correction and adjustment processing, local tone mapping processing, gamma (Gamma) correction processing, and the like.
In the image rectification and adjustment process, the terminal device may perform an anti-shake process on the current image, for example, crop the current image data, so that the cropped image may counteract the influence caused by shake of the terminal device. Illustratively, the terminal device may obtain angular acceleration data by using a gyroscope sensor, and obtain a transformation (warp) matrix corresponding to the current image through an electronic anti-shake process. Further, the terminal device may cut the current image data by using a warp matrix, for example, 10% to 20% of the image data is respectively cut in the horizontal and vertical directions of the current image data, so as to counteract the influence caused by shaking.
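A much simplified sketch of the cropping step described above is given below. It ignores the per-frame warp matrix and uses only a plain translation offset; the margin ratio, offsets, and function name are assumptions introduced for this example.

```python
# Hypothetical, much simplified sketch of electronic anti-shake cropping.
# A real implementation applies a warp matrix per frame; here only a crop window
# shifted by an estimated shake offset (e.g. derived from gyroscope data) is shown.

def stabilize_crop(frame, offset_x, offset_y, margin_ratio=0.1):
    """Crop `margin_ratio` of the frame on each side, shifting the crop window by
    the estimated shake offset so that the visible picture counteracts the motion.
    `frame` is a 2D list of pixel rows; offsets are in pixels (assumed units)."""
    height = len(frame)
    width = len(frame[0])
    margin_y = int(height * margin_ratio)
    margin_x = int(width * margin_ratio)

    # Clamp the shift so the crop window stays inside the frame.
    shift_x = max(-margin_x, min(margin_x, offset_x))
    shift_y = max(-margin_y, min(margin_y, offset_y))

    top = margin_y + shift_y
    left = margin_x + shift_x
    bottom = height - margin_y + shift_y
    right = width - margin_x + shift_x
    return [row[left:right] for row in frame[top:bottom]]
```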
In the local tone mapping process, the terminal device can adjust the overall brightness of the image, so that the brightness-adjusted picture can be closer to the brightness presented in the real world. In the Gamma correction process, the terminal equipment can adjust the brightness of the image, so that the terminal equipment can keep more details of bright parts and dark parts, compress the contrast and keep more color information.
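The Gamma adjustment mentioned above can be sketched as a per-pixel power curve applied through a lookup table, as below. The gamma value and function names are assumptions introduced for this example, not values from the embodiment.

```python
# Hypothetical sketch of Gamma correction using an 8-bit lookup table.
# A gamma below 1 brightens dark regions and compresses highlights, which helps
# keep detail in both the bright and dark parts, as described above.

def build_gamma_lut(gamma=1 / 2.2):
    """Precompute the output level for every 8-bit input level."""
    return [round(255 * ((v / 255) ** gamma)) for v in range(256)]

def apply_gamma(gray_image, lut=None):
    """Apply the lookup table to every pixel of a 2D 8-bit image."""
    lut = lut or build_gamma_lut()
    return [[lut[p] for p in row] for row in gray_image]
```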
In a possible implementation manner, the terminal device may send the first image sequence to the display screen for displaying, so that the first image sequence may be displayed in a preview interface of the terminal device, for example, an interface shown as b in fig. 7. And, when the terminal device receives an operation of the user for the control 702 for ending video recording in the interface shown in b in fig. 7, the terminal device may encode and store the second image sequence as a video, so that the terminal device may play the video upon receiving the operation of the user viewing the video.
Based on the method, the terminal device can match a proper frame rate for the shooting scene, and dynamically adjust the frame rate according to the change of the shooting scene, so that the terminal device can obtain videos with better shooting effects based on different frame rates.
It should be understood that the interface provided in the embodiments of the present application is only an example, and is not to be construed as limiting the embodiments of the present application.
Based on the content described in the foregoing embodiments, in order to better understand the embodiments of the present application, fig. 8 is a schematic flowchart of another video processing method provided in the embodiments of the present application.
S801, the terminal device receives an operation for starting video recording.
The operation for starting the video may be a triggering operation for the control 701 for starting video recording in the interface shown in a in fig. 7.
S802, responding to the operation of starting video recording, and starting video recording by the terminal equipment.
And S803, the terminal equipment acquires a first image sequence of a shooting scene by using the first shooting parameters.
And S804, the terminal equipment adjusts the shooting parameters according to the shooting scene to obtain second shooting parameters.
The first shooting parameter is used for indicating the shooting parameter adopted when the terminal device records the video based on the double conversion gain DCG. It is understood that the first shooting parameter may be the DCG setting information in the embodiment of the present application; the second shooting parameter may be the binning setting information in the embodiment of the present application, and for the DCG setting information and the binning setting information, reference may be made to the description in the embodiment corresponding to table 1, which is not repeated herein.
And S805, the terminal equipment acquires a second image sequence of the shooting scene by using the second shooting parameter.
S806, the terminal device obtains a video processing result based on the first image sequence and the second image sequence.
The terminal device may process the first image sequence and the second image sequence based on S604-S605 in the embodiment corresponding to fig. 6, so as to obtain a video processing result.
It should be understood that the first image sequence and the second image sequence may be image sequences corresponding to a preview stream, or may be image sequences corresponding to a video stream.
Therefore, the terminal equipment can match proper shooting parameters for the shooting scene, and dynamically adjust the shooting parameters according to the change of the shooting scene, so that the terminal equipment can obtain videos with better shooting effects based on different shooting parameters.
In one possible implementation manner, before S801, the method further includes: the terminal equipment receives an operation for checking a setting item corresponding to video recording; responding to the operation of viewing the setting item corresponding to the video recording, and displaying a first interface by the terminal equipment; the first interface comprises: a control for setting a video frame rate; the method comprises the steps that the terminal equipment receives operation aiming at a control for setting the video frame rate; responding to the operation of the control for setting the video frame rate, and displaying a second interface by the terminal equipment; wherein, the second interface includes: a control for setting the video frame rate to automatic; the terminal equipment receives operation for starting video recording, and the operation comprises the following steps: in the case where the video frame rate is automatic, the terminal apparatus receives an operation for starting video recording.
The operation for viewing the setting item corresponding to the video recording may be an operation for a setting control 501 in the interface illustrated in a in fig. 5; the first interface may be the interface shown as b in fig. 5; the control for setting the video frame rate may be a video frame rate functionality control 503 shown in b of fig. 5. The second interface may be the interface shown as c in fig. 5; the control for setting the video frame rate to automatic may be an option 505 for setting the frame rate to automatic as shown in c in fig. 5; in the case where the video frame rate is automatic, it is understood that the option 505 for setting the frame rate to automatic shown by d in fig. 5 may be in a selected state.
In a possible implementation manner, the second shooting parameter is used for indicating a shooting parameter adopted by the terminal device when performing video recording based on merging (binning); the first shooting parameters may include: a parameter for instructing the image sensor to acquire image data at a first frame rate.
In one possible implementation, S804 includes: when the terminal device determines that the state of the terminal device meets a first preset state and the brightness of the shooting scene is greater than a brightness threshold, the terminal device adjusts the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: a parameter for instructing the image sensor to acquire images at a second frame rate; the second frame rate is greater than the first frame rate; or when the terminal device determines that the state of the terminal device meets the first preset state and the brightness of the shooting scene is less than or equal to the brightness threshold, the terminal device adjusts the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: a parameter for instructing the image sensor to acquire images at the first frame rate.
The first preset state may be the handheld state in the embodiment of the present application; the first frame rate may be 30fps in the embodiment of the present application; the second frame rate may be 60fps in the embodiment of the present application; and the shooting scene with the brightness greater than the brightness threshold may be the highlight scene in the embodiment of the present application.
In one possible implementation, S804 includes: when the terminal device determines that the state of the terminal device meets a second preset state and detects a preset ticker pattern in the shooting scene, the terminal device adjusts the shooting parameters to obtain second shooting parameters; the second shooting parameters may include: a parameter for instructing the image sensor to acquire images at the second frame rate.
The second preset state may be a tripod state in the embodiment of the present application.
In one possible implementation, the method further comprises: when the terminal device determines that the state of the terminal device meets a first preset state and the shooting scene meets a High Dynamic Range (HDR) scene, or when the terminal device determines that the state of the terminal device meets a second preset state and a preset pattern of the marquee is not detected in the shooting scene, the terminal device reduces a second frame rate in the second shooting parameters to the first frame rate; and the terminal equipment adjusts the shooting parameters to obtain first shooting parameters.
In one possible implementation, S804 includes: when the terminal equipment determines that the temperature of the terminal equipment is greater than the temperature threshold value, the terminal equipment adjusts the shooting parameters to obtain second shooting parameters; the second photographing parameters may include: parameters for instructing the image sensor to acquire images at a third frame rate; the third frame rate is less than the first frame rate.
In a possible implementation manner, the second shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported; the first shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported.
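For illustration only, the first and second shooting parameters described in these implementations may be represented as in the following sketch. The field names and the data structure are assumptions introduced for this example; the frame rates, the 12-bit storage, the RAW12 output format, and the phase focusing flag follow the text above.

```python
# Hypothetical sketch of the shooting-parameter sets described above.
# Field names are illustrative assumptions; the values follow the text.

from dataclasses import dataclass

@dataclass
class ShootingParameters:
    mode: str            # "DCG" or "binning"
    frame_rate: int      # frames per second
    storage_bits: int    # number of data storage bits
    output_format: str   # RAW data format
    phase_focus: bool    # whether phase focusing is supported

# First shooting parameters: DCG-based recording at the first frame rate (e.g. 30fps).
first_params = ShootingParameters("DCG", 30, 12, "RAW12", True)

# Second shooting parameters: binning-based recording; the frame rate depends on the
# scene (e.g. 60fps for highlight or ticker scenes, 30fps for dim scenes, 24fps when
# the device temperature exceeds the temperature threshold).
second_params_highlight = ShootingParameters("binning", 60, 12, "RAW12", True)
second_params_dim = ShootingParameters("binning", 30, 12, "RAW12", True)
second_params_high_temp = ShootingParameters("binning", 24, 12, "RAW12", True)
```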
In one possible implementation, S806 includes: the terminal equipment receives an operation for finishing video recording; and responding to the operation of finishing video recording, and the terminal equipment obtains a video processing result based on the first image sequence and the second image sequence.
The operation for ending the video recording may be an operation on a control 702 for ending the video recording in the interface shown in b in fig. 7.
In a possible implementation manner, the obtaining, by the terminal device, a video processing result based on the first image sequence and the second image sequence includes: the terminal equipment respectively carries out image preprocessing on the first image sequence and the second image sequence to obtain a first image sequence after the image preprocessing and a second image sequence after the image preprocessing; the terminal equipment respectively carries out image post-processing on the first image sequence after the image pre-processing and the second image sequence after the image pre-processing to obtain a first image sequence after the image post-processing and a second image sequence after the image post-processing; and the terminal equipment obtains a video processing result based on the first image sequence after the image post-processing and the second image sequence after the image post-processing.
In one possible implementation, the image post-processing includes one or more of: image correction and adjustment processing, local tone mapping processing, or gamma correction processing.
The description of the specific process in the image preprocessing may refer to S604 in the embodiment corresponding to fig. 6, and the description of the specific process in the image postprocessing may refer to S605 in the embodiment corresponding to fig. 6, which are not described again here.
The method provided by the embodiment of the present application is explained above with reference to fig. 5 to 8, and the apparatus provided by the embodiment of the present application for performing the method is described below. As shown in fig. 9, fig. 9 is a schematic structural diagram of a video processing apparatus provided in this embodiment of the present application, where the video processing apparatus may be a terminal device in this embodiment of the present application, and may also be a chip or a chip system in the terminal device.
As shown in fig. 9, the video processing apparatus 90 may be used in a communication device, circuit, hardware component, or chip, and includes: a display unit 901, and a processing unit 902. Wherein the display unit 901 is used to support the steps of displaying performed by the video processing method; the processing unit 902 is used to support the steps of the video processing apparatus to perform information processing.
The processing unit 902 and the display unit 901 may be integrated, and the processing unit 902 and the display unit 901 may be in communication.
In one possible implementation, the video processing apparatus may further include: a storage unit 903. The storage unit 903 may include one or more memories, which may be devices in one or more devices or circuits for storing programs or data.
The storage unit 903 may be separate and coupled to the processing unit 902 through a communication bus. The storage unit 903 may also be integrated with the processing unit 902.
Taking a video processing apparatus as an example of a chip or a chip system of a terminal device in this embodiment, the storage unit 903 may store a computer-executable instruction of a method of the terminal device, so that the processing unit 902 executes the method of the terminal device in the above embodiment. The storage unit 903 may be a register, a cache, a Random Access Memory (RAM), or the like, and the storage unit 903 may be integrated with the processing unit 902. The storage unit 903 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and the storage unit 903 may be separate from the processing unit 902.
In one possible implementation, the video processing apparatus may further include: a communication unit 904. Wherein the communication unit 904 is used to support the video processing apparatus to interact with other devices. Illustratively, when the video processing apparatus is a terminal device, the communication unit 904 may be a communication interface or an interface circuit. When the video processing apparatus is a chip or a system of chips in a terminal device, the communication unit 904 may be a communication interface. For example, the communication interface may be an input/output interface, a pin or a circuit, or the like.
The apparatus of this embodiment may be correspondingly used to perform the steps performed in the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 10 is a schematic diagram of a hardware structure of another terminal device according to an embodiment of the present disclosure, and as shown in fig. 10, the terminal device includes a processor 1001, a communication line 1004, and at least one communication interface (fig. 10 takes the communication interface 1003 as an example for illustration).
The processor 1001 may be a general-purpose Central Processing Unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs in accordance with the present disclosure.
The communication lines 1004 may include circuitry to communicate information between the above-described components.
Communication interface 1003, using any transceiver or the like, may be used to communicate with other devices or communication networks, such as ethernet, wireless Local Area Networks (WLAN), etc.
Possibly, the terminal device may further comprise a memory 1002.
The memory 1002 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disk read-only memory (CD-ROM) or other optical disk storage, optical disk storage (including compact disk, laser disk, optical disk, digital versatile disk, blu-ray disk, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be stand alone and coupled to the processor via communication line 1004. The memory may also be integrated with the processor.
The memory 1002 is used for storing computer-executable instructions for executing the present invention, and is controlled by the processor 1001. The processor 1001 is configured to execute computer-executable instructions stored in the memory 1002 to implement the methods provided by the embodiments of the present application.
Possibly, the computer executed instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
In particular implementations, processor 1001 may include one or more CPUs, such as CPU0 and CPU1 in fig. 10, as one embodiment.
In particular implementations, a terminal device may include multiple processors, such as processor 1001 and processor 1005 of fig. 10, as an example. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Exemplarily, fig. 11 is a schematic structural diagram of a chip provided in an embodiment of the present application. Chip 110 includes one or more (including two) processors 1120 and a communication interface 1130.
In some embodiments, memory 1140 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In an embodiment of the present application, the memory 1140 may include a read-only memory and a random access memory and provide instructions and data to the processor 1120. A portion of memory 1140 may also include non-volatile random access memory (NVRAM).
In the illustrated embodiment, the memory 1140, communication interface 1130, and processor 1120 are coupled via a bus system 1110. The bus system 1110 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are labeled as bus system 1110 in FIG. 11.
The method described in the embodiments of the present application may be applied to the processor 1120, or implemented by the processor 1120. Processor 1120 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by instructions in the form of hardware, integrated logic circuits, or software in the processor 1120. The processor 1120 can be a general-purpose processor (e.g., a microprocessor or a conventional processor), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an FPGA (field-programmable gate array) or other programmable logic device, discrete gate, transistor logic device or discrete hardware component, and the processor 1120 can implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present invention.
The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium mature in the field, such as a random access memory, a read only memory, a programmable read only memory, or a charged erasable programmable memory (EEPROM). The storage medium is located in the memory 1140, and the processor 1120 reads the information in the memory 1140, and combines the hardware thereof to complete the steps of the above method.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or a data center integrating one or more available media.
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include both computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another. A storage media may be any target media that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disk read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk storage; the computer readable medium may include a disk memory or other disk storage device. Also, any connecting line may also be properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, includes Compact Disc (CD), laser disc, optical disc, digital Versatile Disc (DVD), floppy disk and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. A video processing method is applied to a terminal device, and the method comprises the following steps:
the terminal equipment receives operation for starting video recording;
responding to the operation of starting video recording, and starting the video recording by the terminal equipment;
the terminal equipment acquires a first image sequence of a shooting scene by using the first shooting parameters; the first shooting parameter is used for indicating the shooting parameter adopted by the terminal equipment when video recording is carried out based on the double conversion gain DCG;
the terminal equipment adjusts shooting parameters according to the shooting scene to obtain second shooting parameters;
the terminal equipment acquires a second image sequence of the shooting scene by using the second shooting parameters;
and the terminal equipment obtains a video processing result based on the first image sequence and the second image sequence.
2. The method according to claim 1, wherein before the terminal device receives an operation for starting video recording, the method further comprises:
the terminal equipment receives an operation for checking a setting item corresponding to the video recording;
responding to the operation of viewing the setting item corresponding to the video recording, and displaying a first interface by the terminal equipment; the first interface comprises: a control for setting a video frame rate;
the terminal equipment receives operation aiming at the control for setting the video frame rate;
responding to the operation of the control for setting the video frame rate, and displaying a second interface by the terminal equipment; wherein the second interface comprises: a control for setting the video frame rate to automatic;
the terminal equipment receives operation for starting video recording, and the operation comprises the following steps: and under the condition that the video frame rate is automatic, the terminal equipment receives an operation for starting video recording.
3. The method of claim 1, wherein the second shooting parameter is used to indicate a shooting parameter used by the terminal device for video recording based on merge binning; the first photographing parameters may include: a parameter for instructing an image sensor to acquire image data at a first frame rate.
4. The method according to claim 3, wherein the terminal device adjusts the shooting parameters according to the shooting scene to obtain second shooting parameters, and the method comprises:
when the terminal equipment determines that the state of the terminal equipment meets a first preset state and the brightness of the shooting scene is greater than a brightness threshold value, the terminal equipment adjusts shooting parameters to obtain second shooting parameters; the second shooting parameter may include: parameters for instructing the image sensor to acquire images at a second frame rate; the second frame rate is greater than the first frame rate;
or when the terminal device determines that the state of the terminal device meets the first preset state and the brightness of the shooting scene is smaller than or equal to the brightness threshold, the terminal device adjusts the shooting parameters to obtain the second shooting parameters; wherein, the second shooting parameters may include: parameters for instructing an image sensor to acquire images at the first frame rate.
5. The method according to claim 3, wherein the terminal device adjusts the shooting parameters according to the shooting scene to obtain second shooting parameters, and the method comprises:
when the terminal equipment determines that the state of the terminal equipment meets a second preset state and a preset pattern of the marquee is detected in the shooting scene, the terminal equipment adjusts shooting parameters to obtain the second shooting parameters; the second shooting parameters may include: a parameter for instructing the image sensor to acquire images at the second frame rate.
6. The method of claim 5, further comprising:
when the terminal device determines that the state of the terminal device meets a first preset state and the shooting scene meets a High Dynamic Range (HDR) scene, or when the terminal device determines that the state of the terminal device meets a second preset state and a preset pattern of the ticker is not detected in the shooting scene, the terminal device reduces the second frame rate in second shooting parameters to the first frame rate;
and the terminal equipment adjusts shooting parameters to obtain the first shooting parameters.
7. The method according to claim 3, wherein the terminal device adjusts shooting parameters according to the shooting scene to obtain second shooting parameters, and the method comprises:
when the terminal equipment determines that the temperature of the terminal equipment is greater than the temperature threshold value, the terminal equipment adjusts shooting parameters to obtain second shooting parameters; the second shooting parameter may include: parameters for instructing the image sensor to acquire images at a third frame rate; the third frame rate is less than the first frame rate.
8. The method according to any one of claims 3 to 7, wherein the second shooting parameter further comprises: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported; the first shooting parameter may further include: a parameter for indicating that the number of data storage bits is 12 bits, a parameter for indicating that the output format is the RAW data format RAW12, and a parameter for indicating that phase focusing is supported.
9. The method according to any one of claims 1 to 8, wherein the obtaining, by the terminal device, a video processing result based on the first image sequence and the second image sequence comprises:
the terminal equipment receives an operation for finishing the video recording;
and responding to the operation of finishing the video recording, and the terminal equipment obtains the video processing result based on the first image sequence and the second image sequence.
10. The method according to claim 9, wherein the obtaining, by the terminal device, the video processing result based on the first image sequence and the second image sequence comprises:
the terminal equipment respectively carries out image preprocessing on the first image sequence and the second image sequence to obtain a first image sequence after image preprocessing and a second image sequence after image preprocessing;
the terminal equipment respectively carries out image post-processing on the first image sequence after the image pre-processing and the second image sequence after the image pre-processing to obtain a first image sequence after the image post-processing and a second image sequence after the image post-processing;
and the terminal equipment obtains the video processing result based on the first image sequence after the image post-processing and the second image sequence after the image post-processing.
11. The method of claim 10, wherein the image post-processing comprises one or more of: an image correction and adjustment process, a local tone mapping process, or a gamma correction process.
12. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the electronic device to perform the method of any of claims 1 to 11.
13. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, causes a computer to carry out the method according to any one of claims 1 to 11.
14. A computer program product, comprising a computer program which, when executed, causes a computer to perform the method of any one of claims 1 to 11.
CN202210193591.3A 2022-02-28 2022-02-28 Video processing method and device Active CN115526787B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202311673982.6A CN117911299A (en) 2022-02-28 2022-02-28 Video processing method and device
CN202210193591.3A CN115526787B (en) 2022-02-28 2022-02-28 Video processing method and device
PCT/CN2023/071381 WO2023160285A1 (en) 2022-02-28 2023-01-09 Video processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210193591.3A CN115526787B (en) 2022-02-28 2022-02-28 Video processing method and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311673982.6A Division CN117911299A (en) 2022-02-28 2022-02-28 Video processing method and device

Publications (2)

Publication Number Publication Date
CN115526787A true CN115526787A (en) 2022-12-27
CN115526787B CN115526787B (en) 2023-10-20

Family

ID=84694950

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311673982.6A Pending CN117911299A (en) 2022-02-28 2022-02-28 Video processing method and device
CN202210193591.3A Active CN115526787B (en) 2022-02-28 2022-02-28 Video processing method and device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311673982.6A Pending CN117911299A (en) 2022-02-28 2022-02-28 Video processing method and device

Country Status (2)

Country Link
CN (2) CN117911299A (en)
WO (1) WO2023160285A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116567407A (en) * 2023-05-04 2023-08-08 荣耀终端有限公司 Camera parameter configuration method and electronic equipment
WO2023160285A1 (en) * 2022-02-28 2023-08-31 荣耀终端有限公司 Video processing method and apparatus
CN117119291A (en) * 2023-02-06 2023-11-24 荣耀终端有限公司 Picture mode switching method and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140204244A1 (en) * 2013-01-18 2014-07-24 Samsung Electronics Co., Ltd. Method and apparatus for photographing in portable terminal
CN108121524A (en) * 2017-12-19 2018-06-05 广东欧珀移动通信有限公司 The adjusting method and device, electronic equipment of electronic equipment image display preview frame per second
CN111107292A (en) * 2019-02-28 2020-05-05 华为技术有限公司 Video frame rate control method and related device
US20200284575A1 (en) * 2014-11-04 2020-09-10 Pixart Imaging Inc. Camera having two exposure modes and imaging system using the same
CN112584030A (en) * 2019-09-27 2021-03-30 中移物联网有限公司 Driving video recording method and electronic equipment
CN113382169A (en) * 2021-06-18 2021-09-10 荣耀终端有限公司 Photographing method and electronic equipment
CN113727016A (en) * 2021-06-15 2021-11-30 荣耀终端有限公司 Shooting method and electronic equipment
CN114079762A (en) * 2020-08-11 2022-02-22 三星电子株式会社 Mobile electronic device with multiple camera modules

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117911299A (en) * 2022-02-28 2024-04-19 荣耀终端有限公司 Video processing method and device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140204244A1 (en) * 2013-01-18 2014-07-24 Samsung Electronics Co., Ltd. Method and apparatus for photographing in portable terminal
US20200284575A1 (en) * 2014-11-04 2020-09-10 Pixart Imaging Inc. Camera having two exposure modes and imaging system using the same
CN108121524A (en) * 2017-12-19 2018-06-05 广东欧珀移动通信有限公司 The adjusting method and device, electronic equipment of electronic equipment image display preview frame per second
CN111107292A (en) * 2019-02-28 2020-05-05 华为技术有限公司 Video frame rate control method and related device
CN113411528A (en) * 2019-02-28 2021-09-17 华为技术有限公司 Video frame rate control method and related device
CN112584030A (en) * 2019-09-27 2021-03-30 中移物联网有限公司 Driving video recording method and electronic equipment
CN114079762A (en) * 2020-08-11 2022-02-22 三星电子株式会社 Mobile electronic device with multiple camera modules
CN113727016A (en) * 2021-06-15 2021-11-30 荣耀终端有限公司 Shooting method and electronic equipment
CN113382169A (en) * 2021-06-18 2021-09-10 荣耀终端有限公司 Photographing method and electronic equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023160285A1 (en) * 2022-02-28 2023-08-31 荣耀终端有限公司 Video processing method and apparatus
CN117119291A (en) * 2023-02-06 2023-11-24 荣耀终端有限公司 Picture mode switching method and electronic equipment
CN116567407A (en) * 2023-05-04 2023-08-08 荣耀终端有限公司 Camera parameter configuration method and electronic equipment
CN116567407B (en) * 2023-05-04 2024-05-03 荣耀终端有限公司 Camera parameter configuration method and electronic equipment

Also Published As

Publication number Publication date
WO2023160285A1 (en) 2023-08-31
CN115526787B (en) 2023-10-20
CN117911299A (en) 2024-04-19
WO2023160285A9 (en) 2024-03-14

Similar Documents

Publication Publication Date Title
WO2020186969A1 (en) Multi-path video recording method and device
US11669242B2 (en) Screenshot method and electronic device
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN115526787B (en) Video processing method and device
CN113810600B (en) Terminal image processing method and device and terminal equipment
CN112532892B (en) Image processing method and electronic device
CN110830730B (en) Apparatus and method for generating moving image data in electronic device
US20230162324A1 (en) Projection data processing method and apparatus
CN113810604B (en) Document shooting method, electronic device and storage medium
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
EP4318383A1 (en) Video processing method and apparatus
CN113572948B (en) Video processing method and video processing device
CN112954251A (en) Video processing method, video processing device, storage medium and electronic equipment
CN115086567A (en) Time-delay shooting method and device
CN113630558B (en) Camera exposure method and electronic equipment
CN113497851B (en) Control display method and electronic equipment
CN115529411B (en) Video blurring method and device
CN115631250B (en) Image processing method and electronic equipment
CN115767290A (en) Image processing method and electronic device
CN115150542A (en) Video anti-shake method and related equipment
CN111294509A (en) Video shooting method, device, terminal and storage medium
CN115460343B (en) Image processing method, device and storage medium
WO2024082863A1 (en) Image processing method and electronic device
CN116095512B (en) Photographing method of terminal equipment and related device
CN111294905B (en) Image processing method, image processing device, storage medium and electronic apparatus

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant