CN110381276B - Video shooting method and electronic equipment - Google Patents

Video shooting method and electronic equipment Download PDF

Info

Publication number
CN110381276B
CN110381276B (application CN201910606551.5A)
Authority
CN
China
Prior art keywords
frame
picture
shooting
electronic device
shot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910606551.5A
Other languages
Chinese (zh)
Other versions
CN110381276A (en)
Inventor
陈彬
朱聪超
李远友
胡斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN110381276A publication Critical patent/CN110381276A/en
Application granted granted Critical
Publication of CN110381276B publication Critical patent/CN110381276B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording

Abstract

The application provides a video shooting method and an electronic device, relates to the technical field of terminals, and can reduce picture flicker during video shooting, thereby improving the shooting quality of videos and the user experience. The method includes the following steps: the electronic device opens a camera application; in response to a user's operation of recording a video, the electronic device displays a shooting interface of the camera application, the shooting interface including a first viewfinder window for displaying the shot pictures during video recording; the electronic device acquires N1 consecutive frames of shot pictures; when the brightness of the N1 frames of shot pictures fluctuates, the electronic device acquires M1 consecutive frames of shot pictures; and after performing deflicker processing on the M1 frames of shot pictures, the electronic device displays them in the first viewfinder window, with the brightness of the M1 frames of shot pictures stabilized within a first value range after the deflicker processing.

Description

Video shooting method and electronic equipment
The present application claims priority to a Chinese patent application entitled "A method for removing flicker in pictures in video recorded at a high frame rate", filed with the China National Intellectual Property Office on 06/05/2019 under application number 201910374503.8, the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the technical field of terminals, in particular to a video shooting method and electronic equipment.
Background
Generally, after a user starts the video shooting function of a terminal such as a mobile phone, the mobile phone records video at a certain frame rate, measured in frames per second (fps). For example, at a frame rate of 60 fps, the mobile phone captures 60 frames of shot pictures per second when shooting a video, and the consecutive frames form a video segment after being encoded. When the mobile phone records video in slow motion or similar modes, the frame rate can be as high as about 1000 fps.
At present, the frequency of the mains alternating current in common use is usually 50 Hz, meaning that the intensity of light powered by it fluctuates at a rate of 50 alternations per second. The frame rate of a mobile phone shooting high-frame-rate video is much higher than the fluctuation frequency of the light, so the user can see obvious flicker in the shot pictures while recording, and the recorded video also exhibits picture flicker.
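To see why a high frame rate exposes mains flicker, the following Python sketch samples a light whose intensity is modulated by 50 Hz mains (the intensity of AC-driven lighting peaks twice per cycle) at a high-speed frame rate. All numbers here are illustrative, not taken from the patent.

```python
import math

MAINS_HZ = 50            # mains AC frequency mentioned in the text
LIGHT_HZ = 2 * MAINS_HZ  # light intensity peaks twice per AC cycle
FRAME_RATE = 960         # illustrative high-speed recording rate (fps)

def frame_brightness(frame_idx):
    """Relative scene brightness at the instant a frame is captured."""
    t = frame_idx / FRAME_RATE
    return 1.0 + 0.3 * math.sin(2 * math.pi * LIGHT_HZ * t)

# Consecutive frames sample different phases of the flicker cycle, so
# their brightness differs frame to frame -- the visible flicker.
samples = [frame_brightness(i) for i in range(10)]
```

Because the frame interval (about 1 ms here) is much shorter than the light's fluctuation period, successive frames land on different phases of the cycle and come out with different brightness, which a viewer perceives as flicker.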
Disclosure of Invention
The application provides a video shooting method and electronic equipment, which can reduce the phenomenon of picture flicker in the video shooting process and improve the shooting quality of videos and the use experience of users.
To achieve the above objective, the following technical solutions are adopted in the present application:
In a first aspect, the present application provides a video shooting method, including: after opening a camera application, the electronic device can use its camera to collect shot pictures; in response to a user's operation of recording a video in the camera application, the electronic device displays a shooting interface of the camera application, the shooting interface including a first viewfinder window for displaying the shot pictures during video recording; subsequently, the electronic device acquires N1 consecutive frames of shot pictures (N1 is an integer greater than 1); when the brightness of the N1 frames of shot pictures fluctuates, indicating that flicker occurs in the current recording scene, the electronic device continues to acquire M1 consecutive frames of shot pictures (M1 is an integer greater than 1); furthermore, the electronic device performs deflicker processing on the M1 frames of shot pictures and displays them in the first viewfinder window, with their brightness stabilized within a first value range after the deflicker processing, thereby reducing or eliminating the picture flicker observed by the user during video shooting and improving the shooting quality of the video and the user's shooting experience.
That is to say, when a user watches a video being recorded in a video shooting interface, all shot pictures presented to the user by the electronic device are shot pictures subjected to flicker removal processing, so that the user does not obviously feel a flicker phenomenon in the shot pictures, and the shooting quality of the video and the use experience of the user are improved when the video is recorded.
It should be noted that the continuous N1 frames of shot pictures may refer to continuous N1 frames of shot pictures collected by the camera; alternatively, the continuous N1 frames of shot pictures may be N1 frames of shot pictures which are extracted from multiple frames of shot pictures collected by the camera and need to be displayed in the first viewfinder window by the electronic device.
In one possible implementation manner, after the electronic device opens the camera application, the method further includes: the electronic device displays a preview interface of the camera application, the preview interface including a second viewfinder window for displaying a preview picture before video recording; subsequently, the electronic device acquires N2 consecutive frames of preview pictures (N2 is an integer greater than 1); when the brightness of the N2 frames of preview pictures fluctuates, indicating that flicker occurs in the current preview scene, the electronic device continues to acquire M2 consecutive frames of preview pictures (M2 is an integer greater than 1); similar to the M1 frames, after the electronic device performs deflicker processing on the M2 frames, the M2 frames of preview pictures can be displayed in the second viewfinder window, with their brightness stabilized within a second value range.
Therefore, when a user watches the preview picture in the preview interface of the video, the preview pictures presented to the user by the electronic equipment are all shot pictures subjected to flicker removal processing, so that the user can not obviously feel the flicker phenomenon in the preview pictures, and the use experience of the user in preview shooting is improved.
Similarly, the continuous N2 frames of preview pictures may refer to N2 consecutive frames of shot pictures collected by the camera; alternatively, the continuous N2 frames of preview pictures may be N2 frames of shot pictures that are extracted from the multiple frames collected by the camera and need to be displayed in the second viewfinder window.
In one possible implementation manner, after the electronic device displays the shooting interface of the camera application, the method further includes: the electronic equipment starts to collect each frame of shooting picture according to the first shooting frequency; similarly, after the electronic device displays the preview interface of the camera application, the method further includes: the electronic equipment starts to collect each frame of preview picture according to a second shooting frequency, and the first shooting frequency is the same as or different from the second shooting frequency. That is to say, the shooting frequency of the electronic device for acquiring the shot picture in the previewing process is the same as or different from the shooting frequency of the electronic device for acquiring the shot picture in the shooting process.
In one possible implementation manner, before the electronic device acquires the continuous N1 frames of shot pictures, the method further includes: and the electronic equipment displays continuous K frames of shot pictures in the first view finding window, the brightness of the K frames of shot pictures is stabilized in a third value range, and K is an integer greater than 1. That is to say, when the electronic device displays the K frames of shot pictures, the shot pictures do not flicker, and at this time, the electronic device does not need to perform the flicker removal processing on the shot pictures.
In one possible implementation manner, after the electronic device acquires the continuous N1 frame shooting pictures, the method further includes: the electronic equipment calculates the variance of the brightness of the N1 frame shooting picture; if the variance is larger than the threshold value, the electronic equipment determines that the brightness of the N1 frame shooting picture fluctuates, and the shooting scene at the moment is a scene with flickering light.
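The variance test described above can be sketched in Python as follows. The text only says "threshold", so the threshold value here is an assumption for illustration, as are the brightness samples.

```python
from statistics import pvariance

FLICKER_VARIANCE_THRESHOLD = 25.0  # assumed value; the text only says "threshold"

def brightness_fluctuates(frame_brightness):
    """Return True when the variance of per-frame mean brightness over the
    N1 captured frames exceeds the threshold, i.e. the scene flickers."""
    return pvariance(frame_brightness) > FLICKER_VARIANCE_THRESHOLD

steady  = [100, 101, 100, 99, 100, 101]   # stable lighting: tiny variance
flicker = [80, 120, 82, 118, 79, 121]     # flickering lighting: large variance
```

Under stable lighting the per-frame brightness barely moves and the variance stays well below any reasonable threshold; under a flickering light the alternating bright and dark frames push the variance far above it, which is what triggers the deflicker path.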
In one possible implementation manner, after the electronic device acquires the continuous M1 frame shooting pictures, the method further includes: the electronic equipment adjusts the brightness of each frame of the M1 frame shooting pictures to be within a first value range.
For example, the electronic device adjusts the brightness of each of the M1 frames of captured pictures to be within a first value range, including: for any one of the M1 frames (i.e., the X-th frame), the electronic device may obtain a luminance average of the latest W frames, where W is an integer greater than 1; furthermore, the electronic device may adjust the brightness of the X-th frame captured image to the brightness average value, where the X-th frame captured image is the last frame of the W-frame captured images.
Therefore, the electronic equipment can compensate the brightness value of each frame of shot picture in the shot scene with the flickering light, so that the brightness of each shot picture tends to be stable, and the flickering phenomenon in the shot picture is weakened or eliminated.
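The sliding-window compensation described above (adjusting the X-th frame's brightness to the mean of the latest W frames) might be sketched as below, operating on per-frame mean brightness values. W and the sample values are illustrative assumptions; a real implementation would scale the frame's pixels to reach the target mean.

```python
W = 4  # number of recent frames averaged; illustrative value

def compensate(frames_brightness):
    """For each frame X, set its brightness to the mean brightness of the
    latest W frames ending at X (fewer at the start of the sequence)."""
    out = []
    for x in range(len(frames_brightness)):
        window = frames_brightness[max(0, x - W + 1): x + 1]
        out.append(sum(window) / len(window))
    return out

# Raw brightness alternating between 80 and 120 under a flickering light.
raw = [80, 120, 80, 120, 80, 120, 80, 120]
stabilised = compensate(raw)
```

After compensation the brightness settles onto the window mean (100 here), so the frame-to-frame swing is far smaller than in the raw sequence, which is the "stabilized within a value range" behaviour the method describes.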
In a possible implementation manner, after the electronic device adjusts the brightness of the shot picture of the xth frame to the brightness mean, the method further includes: and the electronic equipment performs image fusion on the X frame shooting picture and the latest Y frame shooting picture to obtain an X 'frame shooting picture, wherein the X' frame shooting picture is the X frame shooting picture after the deflickering treatment, and Y is an integer greater than 0. After image fusion, the brightness fluctuation of a certain area in the shot picture of the X frame among continuous multiple frames can be weakened or eliminated, and the shot picture of the X' frame after flicker removal is obtained.
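The fusion step can be sketched as a pixel-wise average of frame X with the latest Y frames. Y, the frame size, and the pixel values are illustrative; frames are plain nested lists standing in for luminance images.

```python
Y = 3  # number of previous frames fused with frame X; illustrative

def fuse(frame_x, prev_y_frames):
    """Average frame X with the latest Y frames pixel by pixel, damping
    localised brightness fluctuation across consecutive frames."""
    stack = prev_y_frames + [frame_x]
    h, w = len(frame_x), len(frame_x[0])
    return [[sum(f[r][c] for f in stack) / len(stack)
             for c in range(w)] for r in range(h)]

# A 2x2 picture whose top-left pixel flickers between frames.
prev = [[[80, 100], [100, 100]],
        [[120, 100], [100, 100]],
        [[80, 100], [100, 100]]]
cur  = [[120, 100], [100, 100]]
fused = fuse(cur, prev)
```

The flickering pixel averages out to the steady value while already-stable pixels are unchanged, yielding the deflickered X'-th frame.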
In a possible implementation manner, after the electronic device performs image fusion on the X-th frame shot picture and the latest Y frames of shot pictures to obtain the X'-th frame shot picture, the method further includes: the electronic device determines whether the X'-th frame shot picture includes a motion area in which a moving object is located. If a motion area is included, the fusion may have produced an afterimage (ghost) of the moving object in the X'-th frame shot picture; because no such afterimage exists in the original X-th frame shot picture, the electronic device can replace the motion area in the X'-th frame shot picture with the target area in the X-th frame shot picture, where the target area in the X-th frame corresponds to the motion area in the X'-th frame. In this way, the X'-th frame shot picture finally output by the electronic device eliminates not only the flicker phenomenon but also the afterimage phenomenon in the shot picture.
In one possible implementation manner, the electronic device determining whether the X'-th frame shot picture includes a motion area in which a moving object is located includes: the electronic device compares the X-th frame shot picture with each of the latest Y frames of shot pictures to determine whether a moving object exists in them; if a moving object exists, the electronic device determines the area where the moving object is located in the X'-th frame shot picture as the motion area.
In a possible implementation manner, the method further includes: and responding to the operation that the user finishes recording the video, and the electronic equipment encodes all the shot pictures collected within the recording time into the recorded video and stores the video.
In a possible implementation manner, before the electronic device encodes all captured pictures collected within the recording time into the recorded video and stores the video, the method further includes: and for each frame of shot picture collected in the recording time, the electronic equipment carries out deflicker processing on the shot picture. Therefore, the picture flicker phenomenon does not occur in the video finally stored by the electronic equipment.
In a possible implementation manner, since the flicker phenomenon is more likely to occur when the photographing frequency is high, the first photographing frequency and the second photographing frequency may be set to be greater than 120 frames/second.
In a second aspect, the present application provides a video shooting method, including: after the electronic equipment opens the camera application, entering a slow motion video shooting mode; responding to the operation of starting to record the slow motion video by a user, displaying a shooting interface of the slow motion video by the electronic equipment, wherein the shooting interface comprises a view finding window, and the view finding window is used for displaying a shooting picture in the recording process of the slow motion video; responding to the operation that the user finishes recording the slow motion video, automatically generating the slow motion video by the electronic equipment, wherein the playing time length of the slow motion video is longer than the recording time length; the slow motion video comprises continuous N frames of shooting pictures, and the brightness of the N frames of shooting pictures fluctuates; the slow motion video also comprises continuous M frames of shooting pictures positioned behind the N frames of shooting pictures, the brightness of the M frames of shooting pictures is stabilized in a first value range, and both N and M are integers larger than 1.
That is to say, after the user finishes recording the slow motion video, the electronic device can automatically detect whether the captured shot pictures exhibit flicker. If the brightness of N consecutive frames of shot pictures fluctuates, flicker is present. Furthermore, the electronic device can perform deflicker processing on each frame of shot picture after the N frames, so that when shooting in a flickering scene it automatically generates a slow motion video free of flicker. Therefore, when the user subsequently opens the slow motion video, the electronic device plays back the deflickered frames, improving the shooting quality of video in high-frame-rate recording scenarios.
Of course, during the previewing and recording of the slow motion video, the electronic device can still remove the flicker in the displayed shot pictures according to the above method, so that the user does not obviously perceive flicker when previewing and recording the slow motion video, improving both the shooting quality of video in high-frame-rate recording scenarios and the user experience.
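The statement that a slow motion video's playback duration exceeds its recording duration follows directly from the frame-rate arithmetic; the capture and playback rates below are assumed for illustration.

```python
RECORD_FPS   = 960   # illustrative high-speed capture rate
PLAYBACK_FPS = 30    # illustrative normal playback rate

def playback_seconds(record_seconds):
    """Frames captured at the high rate are played back at the normal
    rate, stretching playback by a factor of RECORD_FPS / PLAYBACK_FPS."""
    frames = record_seconds * RECORD_FPS
    return frames / PLAYBACK_FPS
```

With these rates, one second of recording yields 960 frames, which take 32 seconds to play at 30 fps, so the playing duration is longer than the recording duration, as the second aspect requires.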
In a third aspect, the present application provides an electronic device, comprising: a touchscreen, one or more processors, one or more cameras, memory, and one or more computer programs; the processor is coupled with the touch screen, the camera and the memory, the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device executes any one of the video shooting methods.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the video capturing method of any of the first aspects.
In a fifth aspect, the present application provides a computer program product for causing an electronic device to perform the video capturing method according to any one of the first aspect when the computer program product is run on the electronic device.
It is to be understood that the electronic device according to the third aspect, the computer storage medium according to the fourth aspect, and the computer program product according to the fifth aspect are all configured to execute the corresponding method provided above, and therefore, the beneficial effects achieved by the electronic device can refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Drawings
Fig. 1 is a first schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic view of an operating principle of a camera provided in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a recording principle of a slow motion video according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a video shooting method according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram one of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 6 is a schematic diagram two of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 7 is a schematic diagram three of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 8 is a schematic diagram four of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 9 is a schematic diagram five of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 10 is a schematic diagram six of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 11 is a schematic diagram seven of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 12 is a schematic diagram eight of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 13 is a schematic diagram nine of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 14 is a schematic diagram ten of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 15 is a schematic diagram eleven of an application scenario of a video shooting method according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
For example, the video shooting method provided in the embodiments of the present application may be applied to electronic devices such as a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable electronic device, and a virtual reality device; the embodiments of the present application impose no limitation on the type of electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or use a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 may receive input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The power management module 141 may be configured to monitor performance parameters such as battery capacity, battery cycle count, battery charging voltage, battery discharging voltage, battery state of health (e.g., leakage, impedance), and the like. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include one or more filters, switches, power amplifiers, Low Noise Amplifiers (LNAs), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices that integrate one or more communication processing modules. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. In some embodiments, the mobile phone 100 may include 1 or N cameras, where N is a positive integer greater than 1. The camera 193 may be a front camera or a rear camera. As shown in fig. 2, the camera 193 generally includes a lens and a photosensitive element (sensor), which may be any photosensitive device such as a CCD (charge-coupled device) or a CMOS (complementary metal-oxide-semiconductor) sensor.
Still as shown in fig. 2, in the process of taking a picture, the light reflected by the object to be shot passes through the lens to generate an optical image, the optical image is projected onto the photosensitive element, the photosensitive element converts the received optical signal into an electrical signal, and the camera 193 sends the obtained electrical signal to a DSP (digital signal processing) module for digital signal processing, so as to finally obtain a frame of digital image.
Similarly, during video capture, the camera 193 may continuously capture a plurality of digital images at a certain frame rate, and the plurality of digital images are encoded to form a video segment. Due to the physiology of the human eye, when the frame rate of the viewed pictures is higher than 16 frames per second (fps), the human eye perceives the viewed pictures as continuous; this phenomenon may be referred to as persistence of vision. To ensure the continuity of the video watched by the user, the mobile phone may capture each frame of image at a frame rate higher than 16 fps (e.g., 30 fps or 60 fps) when shooting video. In this way, the pictures in the video appear continuous to the user when watching the video shot by the mobile phone.
The image or video captured by the camera 193 may be output on the mobile phone 100 through the display 194, or the digital image may be stored in the internal memory 121 (or the external memory 120), which is not limited in this embodiment of the present application.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the instructions stored in the internal memory 121, so as to enable the electronic device 100 to execute the video shooting method provided in some embodiments of the present application, as well as various functional applications and data processing. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system; the program storage area may also store one or more application programs (e.g., gallery, contacts, etc.). The data storage area may store data (such as photos, contacts, etc.) created during use of the electronic device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory, such as one or more magnetic disk storage devices, flash memory devices, universal flash storage (UFS), and the like. In other embodiments, the processor 110 may cause the electronic device 100 to execute the video shooting method provided in the embodiments of the present application, as well as various functional applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with one or more microphones 170C. In some embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The sensor 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like, which is not limited in this embodiment.
Of course, the electronic device 100 provided in this embodiment of the application may further include one or more devices such as a key 190, a motor 191, an indicator 192, and a SIM card interface 195, which is not limited in this embodiment of the application.
To facilitate a clear understanding of the following embodiments, a brief description of the related art is first given:
Slow-motion photography, which may also be referred to as high-frame-rate photography, allows the content of a shot picture to be played back at a slow speed. In general, the electronic apparatus 100 uses a high shooting frame rate when performing slow-motion photography. For example, after slow-motion photography is turned on, as shown in fig. 3, the electronic apparatus 100 may capture each frame of the shot picture at a shooting frame rate of 960 fps. That is, the electronic apparatus 100 may continuously capture 960 frames of shot pictures within a shooting time of 1 second. Meanwhile, during the 1-second shooting process, the user can preview the picture being shot in real time in the shooting interface of the camera application. For example, as also shown in fig. 3, the electronic apparatus 100 may extract frames and display them at a display frame rate of 60 fps. That is, the electronic apparatus 100 may extract 60 shot pictures from the 960 shot pictures captured per second and display them in the viewfinder window of the shooting interface, thereby ensuring that the pictures the user sees in the viewfinder window when capturing slow-motion video are continuous.
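The preview extraction described above amounts to simple frame decimation: showing 60 of every 960 captured frames means keeping every 16th frame. A minimal Python sketch under that assumption (function and variable names are illustrative, not from the patent):

```python
def decimate_for_preview(frames, capture_fps=960, display_fps=60):
    """Pick every (capture_fps // display_fps)-th frame for the viewfinder.

    Assumes capture_fps is an integer multiple of display_fps, as in the
    960 fps capture / 60 fps display example.
    """
    stride = capture_fps // display_fps  # 960 // 60 = 16
    return frames[::stride]

# One second of capture at 960 fps yields 60 preview frames.
captured = list(range(960))            # stand-in for 960 captured frames
preview = decimate_for_preview(captured)
```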
Subsequently, when playing the 1-second slow-motion video, the electronic apparatus 100 may encode the 960 frames of shot pictures at an encoding rate lower than 960 fps. For example, the electronic apparatus 100 may encode the captured 960 frames of shot pictures at an encoding rate of 120 fps. At this time, the original 1 second of video content is played back slowly as an 8-second video segment (i.e., 960 fps / 120 fps = 8), thereby realizing the slow-motion playback function.
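The slow-down factor follows directly from the ratio of capture to encoding frame rates; a quick arithmetic check (illustrative helper, not part of the patent):

```python
def playback_seconds(capture_fps, encode_fps, capture_seconds=1.0):
    """Duration of the encoded slow-motion clip for a given capture span."""
    total_frames = capture_fps * capture_seconds
    return total_frames / encode_fps

# 1 s captured at 960 fps, encoded at 120 fps, plays back over 8 s.
duration = playback_seconds(960, 120)
```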
In an indoor lighting scene, the frequency of the alternating current used by the indoor lighting is usually 50 Hz, that is, the lighting brightens and dims periodically 50 times per second. The shooting frame rate used by the electronic apparatus 100 when performing slow-motion photography is much higher than 50 Hz. Therefore, when slow-motion photography is performed in a scene with indoor lighting, each shot picture acquired by the electronic device 100 also exhibits periodic brightness changes that follow the brightness changes of the lighting, so that a picture flicker phenomenon also occurs in the slow-motion video finally recorded by the electronic device 100.
Currently, some mobile phone manufacturers provide a function option of removing flicker after the video recording is finished in order to eliminate the picture flicker phenomenon in the slow motion video. If the flicker removing function option selected by the user is detected, the mobile phone can use a corresponding flicker removing algorithm to carry out flicker removing processing on each collected frame of shot picture, and then the shot picture after flicker removing is coded into the slow motion video. Therefore, the mobile phone can eliminate the phenomenon of picture flicker when playing the slow motion video.
Although this method can remove the picture flicker phenomenon from the finally formed slow-motion video, during the preview and recording of the slow-motion video, the shot pictures displayed in the viewfinder window of the mobile phone are still extracted from the collected shot pictures. As a result, the user can clearly perceive the flicker phenomenon when previewing the shot pictures during slow-motion shooting, which affects the shooting quality and the user's experience when shooting high-frame-rate video.
In contrast, in the embodiment of the present application, the electronic device 100 can remove the flicker phenomenon from the shot picture in the viewfinder window in real time in recording scenarios of high-frame-rate video such as slow-motion shooting. In this way, the user is not disturbed by picture flicker while watching the shot picture during slow-motion recording, which improves the shooting quality of the video and the user's shooting experience.
A video shooting method provided in an embodiment of the present application will be described in detail below by taking a mobile phone as an example of the electronic device 100, and as shown in fig. 4, the method includes steps S301 to S307.
S301, after the mobile phone enters a slow-motion shooting mode, detecting that a user inputs a recording operation.
Generally, a camera application of a mobile phone provides one or more shooting modes, such as photo, panorama, video, slow motion, or time-lapse. After detecting that the user opens the camera application, the mobile phone may call the camera application to open the camera, and as shown in fig. 5, the mobile phone may display a preview interface 401. The preview interface 401 is provided with an option 402 for the slow-motion shooting mode. If the user is detected to select the option 402, the phone may enter the slow-motion capture mode. Certainly, one or more shooting modes such as photo, portrait, panorama, video, or time-lapse may also be set in the preview interface 401, which is not limited in this embodiment of the application.
After the mobile phone enters the slow-motion shooting mode, the mobile phone can display a shooting picture 403 currently captured by the camera in the preview interface 401. Since recording of the slow motion video has not yet started at this time, the shooting screen 403 displayed in real time on the preview interface 401 may be referred to as a preview screen. In addition, preview interface 401 also includes a record button 404 for slow motion video. If it is detected that the user clicks the record button 404 in the preview interface 401, which indicates that the user performs the recording operation in the slow-motion shooting mode, at this time, the mobile phone may continue to perform the following steps S302-S307 to record the slow-motion video.
It should be noted that the frame rate at which the mobile phone captures shot pictures before the user clicks the recording button 404 may be the same as or different from the frame rate after the user clicks the recording button 404. For example, when the shooting frame rate for recording the slow-motion video is 960 fps, after the mobile phone enters the slow-motion shooting mode and before the user clicks the recording button 404, the mobile phone may capture shot pictures at a lower shooting frame rate, for example, 480 fps. Alternatively, after the mobile phone enters the slow-motion shooting mode and before the user clicks the recording button 404, the mobile phone may capture shot pictures at the shooting frame rate of 960 fps. While collecting the shot pictures, the mobile phone can extract the corresponding shot pictures from the collected shot pictures according to a display frame rate of 30 fps or 60 fps and display them in the preview interface 401.
In some embodiments, after the mobile phone enters the slow-motion shooting mode, if the shooting frame rate for the mobile phone to capture the shot picture is high (for example, greater than 120fps), the mobile phone may be triggered to start executing the following steps S302-S307, so that the mobile phone may remove the picture flicker phenomenon occurring in the viewfinder window in real time in the preview interface 401.
In other embodiments, after the mobile phone enters the slow-motion shooting mode, the mobile phone may start to perform the following steps S302-S307 after detecting that the user performs the recording operation (e.g., clicks the recording button 404), so that the mobile phone may remove the image flickering phenomenon occurring in the viewfinder window in real time during the shooting of the slow-motion video.
In addition, the following embodiments are exemplified in a usage scenario in which a slow motion video is captured. It is understood that when the mobile phone records a video at a higher frame rate (e.g., a frame rate greater than 120fps), the mobile phone can remove the flicker phenomenon in the captured picture by performing the following steps S302-S307.
And S302, responding to the recording operation, and enabling the mobile phone to start to acquire each frame of shooting picture acquired by the camera.
Still taking an example in which the user clicks the recording button 404 in the preview interface 401 in fig. 5, the mobile phone may respond to the operation of the user clicking the recording button 404, invoke the camera to collect each frame of shooting picture according to the preset shooting frame rate, and start to record the slow motion video. With the lapse of recording time, the number of frames of the shot pictures collected by the mobile phone is gradually accumulated, and the mobile phone can finally form a slow motion video obtained by the slow motion photography based on the shot pictures.
For example, with the preset shooting frame rate of 960fps, after detecting that the user clicks the recording button 404 in the slow-motion shooting mode, the mobile phone may use the camera to capture each frame of shot picture at the speed of capturing 960 frames per 1 second. And the mobile phone can process the collected shooting picture according to the following method and display the shooting picture after image processing in a view finding window of the shooting interface.
In some embodiments, the mobile phone may perform image processing on each captured frame according to the following method. Alternatively, because the number of frames of shot pictures collected by the mobile phone is large, the shot pictures actually displayed in the viewfinder window are extracted from the shot pictures collected by the mobile phone. For example, the mobile phone may capture shot pictures at a shooting frame rate of 960 fps, but may actually extract frames from the captured pictures for display at a display frame rate of 60 fps. In other embodiments, the mobile phone may perform image processing only on each frame of shot picture to be displayed according to the following method, which is not limited in this application.
S303, the mobile phone determines whether the current recording scene has the picture flicker phenomenon according to the brightness of the continuous N frames of shot pictures, wherein N is an integer larger than 1.
After the mobile phone starts to record the slow motion video, the continuous N frames of shooting pictures recently collected by the camera can be obtained in real time. Or, while the mobile phone collects each frame of shot picture, the mobile phone may further extract the nearest consecutive N frames of shot pictures from the collected shot pictures according to a certain display frame rate (e.g., 30fps or 60fps), where the N frames of shot pictures are shot pictures waiting to be displayed in the shooting interface.
The mobile phone can determine whether the flicker phenomenon exists in the current recorded scene based on the brightness of the N frames of shot pictures, namely determine whether the N frames of shot pictures have brightness fluctuation. If the picture flicker phenomenon exists, the mobile phone can continue to execute the following steps S304-S306 to remove the flicker phenomenon in the shot picture in real time. If the picture flicker phenomenon does not exist, the mobile phone can display the acquired shooting picture in a view finding window of the shooting interface in real time, and continuously record the slow motion video according to the existing video recording method.
For example, as shown in fig. 6, after the mobile phone starts to record the slow motion video, the 1 st shot picture, the 2 nd shot picture and the 3 rd shot picture are respectively acquired. Further, the cellular phone can calculate the luminance Y1 of the 1 st shot picture, the luminance Y2 of the 2 nd shot picture, and the luminance Y3 of the 3 rd shot picture, respectively.
Taking the example of calculating the brightness Y1 of the 1 st frame of shot picture, the mobile phone can convert the 1 st frame of shot picture into the image 1 in YUV format through a preset color coding algorithm. At this time, each pixel point in the image 1 can be represented by three values, i.e., Y (luminance), U (chrominance), and V (chrominance). Further, the mobile phone can calculate the average value of the brightness of all the pixels in the image 1, which is the brightness Y1 of the 1 st frame of the shot picture. The unit of brightness is not limited in this embodiment.
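As a rough sketch of this luminance computation, the Y component can be derived from RGB with the standard BT.601 luma weights and averaged over all pixels. The patent only states that the frame is converted to YUV by "a preset color coding algorithm" and the Y values averaged, so the specific weights below are an assumption:

```python
def frame_luminance(frame):
    """Mean luma of a frame given as rows of (R, G, B) tuples.

    Uses the BT.601 weights Y = 0.299 R + 0.587 G + 0.114 B (an assumed
    choice of color coding; the patent does not name one).
    """
    total = 0.0
    count = 0
    for row in frame:
        for r, g, b in row:
            total += 0.299 * r + 0.587 * g + 0.114 * b
            count += 1
    return total / count
```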
Still as shown in fig. 6, after acquiring the luminances Y1-Y3 of the 1 st to 3 rd frames of photographed pictures, the cell phone can calculate the variance F of the luminance Y1, luminance Y2, and luminance Y3. If the variance F is larger than the preset value, the brightness of the pictures shot from the 1 st frame to the 3 rd frame is fluctuated, and the pictures are likely to be in a flickering scene of light at the moment, so that the mobile phone can determine that the picture flickering phenomenon exists in the current recording scene.
Correspondingly, if the variance F is less than or equal to the preset value, it indicates that the brightness of the shot pictures of the 1 st frame to the 3 rd frame is relatively stable, so that the mobile phone can determine that no picture flicker phenomenon exists in the current recording scene. At this time, the cellular phone can continue to acquire the 2 nd to 4 th photographed pictures, the 3 rd to 5 th photographed pictures, … …. Therefore, the mobile phone can determine whether the picture flicker phenomenon exists in the current recording scene in real time based on the recently acquired 3-frame shooting picture according to the method.
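The detection step amounts to computing the variance of the last N frame luminances and comparing it against a preset value. A minimal sketch (the threshold value is arbitrary for illustration; the patent does not specify one):

```python
def has_flicker(luminances, variance_threshold):
    """Return True if the population variance of recent frame luminances
    exceeds the preset value, i.e. brightness fluctuates (picture flicker)."""
    n = len(luminances)
    mean = sum(luminances) / n
    variance = sum((y - mean) ** 2 for y in luminances) / n
    return variance > variance_threshold
```

With a threshold of 10, three frames of luminance 100, 120, 100 are flagged as flicker, while a stable 100, 100, 100 sequence is not.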
It should be noted that, while the mobile phone determines whether the picture flicker phenomenon currently exists according to the 1st to 3rd captured pictures, the mobile phone may display the 1st to 3rd captured pictures as recorded pictures in the viewfinder window of the shooting interface in real time. At this time, the mobile phone has not yet determined that the shot pictures flicker, so the mobile phone does not perform flicker-removal processing on the 1st to 3rd frames of shot pictures. Consequently, brightness fluctuation also occurs in the 1st to 3rd captured pictures in the slow-motion video recorded by the mobile phone, but because their duration is so short, the flicker in these frames is hard for the human eye to perceive.
S304, if the picture flicker phenomenon exists, after the mobile phone captures the X-th frame shot picture, the brightness of the X-th frame shot picture is adjusted according to the luminance average of the latest M frames of shot pictures.
Wherein, the M frames of shot pictures comprise the Xth frame of shot pictures. For example, the X-th frame photographed picture may be the last frame photographed picture among M frame photographed pictures that are captured recently (or to be displayed).
Because the flicker phenomenon appearing in the shot picture is mainly caused by the fluctuation of the brightness among all the shot pictures, when the flicker phenomenon appears in the shot picture, the mobile phone can adjust the brightness of the shot picture, so that the brightness of all the shot pictures tends to be stable, and the flicker phenomenon appearing in the shot picture is weakened or eliminated.
Still taking the shooting scene illustrated in fig. 6 as an example, after the mobile phone determines that the picture flicker phenomenon currently exists based on the 1 st to 3 rd frame shooting pictures, taking the 4 th frame shooting picture as an example, as illustrated in fig. 7, after the mobile phone acquires the 4 th frame shooting picture (the 4 th frame shooting picture is the latest frame), the luminance average value of the latest 4 frame shooting pictures can be calculated. For example, the cellular phone may calculate the luminance Y1 of the 1 st shot, the luminance Y2 of the 2 nd shot, the luminance Y3 of the 3 rd shot, and the luminance Y4 of the 4 th shot, respectively, according to the method described in step S303. Further, the mobile phone can calculate the average value P of the luminance Y1, the luminance Y2, the luminance Y3 and the luminance Y4, which is the luminance average value of the latest 4 frames of the photographed pictures.
For example, the luminance Y4 of the 4 th frame captured picture is 105, and the luminance average P of the latest 4 frame captured pictures is 110. Then, in order to stabilize the brightness of each shot, the mobile phone may adjust the brightness Y4 of the shot of the 4 th frame from 105 to 110, i.e. adjust the brightness of the shot of the X th frame to the average brightness of the shot of the latest M frames.
Illustratively, still taking the adjustment of the luminance Y4 of the 4th frame shot picture as an example, the mobile phone may multiply the luminance of each pixel in the 4th frame shot picture by an adjustment coefficient. When the luminance Y4 is greater than the luminance average, an adjustment coefficient between 0 and 1 may be set; when the luminance Y4 is smaller than the luminance average, an adjustment coefficient greater than 1 may be set. In this way, by adjusting the luminance of each pixel in the 4th frame shot picture, the luminance of the entire 4th frame shot picture can be adjusted.
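One coefficient satisfying the rule above is simply `mean / frame_luminance`: it is below 1 when the frame is brighter than the recent average and above 1 when it is darker. A sketch under that assumption (the specific formula and the clipping to the 8-bit range are illustrative additions, not stated in the patent):

```python
def adjust_frame_brightness(y_pixels, frame_luminance, target_mean):
    """Scale every pixel's luma so the frame's mean luma matches target_mean.

    coeff < 1 when the frame is brighter than the M-frame average,
    coeff > 1 when it is darker. Clipping to [0, 255] is an assumption.
    """
    coeff = target_mean / frame_luminance
    return [min(255.0, y * coeff) for y in y_pixels]
```

For the example above (Y4 = 105, average P = 110) the coefficient is 110/105 ≈ 1.048, lifting the frame's luminance to the 4-frame average.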
The above embodiment is exemplified by adjusting the brightness of the 4 th frame of captured image, and further exemplified by X being 4, when the mobile phone acquires the 5 th frame of captured image, the mobile phone may continue to adjust the brightness of the 5 th frame of captured image according to the above method based on the brightness average of the latest 4 th frame of captured image (i.e. the 2 nd frame to the 5 th frame of captured image).
Similarly, after the mobile phone determines the flicker phenomenon appearing in the shot picture, the brightness of each latest shot picture can be adjusted according to the method, so that the mobile phone can compensate the brightness value of each shot picture in the shot scene with the flicker of the light, the brightness of each shot picture tends to be stable, and the flicker phenomenon appearing in the shot picture is weakened or eliminated.
In addition, after the mobile phone adjusts the brightness of the shot picture of the X-th frame, the adjusted shot picture of the X-th frame can be displayed in a view finding window of the shooting interface as a recording picture. Alternatively, the mobile phone may continue to perform steps S305 to S306 on the X-th frame captured image after brightness adjustment.
S305, the mobile phone performs image fusion on the X-th frame shot picture after brightness adjustment and the latest Y frames of shot pictures to obtain an X'-th frame shot picture.
In some embodiments, the flicker phenomenon occurring in the photographed picture is related not only to the luminance fluctuation of the entire photographed picture but also to the luminance fluctuation of a partial region in the photographed picture. For example, in a continuous multi-frame shot picture, shot contents in the area a have periodic brightness fluctuations from light to dark and from dark to light, which causes a flicker phenomenon in the area a of the shot picture when recording a slow-motion video.
In contrast, in step S305, taking the 4 th frame shot picture as an example, after the mobile phone adjusts the brightness of the 4 th frame shot picture in step S304, as shown in fig. 8, the mobile phone can perform image fusion on the 4 th frame shot picture and the Y frame shot pictures (for example, the 1 st frame to the 3 rd frame shot pictures) before the 4 th frame shot picture to obtain the 4 th frame shot picture (i.e., the X' th frame shot picture) newly generated after the fusion.
Illustratively, the 1st to 4th frame shot pictures each include a × b pixel points, and each pixel point has a corresponding pixel value. When performing image fusion, for each pixel point, the mobile phone can calculate the average value of that pixel point across the 1st to 4th frame shot pictures, and use the average value as the pixel value of the pixel point in the X'-th frame shot picture. For example, as shown in fig. 8, taking the point A' at the top left corner of the newly generated 4th frame shot picture as an example, the mobile phone can calculate the pixel average of 4 pixels: the point A1 at the top left corner of the 1st shot picture, the point A2 at the top left corner of the 2nd shot picture, the point A3 at the top left corner of the 3rd shot picture, and the point A4 at the top left corner of the 4th shot picture. This pixel average is then used as the pixel value of the corresponding point A' in the fused 4th frame shot picture.
Then, after image fusion, the pixel value of each pixel point in the newly generated 4 th frame shooting picture is the pixel average value of the pixel point in continuous multiple frames, so that the brightness fluctuation of a certain area in the 4 th frame shooting picture among the continuous multiple frames is weakened or eliminated, and the flicker phenomenon appearing in the 4 th frame shooting picture is further eliminated.
Of course, the above embodiment is exemplified by generating a new 4 th frame of captured image through image fusion, and then, for each frame of captured image after brightness adjustment, the fused captured image can be generated according to the above method. For example, after the mobile phone acquires the 5 th frame shot picture, the 5 th frame shot picture and the 2 nd to 4 th frame shot pictures may be subjected to image fusion to generate a fused 5 th frame shot picture. In this case, the 4 th frame captured picture used in the image fusion may be the 4 th frame captured picture newly generated after the image fusion has been performed.
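The per-pixel averaging of S305 can be sketched as follows. Frames are represented here as 2-D grids of scalar pixel values for brevity; real frames would carry full color channels:

```python
def fuse_frames(frames):
    """Average the most recent frames pixel-by-pixel.

    frames: non-empty list of equally sized 2-D grids (rows of pixel values).
    Each output pixel is the mean of that pixel across all input frames,
    which damps per-region brightness fluctuation between consecutive frames.
    """
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / n for j in range(cols)]
            for i in range(rows)]
```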
S306, if the X'-th frame shot picture contains a motion area, the mobile phone uses the corresponding area in the X-th frame shot picture (before fusion) to replace the motion area in the X'-th frame shot picture.
Still taking the above 4 th frame shot picture as an example, after the mobile phone generates a new 4 th frame shot picture through image fusion, although the flicker phenomenon in the shot picture can be eliminated, as shown in fig. 8, when a moving object (for example, a person 701) is included in the multi-frame shot picture subjected to image fusion, an afterimage of the person 701 is generated in the 4 th frame shot picture newly generated after fusion, which affects the shooting effect.
Then, in step S306, the mobile phone may first detect whether a moving object is included in the multiple frames of shot pictures subjected to image fusion in step S305. Still taking the fusion of the 4th frame shot picture with the 1st to 3rd frame shot pictures as an example, the mobile phone can compare the 4th frame shot picture before fusion with each of the 1st to 3rd frame shot pictures.
As shown in fig. 9 (a), the mobile phone may compare each pixel point in the 4th frame shot picture with the pixel point at the same position in the 1st frame shot picture, and if the difference between the pixel points at the same position in the two pictures is greater than a threshold, the mobile phone may regard the pixel point as belonging to a moving object. For example, the mobile phone can recognize a moving object 701 in the 4th frame shot picture and a moving object 702 in the 1st frame shot picture.
Similarly, as shown in (b) of fig. 9, by comparing each pixel point in the 4 th frame captured picture with each pixel point in the 2 nd frame captured picture, the mobile phone can recognize a moving object 701 in the 4 th frame captured picture and a moving object 703 in the 2 nd frame captured picture. As shown in (c) in fig. 9, the mobile phone can recognize a moving object 701 in the 4 th frame captured picture and a moving object 704 in the 3 rd frame captured picture by comparing each pixel point in the 4 th frame captured picture and the 3 rd frame captured picture.
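The per-pixel comparison against a threshold can be sketched as below; the threshold value of 25 is an illustrative assumption, since the embodiment does not specify a value.

```python
import numpy as np

def motion_mask(frame_a, frame_b, threshold=25):
    """Mark pixel points whose values at the same position differ by more
    than `threshold` between two shot pictures as moving-object pixels."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return diff > threshold
```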
Then, as shown in fig. 10, the mobile phone can determine the motion region 901 where the moving objects 701 and 704 are located in the 4th frame shot picture newly generated after the fusion according to the moving objects 701 to 704 identified in the 1st to 4th frame shot pictures. In order to eliminate the afterimage in the motion region 901, as shown in fig. 10, the mobile phone may extract the picture in the region 902 corresponding to the motion region 901 from the pre-fusion 4th frame shot picture, and replace the picture in the motion region 901 of the fused 4th frame shot picture with the picture in the region 902. Since there is no afterimage of a moving object in the region 902 corresponding to the motion region 901 in the pre-fusion 4th frame shot picture, the afterimage in the fused 4th frame shot picture can be eliminated after the motion region 901 is replaced with the region 902. At this time, the 4th frame shot picture finally output by the mobile phone is free of both the flicker phenomenon and the afterimage phenomenon.
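Replacing the motion region 901 of the fused picture with the corresponding region 902 of the pre-fusion picture amounts to a masked copy, sketched here under the same illustrative assumptions as above (single-channel 8-bit frames, a boolean mask marking the motion region):

```python
import numpy as np

def remove_afterimage(fused, pre_fusion, mask):
    """Replace the motion region (True entries of `mask`) in the fused
    shot picture with the corresponding area of the pre-fusion picture."""
    out = fused.copy()
    out[mask] = pre_fusion[mask]
    return out
```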
Similarly, when the mobile phone generates a new 5th frame shot picture, a new 6th frame shot picture, ..., and a new Xth frame shot picture through image fusion, the afterimage phenomenon in these shot pictures can be eliminated according to the above method.
Of course, if the mobile phone detects that the multi-frame shot picture for image fusion does not include a moving object, it indicates that the shot picture of the X 'th frame does not include a moving area at this time, i.e., there is no afterimage in the shot picture of the X' th frame. Then, after the mobile phone obtains the X' th frame shot picture newly generated after the fusion, the following step S307 can be directly executed without executing step S306.
S307, the mobile phone displays the X' th frame shooting picture in the shooting interface.
Still taking the 4 th frame shot picture as an example, in a shooting scene with a flicker phenomenon, the mobile phone can remove the flicker phenomenon in the 4 th frame shot picture by executing the above steps S304-S306 to obtain a newly generated 4 th frame shot picture (i.e., the X' th frame shot picture). Then, in step S307, as shown in fig. 11, after the mobile phone obtains the newly generated 4 th frame shot picture, the newly generated 4 th frame shot picture can be displayed in the view window of the shooting interface 1002 as the recording picture 1001 in real time.
Similarly, the subsequent mobile phone can remove the flicker phenomenon in the shot picture for each frame of shot picture to be displayed according to the method, and display the newly generated X' th frame of shot picture after flicker removal in the view window of the shooting interface 1002 in real time.
Therefore, when the user watches the recorded slow-motion video in the shooting interface 1002, all the shot pictures presented to the user by the mobile phone have undergone the deflicker processing, so the user will not noticeably perceive the flicker phenomenon in the shot pictures, which improves the shooting quality of the video in a high-frame-rate recording scene and the use experience of the user.
In some embodiments, after detecting that the user inputs the recording operation in the slow motion shooting mode, the mobile phone may capture each frame of shooting picture at a preset shooting frame rate (e.g., 960 fps).
After the captured pictures are collected, on one hand, the mobile phone can extract corresponding shot pictures from the collected shot pictures according to a preset display frame rate (for example, 60 fps). Furthermore, the mobile phone can remove the flicker phenomenon in each extracted frame of shot picture according to the method described in the above steps S303 to S307, and display each frame of shot picture after flicker removal in the shooting interface 1002 in real time.
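One plausible reading of the extraction step, not stated explicitly in the embodiment, is to keep every (capture rate / display rate)-th captured frame:

```python
def extraction_stride(capture_fps, display_fps):
    """Number of captured frames per displayed frame, e.g. 960 // 60 = 16."""
    return capture_fps // display_fps

def frames_to_display(captured, capture_fps=960, display_fps=60):
    """Extract the shot pictures to be displayed from the captured ones."""
    return captured[::extraction_stride(capture_fps, display_fps)]
```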
On the other hand, the mobile phone can store each frame of shot picture collected in the recording process. If an operation of ending the recording input by the user is detected, for example, the user clicks the button 401 in the shooting interface 1002, the mobile phone may save all shot pictures acquired during recording in a slow-motion video format. Taking a recording time of 3 s as an example, after detecting that the user inputs the operation of ending the recording, the mobile phone has acquired 2880 frames (960 fps × 3 s) of shot pictures within the 3 s recording time. Furthermore, the mobile phone may perform the deflicker processing on each captured frame of shot picture according to the method described in the above steps S303 to S307. Subsequently, the mobile phone can encode each deflickered shot picture into a slow-motion video whose duration is longer than 3 s, and store the slow-motion video. Therefore, when the user later opens the slow-motion video in the photo album, the mobile phone plays each frame of shot picture subjected to the deflicker processing to the user, thereby improving the shooting quality of the video in a high-frame-rate recording scene.
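The frame-count arithmetic in the example above, together with an assumed normal playback rate of 30 fps (the embodiment only states that the resulting slow-motion video is longer than 3 s), works out as:

```python
capture_fps = 960        # preset shooting frame rate
record_seconds = 3       # recording duration in the example
total_frames = capture_fps * record_seconds     # 2880 frames, as in the text

playback_fps = 30        # assumed normal playback rate (not from the text)
playback_seconds = total_frames / playback_fps  # 96 s, i.e. longer than 3 s
```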
That is to say, in a recording scene with a high shooting frequency, the mobile phone may perform the deflicker processing only on the shot pictures to be displayed during previewing and shooting, without performing the deflicker processing on every collected shot picture in real time. In this way, the picture flicker phenomenon can be removed during previewing and shooting while the processing efficiency and calculation speed of the mobile phone are improved.
In other embodiments, after the mobile phone acquires each captured picture at a preset capturing frame rate (e.g., 240fps), the mobile phone may perform the deflicker processing on each captured picture in real time according to the method described in the above steps S303 to S307. Furthermore, the mobile phone can extract a corresponding shooting picture from the shooting pictures after the flicker removal and display the corresponding shooting picture in the shooting interface 1002. And after detecting that the user inputs the operation of finishing recording, the mobile phone can code each shot picture after flicker removal into a section of slow motion video for storage.
In this way, because the mobile phone performs the deflicker processing on each captured frame of shot picture in real time, the mobile phone does not need to perform the deflicker processing on the shot picture again when generating the slow motion video of this time, and thus the calculation amount of the mobile phone in the process of recording the slow motion video once can be reduced.
In the above embodiment, the example that the user starts recording the slow motion video in the lighting scene is illustrated, and it can be understood that, when the mobile phone records the video with the high frame rate, if the flicker phenomenon of the picture in the current recording scene is detected at any time, the mobile phone can remove the flicker phenomenon in the collected shot picture according to the above method.
For example, the mobile phone may enter the slow motion shooting mode in a lighted scene.
After the mobile phone enters the slow motion shooting mode, although the recording of the slow motion video is not started, the mobile phone needs to display a preview picture in a view finding window of a preview interface. Therefore, the mobile phone can start to capture each frame of the shot picture at a preset shooting frequency (e.g., 480 fps). Furthermore, the mobile phone can extract the shot pictures to be displayed from the collected shot pictures according to the display frame rate of 60 fps.
As shown in fig. 12, in the indoor lighting scene, after the mobile phone extracts the first N frames of shot pictures, as described in step S303, the mobile phone may determine whether there is a picture flicker phenomenon in the current preview scene according to the brightness of the first N frames of shot pictures. Illustratively, as also shown in fig. 12, if the brightness of the first N frames of the photographed picture fluctuates greatly, the cell phone can determine that there is a picture flickering phenomenon in the current preview scene. And, the mobile phone can display the former N frames of shooting pictures in a view finding window of the preview picture.
Further, from the extracted N +1 th frame of the shot picture, as shown in fig. 12, the mobile phone may perform the flicker removal processing on each of the extracted frames of the shot picture (for example, the N +1 th frame of the shot picture, the N +2 th frame of the shot picture, … …) according to the method described in the above steps S304 to S307, and display each of the frames of the shot picture without flicker in the preview interface in real time. Therefore, the brightness of each frame of shot picture after the flicker removing processing is gradually stabilized at a certain value. That is, before recording the slow motion video, the flicker phenomenon is suppressed when the shot picture is previewed in the lighting scene, and the shot picture viewed by the user in the preview interface does not have obvious flicker phenomenon.
Subsequently, the user can switch the mobile phone to a scene without light flicker to start recording the video.
As shown in fig. 13 (a) - (c), in an outdoor unlighted scene, the mobile phone may start recording a slow motion video in response to a recording operation input by the user in the preview interface. For example, the mobile phone may capture each frame of shot pictures at a shooting frequency of 480fps, and the mobile phone may extract K frames of shot pictures to be displayed from the captured shot pictures at a display frame rate of 60 fps. Meanwhile, as stated in step S303, the mobile phone may determine whether a picture flicker phenomenon exists in the current recording scene according to the recently extracted N frames of captured pictures in real time.
For example, the mobile phone can calculate the brightness of each of the K frames of shot pictures. Since no light flickers in the current shooting scene, the brightness of each frame of shot picture tends to be stable. For example, as shown in (a)-(c) of fig. 13, the luminance value of the K frames of shot pictures is stabilized at about 95. Then, the variance of the brightness of the latest K frames of shot pictures is relatively small, and the mobile phone can determine that no picture flicker phenomenon exists in the current recording scene. At this time, the mobile phone does not need to perform deflicker processing on the K frames of shot pictures; each extracted frame can be directly displayed in the shooting interface as a recording picture, and the user will not perceive any flicker when watching the video being recorded in the shooting interface.
Subsequently, the user can switch the recording scene of the slow motion video from the outdoor scene without light to the indoor scene with light to continue recording. After entering an indoor lighting scene, the mobile phone can continue to collect each frame of shot picture according to the shooting frequency of 480fps, and extract the shot picture to be displayed according to the display frame rate of 60 fps.
As shown in (a)-(c) of fig. 14, after entering the room, the mobile phone can extract the K+1th, K+2th, and K+3th frame shot pictures. Because the indoor light flickers at the frequency of the alternating current, the brightness of these three frames fluctuates. For example, as shown in (a)-(c) of fig. 14, the mobile phone can calculate the luminance Y(K+1) of the K+1th frame shot picture as 110, the luminance Y(K+2) of the K+2th frame shot picture as 120, and the luminance Y(K+3) of the K+3th frame shot picture as 116. Further, as described in step S303, the mobile phone can determine that the picture flicker phenomenon exists in the current recording scene according to the brightness of these 3 consecutive frames of shot pictures.
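The flicker check described in step S303 (and in claim 1: compute the mean and variance of the luminance of N consecutive frames and compare the variance with a threshold) can be sketched as follows; the variance threshold of 10 is an illustrative assumption, as the embodiment does not give a value.

```python
import statistics

def flicker_detected(luminances, var_threshold=10.0):
    """Compute the mean and variance of per-frame luminance; a variance
    above the threshold indicates a picture flicker phenomenon."""
    mean = statistics.fmean(luminances)
    variance = statistics.pvariance(luminances, mu=mean)
    return variance > var_threshold

# Indoor lighted scene of fig. 14: Y = 110, 120, 116 -> flicker
# Outdoor scene of fig. 13: Y stays near 95 -> no flicker
```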
Meanwhile, the mobile phone can display the extracted K+1th, K+2th, and K+3th frame shot pictures in the shooting interface in real time. At this time, the mobile phone does not perform deflicker processing on the K+1th to K+3th frame shot pictures. As a result, brightness fluctuation also occurs in the K+1th to K+3th frame shot pictures of the slow-motion video recorded by the mobile phone, but because this duration is very short, human eyes can hardly perceive the picture flicker in these shot pictures.
Furthermore, as shown in (a)-(c) of fig. 15, after determining that the picture flicker phenomenon exists in the recording scene, the mobile phone may perform the deflicker processing on each subsequently extracted frame of shot picture (for example, the K+4th frame shot picture, the K+5th frame shot picture, ...) according to the method described in the above steps S304-S307, and display each deflickered frame in the shooting interface in real time. At this time, the user will not noticeably perceive the flicker phenomenon in the shot pictures.
For example, as shown in (a) in fig. 15, after the mobile phone acquires the K +4 th frame of shot picture, a new K +4 th frame of shot picture may be generated according to the method described in the above steps S304-S307, the flicker phenomenon in the newly generated K +4 th frame of shot picture is removed, and the mobile phone may display the newly generated K +4 th frame of shot picture in the shooting interface. At this time, the luminance Y (K +4) of the K +4 th frame captured picture newly generated is 113.
Further, as shown in (b) of fig. 15, after the mobile phone acquires the K+5th frame shot picture, a new K+5th frame shot picture may be generated in the same manner as in the above steps S304-S307, the flicker phenomenon in the newly generated K+5th frame shot picture is removed, and the mobile phone may display the newly generated K+5th frame shot picture in the shooting interface. At this time, the luminance Y(K+5) of the newly generated K+5th frame shot picture is 114.
Similarly, as shown in (c) of fig. 15, after the mobile phone acquires the K+6th frame shot picture, a new K+6th frame shot picture may be generated according to the method described in the above steps S304-S307, the flicker phenomenon in the newly generated K+6th frame shot picture is removed, and the mobile phone may display the newly generated K+6th frame shot picture in the shooting interface. At this time, the luminance Y(K+6) of the newly generated K+6th frame shot picture is 114.
Therefore, after the picture flicker phenomenon exists in the recording scene, the mobile phone conducts flicker removal processing on each frame of shot picture to be displayed, so that the brightness of each frame of shot picture is stabilized at about 114, the flicker phenomenon of the shot picture in the lighting scene is inhibited, and the shooting quality of the video in the high-frame-rate recording scene and the use experience of a user are improved.
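The brightness adjustment that keeps each frame near the running mean (claim 6 only states that the Xth frame's brightness is adjusted to the mean brightness of the latest W frames; uniform scaling is one plausible way to do that, assumed here for illustration) can be sketched as:

```python
import numpy as np

def stabilize_brightness(frame, recent_luminances):
    """Scale the frame so that its mean luminance equals the mean
    luminance of the latest W shot pictures."""
    target = float(np.mean(recent_luminances))
    current = float(frame.mean())
    gain = target / current if current else 1.0
    return np.clip(frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```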
Subsequently, when it is detected that the user finishes the recording operation of the slow motion video input, the mobile phone can perform frame-by-frame deflickering processing on the shot pictures collected within the recording time according to the method, and finally codes each deflickered shot picture into the slow motion video for storage. Therefore, when the user opens the slow motion video in the photo album subsequently, the mobile phone can play each frame of shooting picture subjected to flicker removal processing to the user, and therefore the shooting quality of the video in a high-frame-rate recording scene is improved.
An embodiment of the present application discloses an electronic device, which includes a processor, and a memory, an input device, and an output device that are connected to the processor. The input device and the output device may be integrated into one device; for example, a touch sensor may be used as the input device, a display screen may be used as the output device, and the touch sensor and the display screen may be integrated into a touch screen.
At this time, as shown in fig. 16, the electronic device may include: a touch screen 1601, the touch screen 1601 including a touch sensor 1606 and a display 1607; one or more processors 1602; one or more cameras 1608; a memory 1603; one or more application programs (not shown); and one or more computer programs 1604, which may be connected via one or more communication buses 1605. Wherein the one or more computer programs 1604 are stored in the memory 1603 and configured to be executed by the one or more processors 1602, the one or more computer programs 1604 comprise instructions that may be used to perform the steps in the embodiments. All relevant contents of the steps related to the above method embodiment may be referred to the functional description of the corresponding entity device, and are not described herein again.
For example, the processor 1602 may specifically be the processor 110 shown in fig. 1, the memory 1603 may specifically be the internal memory 116 and/or the external memory 120 shown in fig. 1, the camera 1608 may specifically be the camera 193 shown in fig. 1, the display 1607 may specifically be the display 194 shown in fig. 1, and the touch sensor 1606 may specifically be a touch sensor in the sensor module 180 shown in fig. 1, which is not limited in this embodiment of the present invention.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the part of the technical solutions of the embodiments of the present application that essentially contributes to the prior art, or all or part of the technical solutions, may be implemented in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, or the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (25)

1. A video capture method, comprising:
the electronic equipment opens a camera application and enters a shooting mode of the slow motion video;
responding to the recording operation of a user for recording a video, and displaying a shooting interface of the camera application by the electronic equipment, wherein the shooting interface comprises a first view finding window which is used for displaying a shooting picture in the video recording process;
the electronic equipment acquires continuous N1 frames of shooting pictures, wherein N1 is an integer larger than 1;
the electronic equipment calculates the mean value and the variance of the brightness of the N1 frame shooting pictures;
if the variance is larger than a threshold value, the electronic equipment determines that the brightness of the N1 frame shooting picture fluctuates;
when the brightness of the N1 frame shot fluctuates, the electronic equipment acquires continuous M1 frames of shot, wherein M1 is an integer larger than 1;
and after the electronic equipment conducts deflickering processing on the M1 frame shot picture, the M1 frame shot picture is displayed in the first view finding window, and the brightness of the M1 frame shot picture is stabilized in a first value range after deflickering processing.
2. The method of claim 1, after the electronic device opens the camera application, further comprising:
the electronic equipment displays a preview interface of the camera application, wherein the preview interface comprises a second view finding window which is used for displaying a preview picture before video recording;
the electronic equipment acquires continuous N2 frames of preview pictures, wherein N2 is an integer larger than 1; when the brightness of the N2 frame preview picture fluctuates, the electronic equipment acquires continuous M2 frame preview pictures, wherein M2 is an integer larger than 1;
and after the electronic equipment carries out deflickering treatment on the M2 frame preview pictures, the M2 frame preview pictures are displayed in the second view finding window, and the brightness of the M2 frame preview pictures is stabilized in a second value range.
3. The method of claim 2, further comprising, after the electronic device displays a capture interface of the camera application:
the electronic equipment starts to collect each frame of shooting picture according to a first shooting frequency;
after the electronic device displays the preview interface of the camera application, further comprising:
the electronic equipment starts to collect each frame of preview picture according to a second shooting frequency, and the first shooting frequency is the same as or different from the second shooting frequency.
4. The method according to any one of claims 1-3, wherein before the electronic device obtains the continuous N1 frames of pictures, further comprising:
and the electronic equipment displays continuous K frames of shooting pictures in the first view finding window, the brightness of the K frames of shooting pictures is stabilized in a third value range, and K is an integer greater than 1.
5. The method according to any one of claims 1-3, wherein after the electronic device acquires the continuous M1 frame pictures, further comprising:
and the electronic equipment adjusts the brightness of each frame of shooting picture in the M1 frame shooting pictures to be within the first value range.
6. The method of claim 5, wherein the electronic device adjusts the brightness of each of the M1 frames to be within the first value range, and comprises:
the electronic equipment acquires the brightness mean value of a latest W frame shooting picture, wherein W is an integer larger than 1;
the electronic equipment adjusts the brightness of the shot picture of the X frame to be the brightness mean value, the shot picture of the X frame is the last frame in the shot pictures of the W frame, and the shot picture of the X frame is one frame in the shot pictures of the M1 frames.
7. The method according to claim 6, wherein after the electronic device adjusts the brightness of the X-th frame captured picture to the brightness mean, the method further comprises:
and the electronic equipment performs image fusion on the X-frame shooting picture and the latest Y-frame shooting picture to obtain an X '-frame shooting picture, wherein the X' -frame shooting picture is the X-frame shooting picture after the flicker removal processing, and Y is an integer greater than 0.
8. The method according to claim 7, wherein after the electronic device performs image fusion on the X-th frame shot and the latest Y-frame shot to obtain an X' -th frame shot, the method further comprises:
the electronic equipment determines whether a motion area where a moving object is located is included in the X' th frame shooting picture;
and if the motion area is included, the electronic equipment replaces the motion area in the shooting picture of the X' th frame with a target area in the shooting picture of the Xth frame, and the target area in the shooting picture of the Xth frame corresponds to the motion area in the shooting picture of the X' th frame.
9. The method of claim 8, wherein the electronic device determining whether a motion region in which a moving object is located is included in the shot of the X' th frame comprises:
the electronic equipment respectively compares the X frame shooting picture with the Y frame shooting picture to determine whether a moving object exists in the X frame shooting picture and the Y frame shooting picture;
and if the moving object exists, the electronic equipment determines the area of the moving object in the X' th frame shooting picture as the motion area.
10. The method according to any one of claims 1-3, further comprising:
and responding to the operation that the user finishes recording the video, and the electronic equipment encodes all the shot pictures collected within the recording time into the recorded video and stores the video.
11. The method according to claim 10, before the electronic device encodes all captured pictures collected during the recording time into the recorded video and stores the encoded video, further comprising:
and for each frame of shot picture collected in the recording time, the electronic equipment carries out deflicker processing on the shot picture.
12. The method of claim 3, wherein the first capture frequency and the second capture frequency are each greater than 120 frames/second.
13. An electronic device, comprising:
the touch screen comprises a display screen and a touch sensor;
one or more cameras;
one or more processors;
a memory;
wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the steps of:
after a camera application is opened, entering a shooting mode of a slow motion video, and starting the camera;
responding to the recording operation of a user for recording videos, and displaying a shooting interface of the camera application, wherein the shooting interface comprises a first view finding window which is used for displaying shooting pictures in the video recording process;
acquiring continuous N1 frames of shot pictures, wherein N1 is an integer greater than 1;
the electronic equipment calculates the mean value and the variance of the brightness of the N1 frame shooting pictures;
if the variance is larger than a threshold value, the electronic equipment determines that the brightness of the N1 frame shooting picture fluctuates;
when the brightness of the N1 frame shot pictures fluctuates, acquiring continuous M1 frame shot pictures, wherein M1 is an integer larger than 1;
and after the flicker removal processing is carried out on the M1 frame shot picture, the M1 frame shot picture is displayed in the first view finding window, and the brightness of the M1 frame shot picture is stabilized in a first value range after the flicker removal processing.
14. The electronic device of claim 13, wherein after the electronic device opens the camera application, the electronic device is further configured to perform:
displaying a preview interface of the camera application, wherein the preview interface comprises a second view finding window which is used for displaying a preview picture before video recording;
acquiring continuous N2 frames of preview pictures, wherein N2 is an integer larger than 1; when the brightness of the N2 frame preview picture fluctuates, acquiring continuous M2 frame preview pictures, wherein M2 is an integer larger than 1;
and after the flicker removal processing is carried out on the M2 frame preview pictures, the M2 frame preview pictures are displayed in the second view finding window, and the brightness of the M2 frame preview pictures is stabilized in a first value range.
15. The electronic device of claim 14, wherein after the electronic device displays the camera application's capture interface, the electronic device is further configured to perform:
starting to collect each frame of shooting picture according to a first shooting frequency;
after the electronic device displays the preview interface of the camera application, the electronic device is further configured to perform:
and starting to acquire each frame of preview picture according to a second shooting frequency, wherein the first shooting frequency is the same as or different from the second shooting frequency.
16. The electronic device of any of claims 13-15, wherein before the electronic device obtains consecutive N1 frames of shots, the electronic device is further configured to perform:
and displaying continuous K frames of shooting pictures in the first view finding window, wherein the brightness of the K frames of shooting pictures is stabilized in a third value range, and K is an integer greater than 1.
17. The electronic device according to any of claims 13-15, wherein after the electronic device obtains consecutive M1 frame shots, the electronic device is further configured to perform:
and adjusting the brightness of each frame of the M1 frame shot pictures to be within the first value range.
18. The electronic device of claim 17, wherein adjusting the brightness of each of the M1 frames of shot pictures to be within the first value range specifically comprises:
acquiring the mean brightness of the latest W frames of shot pictures, wherein W is an integer greater than 1;
and adjusting the brightness of the X-th frame shot picture to the mean brightness, wherein the X-th frame shot picture is the last of the W frames of shot pictures and is one of the M1 frames of shot pictures.
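The rolling-mean adjustment of claim 18 could look like the sketch below. The gain-based scaling is an assumption: the claim only says the brightness is "adjusted to the mean", without fixing the operator.

```python
import numpy as np

def stabilize_brightness(frames, w):
    """Scale the newest frame so its mean luma matches the mean brightness
    of the latest W frames (one plausible reading of claim 18).

    `frames` is a list of 2-D grayscale arrays; the last entry is the
    X-th frame being corrected.
    """
    window = frames[-w:]
    # Target: the average of the per-frame means over the W-frame window.
    target = np.mean([f.mean() for f in window])
    latest = window[-1]
    # Multiplicative gain toward the target mean (clipped to valid range).
    gain = target / max(latest.mean(), 1e-6)
    return np.clip(latest * gain, 0, 255)
```

Averaging over W frames smooths out the periodic component of the flicker, so each corrected frame lands near the same brightness as its neighbours.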
19. The electronic device of claim 18, wherein after the electronic device adjusts the brightness of the X-th frame shot picture to the mean brightness, the electronic device is further configured to perform:
performing image fusion on the X-th frame shot picture and the latest Y frames of shot pictures to obtain an X'-th frame shot picture, wherein the X'-th frame shot picture is the X-th frame shot picture after deflicker processing, and Y is an integer greater than 0.
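As a minimal stand-in for the "image fusion" of claim 19, the sketch below simply averages the X-th frame with the latest Y frames. The patent does not specify the fusion operator, so temporal averaging is an assumption chosen for illustration.

```python
import numpy as np

def fuse_frames(x_frame, prev_frames):
    """Fuse the current frame with the latest Y frames by temporal
    averaging (a simple stand-in; the claim leaves the fusion method
    unspecified). Returns the X'-th frame as a float array.
    """
    # Stack current + previous frames and average along the time axis.
    stack = np.stack([x_frame, *prev_frames]).astype(np.float64)
    return stack.mean(axis=0)
```

Averaging across frames cancels residual frame-to-frame brightness differences, but it also blurs anything that moved — which is exactly why claims 20 and 21 then restore the motion area from the original frame.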
20. The electronic device of claim 19, wherein after the electronic device performs image fusion on the X-th frame shot picture and the latest Y frames of shot pictures to obtain the X'-th frame shot picture, the electronic device is further configured to perform:
determining whether the X'-th frame shot picture includes a motion area where a moving object is located;
and if the motion area is included, replacing the motion area in the X'-th frame shot picture with a target area in the X-th frame shot picture, wherein the target area in the X-th frame shot picture corresponds to the motion area in the X'-th frame shot picture.
21. The electronic device of claim 20, wherein determining whether the X'-th frame shot picture includes a motion area where a moving object is located specifically comprises:
comparing the X-th frame shot picture with each of the Y frames of shot pictures to determine whether a moving object exists in the X-th frame and the Y frames of shot pictures;
and if a moving object exists, determining the area where the moving object is located in the X'-th frame shot picture as the motion area.
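The motion-area handling of claims 20 and 21 could be sketched with plain frame differencing, as below. The per-pixel absolute-difference test and its threshold are assumptions; the claims only require comparing the X-th frame with the Y previous frames and replacing the detected area.

```python
import numpy as np

def motion_mask(x_frame, prev_frames, diff_threshold=15.0):
    """Claim 21 (sketch): mark a pixel as moving if the X-th frame differs
    from any of the Y previous frames by more than `diff_threshold`
    gray levels. Frames are 2-D grayscale arrays; the threshold is an
    assumed value.
    """
    mask = np.zeros(x_frame.shape[:2], dtype=bool)
    for p in prev_frames:
        mask |= np.abs(x_frame.astype(float) - p.astype(float)) > diff_threshold
    return mask

def restore_motion_region(fused, x_frame, mask):
    """Claim 20 (sketch): replace the motion area in the fused (X'-th)
    frame with the corresponding area of the original X-th frame, so
    fusion does not ghost moving objects.
    """
    out = fused.copy()
    out[mask] = x_frame[mask]
    return out
```

The static background keeps the fused, flicker-free values, while the moving object is taken unblurred from the single X-th frame.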
22. The electronic device of any of claims 13-15, wherein the electronic device is further configured to perform:
in response to an operation of the user ending video recording, encoding all shot pictures collected within the recording time into the recorded video for storage.
23. The electronic device of claim 22, wherein before the electronic device encodes all shot pictures collected within the recording time into the recorded video for storage, the electronic device is further configured to perform:
performing deflicker processing on each shot picture collected within the recording time.
24. The electronic device of claim 15, wherein the first capture frequency and the second capture frequency are each greater than 120 frames/second.
25. A computer-readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the video capture method of any of claims 1-12.
CN201910606551.5A 2019-05-06 2019-07-05 Video shooting method and electronic equipment Active CN110381276B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910374503 2019-05-06
CN2019103745038 2019-05-06

Publications (2)

Publication Number Publication Date
CN110381276A CN110381276A (en) 2019-10-25
CN110381276B true CN110381276B (en) 2021-08-13

Family

ID=68252161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910606551.5A Active CN110381276B (en) 2019-05-06 2019-07-05 Video shooting method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110381276B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113132588B (en) * 2020-01-15 2022-09-27 思特威(上海)电子科技股份有限公司 Method for quickly starting imaging system and imaging system using same
CN115379112A (en) * 2020-09-29 2022-11-22 华为技术有限公司 Image processing method and related device
CN113038093B (en) * 2021-03-26 2022-12-16 华东理工大学 System, method, apparatus and storage medium for video image flicker suppression processing
CN113422903B (en) * 2021-06-16 2023-01-03 荣耀终端有限公司 Shooting mode switching method, equipment and storage medium
CN115529419B (en) * 2021-06-24 2024-04-16 荣耀终端有限公司 Shooting method under multiple artificial light sources and related device
CN114363659A (en) * 2021-12-15 2022-04-15 深圳万兴软件有限公司 Method, device, equipment and storage medium for reducing video flicker
GB2620117A (en) * 2022-06-24 2024-01-03 Sony Interactive Entertainment Inc Data processing apparatus and method
CN116743978B (en) * 2023-08-14 2023-11-03 深圳思谋信息科技有限公司 Video processing method, device, computer equipment and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1946143A (en) * 2006-11-07 2007-04-11 北京中星微电子有限公司 Method and device for clearing explosure flash
CN105120133A (en) * 2015-09-14 2015-12-02 浙江大华技术股份有限公司 Method and device for eliminating image flicker phenomenon
CN105139810A (en) * 2015-09-28 2015-12-09 京东方科技集团股份有限公司 Display driving method and device, and display device
CN108111767A (en) * 2018-01-24 2018-06-01 努比亚技术有限公司 A kind of image pickup method, terminal and computer readable storage medium

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3892172B2 (en) * 1998-07-24 2007-03-14 松下電器産業株式会社 Solid-state imaging device
CN100405206C (en) * 2004-05-11 2008-07-23 北京中星微电子有限公司 Method for determining indoor illumination environment being as 50Hz or 60Hz
KR100672328B1 (en) * 2005-01-18 2007-01-24 엘지전자 주식회사 Apparatus for estimation noise level of video signal
JP4207922B2 (en) * 2005-04-19 2009-01-14 ソニー株式会社 Flicker correction method, flicker correction apparatus, and imaging apparatus
JP4335849B2 (en) * 2005-06-13 2009-09-30 富士通マイクロエレクトロニクス株式会社 Imaging device capable of flicker detection
CN100440022C (en) * 2005-09-23 2008-12-03 北京中星微电子有限公司 Determining and eliminating method for scintillation under line exposure mode
JP2009135792A (en) * 2007-11-30 2009-06-18 Olympus Imaging Corp Imaging device and image signal processing method
CN103196550A (en) * 2012-01-09 2013-07-10 西安智意能电子科技有限公司 Method and equipment for screening and processing imaging information of launching light source
CN104104882B (en) * 2013-04-09 2017-08-11 展讯通信(上海)有限公司 Image flicker detection method and device, image capture device
CN103945089A (en) * 2014-04-18 2014-07-23 上海复控华龙微系统技术有限公司 Dynamic target detection method based on brightness flicker correction and IP camera
KR102177725B1 (en) * 2015-06-29 2020-11-12 엘지디스플레이 주식회사 Organic Light Emitting Diode Display Device Including Peak Luminance Control Unit And Method Of Driving The Same
CN105007429B (en) * 2015-08-10 2018-01-19 广东欧珀移动通信有限公司 A kind of method, system and mobile terminal for eliminating flicker
JPWO2017064829A1 (en) * 2015-10-15 2018-08-02 ソニー株式会社 Video signal processing apparatus, video signal processing method and program
CN107220931B (en) * 2017-08-02 2020-08-18 安康学院 High dynamic range image reconstruction method based on gray level mapping
CN107396019B (en) * 2017-08-11 2019-05-17 维沃移动通信有限公司 A kind of slow motion video method for recording and mobile terminal
CN107578760B (en) * 2017-10-25 2019-11-26 深圳市华星光电技术有限公司 Obtain the method and gray level compensation method, system of gray scale compensation gamma value
CN109151255B (en) * 2018-08-31 2021-05-25 惠州华阳通用电子有限公司 Camera flicker elimination method and device based on photoelectric detection
CN109274984B (en) * 2018-10-16 2021-07-16 深圳开阳电子股份有限公司 Method and device for detecting light source flicker in video sequence and image processing equipment


Also Published As

Publication number Publication date
CN110381276A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN110381276B (en) Video shooting method and electronic equipment
CN110086985B (en) Recording method for delayed photography and electronic equipment
CN110381345B (en) Screen projection display method and electronic equipment
US20230396886A1 (en) Multi-channel video recording method and device
CN113727016A (en) Shooting method and electronic equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN113810600B (en) Terminal image processing method and device and terminal equipment
CN111050062B (en) Shooting method and electronic equipment
US11700452B2 (en) Photographing method and electronic device
CN111526407B (en) Screen content display method and device
CN115086567A (en) Time-delay shooting method and device
CN112532903B (en) Intelligent video recording method, electronic equipment and computer readable storage medium
CN113726949A (en) Video processing method and electronic equipment
CN113721826A (en) Information screen display method and electronic equipment
CN115412678B (en) Exposure processing method and device and electronic equipment
WO2023077939A1 (en) Camera switching method and apparatus, and electronic device and storage medium
CN113852755A (en) Photographing method, photographing apparatus, computer-readable storage medium, and program product
CN116095509B (en) Method, device, electronic equipment and storage medium for generating video frame
CN115297269B (en) Exposure parameter determination method and electronic equipment
CN115129215B (en) Screen-extinguishing display method and electronic equipment
RU2789447C1 (en) Method and apparatus for multichannel video recording
CN115696067B (en) Image processing method for terminal, terminal device and computer readable storage medium
CN117593236A (en) Image display method and device and terminal equipment
CN115209062A (en) Image processing method and device
CN117440194A (en) Method and related device for processing screen throwing picture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant