CN113810596B - Time-delay shooting method and device - Google Patents

Time-delay shooting method and device

Info

Publication number
CN113810596B
CN113810596B (application CN202110849860.2A)
Authority
CN
China
Prior art keywords
frame
stream data
terminal equipment
preview
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110849860.2A
Other languages
Chinese (zh)
Other versions
CN113810596A (en)
Inventor
郑耀国
吴天航
李俊科
杨坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202110849860.2A
Publication of CN113810596A
Application granted
Publication of CN113810596B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio

Abstract

An embodiment of the present application provides a time-lapse photography method and apparatus, relating to the technical field of terminals. The method includes: a terminal device displays a first interface, where the first interface includes a first control and is used for recording a time-lapse video; the terminal device receives a first operation on the first control; in response to the first operation, the terminal device acquires preview stream data and photographing stream data captured by a camera; the terminal device determines whether the preview stream data belongs to a high dynamic range (HDR) scene; when the terminal device determines that the preview stream data does not belong to an HDR scene, the terminal device extracts first image frames from the preview stream data based on the degree of inter-frame change and generates an image frame sequence including a plurality of first image frames. In this way, the terminal device can perform adaptive sampling based on the content difference between frames, and thus better capture the highlight moments during time-lapse shooting.

Description

Time-delay shooting method and device
Technical Field
The application relates to the technical field of terminals, in particular to a time-delay photographing method and device.
Background
With the widespread use of terminal devices such as mobile phones and the popularization of recording formats such as short videos, more and more users record what they see in the form of video. For example, a user can record content such as a landscape or an event using the time-lapse photography function of a terminal device. Time-lapse photography (also called time-lapse video recording) is a technique for compressing time: images recorded over minutes, hours or even days are combined into a video that reproduces the scene change within a short time.
In general, after the recording rate is determined, the terminal device samples at a set fixed frame rate during time-lapse shooting and then synthesizes the sampled data frames into a time-lapse video.
However, this time-lapse photography method loses a significant amount of picture detail, cannot capture the highlight moments in the shot, and cannot produce a time-lapse video of good image quality.
Disclosure of Invention
The embodiment of the application provides a time-lapse photography method and apparatus, which can perform adaptive sampling based on the content difference between frames and thus better capture the highlight moments during time-lapse shooting.
In a first aspect, an embodiment of the present application provides a time-lapse photography method. The method includes: a terminal device displays a first interface, where the first interface includes a first control and is used for recording a time-lapse video; the terminal device receives a first operation on the first control; in response to the first operation, the terminal device acquires preview stream data and photographing stream data captured by a camera; the terminal device determines whether the preview stream data belongs to a high dynamic range (HDR) scene; when the terminal device determines that the preview stream data does not belong to an HDR scene, the terminal device extracts first image frames from the preview stream data based on the degree of inter-frame change and generates an image frame sequence including a plurality of first image frames. In this way, the terminal device can perform adaptive sampling based on the content difference between frames, and thus better capture the highlight moments during time-lapse shooting.
The first control may be a control for starting time-lapse shooting; the first operation may be a tap operation, a long-press operation, or the like; the preview stream data may be used to generate the time-lapse video in a non-HDR scene; the photographing stream data may be used to generate the time-lapse video in an HDR scene; the HDR scene may be understood as a scene in which the proportion of HDR images among the multi-frame data obtained from the preview thumbnail exceeds a first threshold.
In one possible implementation, determining, by the terminal device, whether the preview stream data belongs to a high dynamic range HDR scene includes: the terminal device performs multi-fold down-sampling on the preview stream data to obtain a preview thumbnail; the terminal device acquires multi-frame preview data from the preview thumbnail; the terminal device determines, based on the proportion of first images in the multi-frame preview data, whether the preview stream data belongs to an HDR scene; a first image is an image in which the proportion of highlight pixels exceeds a first threshold, and a highlight pixel is a pixel whose gray value is greater than a gray threshold. In this way, the terminal device can determine from the preview thumbnail whether the current scene is a high dynamic scene, which allows the algorithm to be optimized and reduces the memory space occupied during computation.
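As an illustration of the highlight-pixel check described above, the following minimal Python sketch classifies a single gray frame; the gray threshold of 200, the highlight-ratio threshold of 0.1, and the function name are assumed values for the example and are not taken from the patent.

```python
import numpy as np

def is_first_image(gray_frame: np.ndarray,
                   gray_threshold: int = 200,
                   highlight_ratio_threshold: float = 0.1) -> bool:
    """A frame counts as a 'first image' when the proportion of highlight
    pixels (gray value greater than the gray threshold) exceeds the first
    threshold; both thresholds here are illustrative."""
    highlight_pixels = np.count_nonzero(gray_frame > gray_threshold)
    return highlight_pixels / gray_frame.size > highlight_ratio_threshold
```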
Wherein the first adjacent frame may be an adjacent frame in the preview thumbnail.
In one possible implementation, extracting, by the terminal device, first image frames from the preview stream data based on the degree of inter-frame change, and generating an image frame sequence including a plurality of first image frames, includes: the terminal device calculates the degree of inter-frame change of a first adjacent frame in the preview thumbnail, where the first adjacent frame includes a first frame and a second frame; the terminal device extracts first image frames from the preview stream data based on the degree of inter-frame change of the first adjacent frame, and generates an image frame sequence including a plurality of first image frames. In this way, the terminal device can perform adaptive sampling based on the degree of inter-frame change, and thus better capture the highlight moments during shooting.
In one possible implementation manner, the terminal device extracts a first image frame from preview stream data based on a degree of interframe variation of a first adjacent frame, and generates an image frame sequence including a plurality of first image frames, including: the terminal equipment extracts a first image frame from preview stream data based on the interframe change degree of a first adjacent frame to obtain the preview stream data after sampling processing; the terminal equipment performs registration processing on the sampled preview stream data to obtain the preview stream data after the registration processing; and the terminal equipment performs brightness adjustment on the preview stream data after the registration processing to generate an image frame sequence containing a plurality of first image frames. Therefore, the terminal equipment can obtain the time-delay shooting video with better picture effect based on the processing procedures of sampling processing, registration processing, brightness adjustment and the like.
In a possible implementation, extracting, by the terminal device, first image frames from the preview stream data based on the degree of inter-frame change of the first adjacent frame to obtain sampled preview stream data includes: when the degree of inter-frame change is greater than a preset threshold, the terminal device extracts the preview stream data corresponding to the second frame to obtain the sampled preview stream data; or, when the degree of inter-frame change is smaller than the preset threshold, the terminal device extracts first image frames from the preview stream data at a fixed sampling rate to obtain the sampled preview stream data. In this way, the terminal device can perform adaptive sampling based on the degree of inter-frame change, and thus better capture the highlight moments during shooting.
The multi-frame data in the preview thumbnail corresponds to multi-frame preview stream data. The preview stream data corresponding to the second frame can be understood as the preview stream data, among the multi-frame preview stream data, that corresponds to the second frame image of the first adjacent frame when the degree of inter-frame change of the first adjacent frame of the preview thumbnail is greater than the preset threshold.
In one possible implementation, the degree of interframe variation includes at least one of: a change value of a pixel point between first adjacent frames, a change value of a gray value between the first adjacent frames, or a change value of an average value of gray values between the first adjacent frames.
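A possible reading of the adaptive sampling described above is sketched below, assuming the degree of inter-frame change is measured as the mean absolute difference of gray values between the two frames of an adjacent pair; the change threshold and the fixed fallback interval are assumed values, not taken from the patent.

```python
import numpy as np

def interframe_change(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    # Mean absolute difference of gray values, one of the change metrics listed above.
    return float(np.mean(np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))))

def select_frame_indices(thumb_frames, change_threshold=8.0, fixed_interval=120):
    """Choose which preview-stream frames to keep: keep the second frame of any
    adjacent pair whose change degree exceeds the preset threshold, otherwise
    fall back to sampling at a fixed interval."""
    keep = [0]
    for i in range(1, len(thumb_frames)):
        if interframe_change(thumb_frames[i - 1], thumb_frames[i]) > change_threshold:
            keep.append(i)            # large change: extract the second frame
        elif i - keep[-1] >= fixed_interval:
            keep.append(i)            # small change: fixed sampling rate
    return keep
```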
In a possible implementation, performing, by the terminal device, registration processing on the sampled preview stream data to obtain registered preview stream data includes: the terminal device acquires a second adjacent frame in the sampled preview stream data, where the second adjacent frame includes a third frame and a fourth frame; the terminal device aligns the fourth frame to the third frame and crops the unaligned region of the fourth frame to obtain a cropped fourth frame; and the terminal device performs interpolation processing on the cropped fourth frame based on the third frame to obtain the registered preview stream data. In this way, even if the terminal device shakes or the picture would otherwise jump too much during playback, the registration processing allows the terminal device to keep the time-lapse picture smooth.
The third frame may be understood as a reference frame, the fourth frame may be understood as a registration frame, and the interpolation process may be understood as restoring the clipped fourth frame to the picture size corresponding to the third frame by using an interpolation algorithm.
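The registration step can be sketched as follows, under the simplifying assumption that the misalignment between the reference frame and the registration frame is a pure translation estimated by phase correlation; OpenCV, the crop logic, and the linear interpolation are illustrative choices rather than the patented algorithm.

```python
import cv2
import numpy as np

def register_to_reference(reference: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Align 'frame' to 'reference', cut out the unaligned border, then
    interpolate the cropped frame back to the reference picture size."""
    ref_gray = np.float32(cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY))
    cur_gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    (dx, dy), _ = cv2.phaseCorrelate(ref_gray, cur_gray)    # estimated translation
    h, w = frame.shape[:2]
    shift = np.float32([[1, 0, -dx], [0, 1, -dy]])
    aligned = cv2.warpAffine(frame, shift, (w, h))          # align to the reference frame
    mx, my = int(np.ceil(abs(dx))), int(np.ceil(abs(dy)))
    cropped = aligned[my:h - my if my else h, mx:w - mx if mx else w]  # crop unaligned area
    return cv2.resize(cropped, (w, h), interpolation=cv2.INTER_LINEAR)  # interpolate back
```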
In one possible implementation, the method further includes: when the terminal device determines that the preview stream data belongs to an HDR scene, the terminal device processes the photographing stream data to generate an image frame sequence including a plurality of second image frames. In this way, HDR is introduced into time-lapse photography, and by processing HDR images a more realistic picture can be presented in the time-lapse video, giving a better time-lapse effect.
In one possible implementation, processing, by the terminal device, the photographing stream data to generate an image frame sequence including a plurality of second image frames includes: the terminal device performs image fusion processing on the photographing stream data to obtain fused photographing stream data; the terminal device extracts second image frames from the fused photographing stream data to obtain sampled photographing stream data; and the terminal device performs registration processing on the sampled photographing stream data to obtain an image frame sequence including a plurality of second image frames. In this way, sampling in the high dynamic scene uses fixed-interval sampling, which simplifies the algorithm in the high dynamic scene.
In a possible implementation, performing, by the terminal device, image fusion processing on the photographing stream data to obtain fused photographing stream data includes: the terminal device processes the photographing stream data into data matching the picture size of the preview stream data to obtain size-processed photographing stream data; when the terminal device determines that the picture brightness of the size-processed photographing stream data exceeds a first brightness threshold, the terminal device generates a first data frame and a second data frame; and the terminal device fuses the first data frame and the second data frame to obtain the fused photographing stream data. In this way, the terminal device can issue two different frames according to the picture brightness and fuse the two frames into an image frame with a better picture effect, so that a more realistic picture is presented in the time-lapse video and a better time-lapse effect is obtained.
In one possible implementation, the method further includes: when the terminal device determines that the picture brightness of the size-processed photographing stream data does not exceed a second brightness threshold, the terminal device generates a first data frame and a third data frame, where the first brightness threshold is greater than the second brightness threshold; and the terminal device fuses the first data frame and the third data frame to obtain the fused photographing stream data. In this way, the terminal device can issue two different frames according to the picture brightness and fuse the two frames into an image frame with a better picture effect, so that a more realistic picture is presented in the time-lapse video and a better time-lapse effect is obtained.
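The brightness-dependent fusion can be illustrated with the following sketch. The patent only states that two data frames are issued and fused depending on whether the picture brightness exceeds the first brightness threshold or falls below the second; the concrete threshold values, the hypothetical capture_extra_frame callable, and the simple weighted blend are assumptions made for this example.

```python
import numpy as np

FIRST_BRIGHTNESS_THRESHOLD = 180   # assumed values; the first threshold is
SECOND_BRIGHTNESS_THRESHOLD = 60   # greater than the second, as stated above

def fuse(frame_a: np.ndarray, frame_b: np.ndarray, weight: float = 0.5) -> np.ndarray:
    # A plain weighted average stands in for the real fusion algorithm.
    blended = weight * frame_a.astype(np.float32) + (1.0 - weight) * frame_b.astype(np.float32)
    return blended.astype(np.uint8)

def process_photo_frame(first_frame: np.ndarray, capture_extra_frame) -> np.ndarray:
    """capture_extra_frame is a hypothetical callable returning an extra data
    frame at the requested relative exposure."""
    brightness = float(np.mean(first_frame))
    if brightness > FIRST_BRIGHTNESS_THRESHOLD:
        return fuse(first_frame, capture_extra_frame(exposure_ratio=0.5))  # second data frame
    if brightness < SECOND_BRIGHTNESS_THRESHOLD:
        return fuse(first_frame, capture_extra_frame(exposure_ratio=2.0))  # third data frame
    return first_frame
```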
In a possible implementation, performing, by the terminal device, registration processing on the sampled photographing stream data to obtain an image frame sequence including a plurality of second image frames includes: the terminal device acquires a third adjacent frame in the sampled photographing stream data, where the third adjacent frame includes a fifth frame and a sixth frame; the terminal device aligns the sixth frame to the fifth frame and crops the unaligned region of the sixth frame to obtain a cropped sixth frame; and the terminal device performs interpolation processing on the cropped sixth frame based on the fifth frame to obtain an image frame sequence including a plurality of second image frames. In this way, even if the terminal device shakes or the picture would otherwise jump too much during playback, the registration processing allows the terminal device to keep the time-lapse picture smooth.
In the embodiment of the present application, the fifth frame may be understood as a reference frame, the sixth frame may be understood as a registration frame, and the interpolation process may be understood as restoring the cropped sixth frame to the picture size corresponding to the fifth frame by using an interpolation algorithm.
In one possible implementation manner, the method further includes: the terminal equipment displays a second interface; the second interface comprises a second control; the terminal equipment receives a second operation aiming at the second control; in response to the second operation, the terminal device encodes an image frame sequence containing a plurality of first image frames, and/or an image frame sequence containing a plurality of second image frames, into the time-lapse photography video.
The second control can be a control for stopping the delayed photography; the second operation may be a click operation, a long-press operation, or the like.
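A minimal sketch of the final encoding step, using OpenCV's VideoWriter to turn the image frame sequence into a video; the output file name, the mp4v codec, and the 30 fps playback rate are assumed values, not taken from the patent.

```python
import cv2

def encode_timelapse(frames, out_path="timelapse.mp4", fps=30):
    """Encode the image frame sequence (first and/or second image frames)
    into a time-lapse video played back at a fixed frame rate."""
    if not frames:
        raise ValueError("empty frame sequence")
    h, w = frames[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for frame in frames:
        writer.write(frame)
    writer.release()
```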
In one possible implementation, the method further includes: receiving a third operation in which the user opens the time-lapse video file; the terminal device displays a third interface, where the third interface includes the time-lapse file and a first identifier corresponding to the time-lapse file, and the first identifier is used to indicate the video type of the time-lapse video file. In this way, the terminal device can provide an identifier indicating whether a video was processed for a high dynamic scene, so that the user can clearly tell which of several videos was obtained through high-dynamic-scene processing.
The third operation may be a tap operation, a long-press operation, or the like; the third interface may be an interface as shown in the figure, and the first identifier may be an HDR identifier indicating a high dynamic scene.
In a second aspect, an embodiment of the present application provides a time-lapse photographing apparatus, including: the display unit is used for displaying a first interface; the first interface comprises a first control, and the first interface is used for recording the delayed photography video; the processing unit is used for receiving a first operation aiming at the first control; responding to the first operation, and the processing unit is also used for acquiring preview stream data and photographing stream data acquired by the camera; the processing unit is further used for judging whether the preview stream data belongs to a high dynamic range HDR scene; when the terminal device determines that the preview stream data does not belong to the HDR scene, the processing unit is further configured to extract a first image frame from the preview stream data based on the degree of interframe change, and generate an image frame sequence including a plurality of first image frames.
In a possible implementation manner, the processing unit is specifically configured to perform multiple down-sampling on the preview stream data to obtain a preview thumbnail; the processing unit is further specifically used for acquiring multi-frame preview data in the preview thumbnail; the processing unit is further specifically configured to determine whether the preview stream data belongs to an HDR scene based on a proportion of the first image in the multi-frame preview data; the first image is an image with the proportion of highlight pixels exceeding a first threshold, and the highlight pixels are pixel points with gray values larger than a gray threshold.
In a possible implementation manner, the processing unit is specifically configured to calculate a degree of interframe variation of a first adjacent frame in the preview thumbnail; the first adjacent frame includes a first frame and a second frame; the processing unit is further specifically configured to extract a first image frame from the preview stream data based on the degree of interframe variation of the first adjacent frame, and generate an image frame sequence including a plurality of first image frames.
In a possible implementation manner, the processing unit is specifically configured to extract a first image frame from the preview stream data based on an interframe change degree of a first adjacent frame, so as to obtain preview stream data after sampling processing; the processing unit is further specifically used for performing registration processing on the preview stream data after sampling processing to obtain the preview stream data after registration processing; the processing unit is further specifically configured to perform brightness adjustment on the preview stream data after the registration processing, and generate an image frame sequence including a plurality of first image frames.
In a possible implementation manner, when the inter-frame variation degree is greater than a preset threshold, the processing unit is specifically configured to extract preview stream data corresponding to a second frame to obtain the preview stream data after sampling processing; or, when the inter-frame variation degree is smaller than the preset threshold, the processing unit is further specifically configured to extract the first image frame from the preview stream data according to the fixed sampling rate, so as to obtain the preview stream data after the sampling processing.
In one possible implementation, the degree of interframe variation includes at least one of: a change value of a pixel point between first adjacent frames, a change value of a gray value between the first adjacent frames, or a change value of an average value of gray values between the first adjacent frames.
In a possible implementation manner, the processing unit is specifically configured to acquire a second adjacent frame in the sampled preview stream data; the second adjacent frame comprises a third frame and a fourth frame; the processing unit is further specifically used for aligning the fourth frame to the third frame, cutting out the unaligned area in the fourth frame, and obtaining the cut fourth frame; and the processing unit is further specifically configured to perform interpolation processing on the clipped fourth frame based on the third frame to obtain preview stream data after registration processing.
In a possible implementation manner, when the terminal device determines that the preview stream data belongs to the HDR scene, the processing unit is further configured to process the shot stream data to generate an image frame sequence including a plurality of second image frames.
In a possible implementation manner, the processing unit is specifically configured to perform image fusion processing on the photographing stream data to obtain fused photographing stream data; the processing unit is further specifically configured to extract a second image frame from the fused photographing stream data to obtain sampled photographing stream data; and the processing unit is further specifically configured to perform registration processing on the sampled photographing stream data to obtain an image frame sequence including a plurality of second image frames.
In a possible implementation manner, the processing unit is specifically configured to process the photo stream data into data corresponding to a size of a picture of preview stream data, so as to obtain the photo stream data after the picture size processing; when the terminal device determines that the picture brightness of the shot stream data after the picture size processing exceeds a first brightness threshold, the processing unit is specifically configured to generate a first data frame and a second data frame; and the processing unit is specifically used for fusing the first data frame and the second data frame to obtain the fused photographing stream data.
In a possible implementation manner, when the terminal device determines that the picture brightness of the shot stream data after the picture size processing does not exceed the second brightness threshold, the processing unit is further configured to generate a first data frame and a third data frame; the first brightness threshold is greater than the second brightness threshold; and the processing unit is also used for fusing the first data frame and the third data frame to obtain the photo stream data after fusion processing.
In a possible implementation manner, the processing unit is specifically configured to acquire a third adjacent frame in the sampled and processed photo stream data; the third adjacent frame comprises a fifth frame and a sixth frame; the processing unit is further specifically configured to align the sixth frame to the fifth frame, and cut out an unaligned region in the sixth frame to obtain a cut sixth frame; and the processing unit is further specifically configured to perform interpolation processing on the clipped sixth frame based on the fifth frame to obtain an image frame sequence including a plurality of second image frames.
In a possible implementation manner, the display unit is further configured to display a second interface; the second interface comprises a second control; the processing unit is further used for receiving a second operation aiming at a second control; in response to the second operation, the processing unit is further configured to encode an image frame sequence comprising a plurality of first image frames, and/or an image frame sequence comprising a plurality of second image frames, into the time-lapse filming video.
In a possible implementation manner, the processing unit is further configured to receive a third operation of opening the time-lapse video file by the user; the display unit is also used for displaying a third interface by the terminal equipment; the third interface comprises a delayed shooting file and a first identifier corresponding to the delayed shooting file; the first flag is used to indicate a video type of the delayed shooting video file.
In a third aspect, an embodiment of the present application provides a time-lapse shooting device, including a processor and a memory, where the memory is used for storing code instructions; the processor is configured to execute the code instructions to cause the electronic device to perform the time-lapse photography method as described in the first aspect or any implementation manner of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing instructions that, when executed, cause a computer to perform the time-lapse shooting method as described in the first aspect or any one of the implementation manners of the first aspect.
In a fifth aspect, a computer program product comprises a computer program which, when executed, causes a computer to perform a method of delayed photography as described in the first aspect or any implementation form of the first aspect.
It should be understood that the third aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects obtained by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1 is a schematic view of a scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 3 is a schematic view of a shooting principle provided in an embodiment of the present application;
fig. 4 is a schematic flowchart of a time-lapse shooting method according to an embodiment of the present disclosure;
FIG. 5 is a schematic view of an interface provided by an embodiment of the present application;
fig. 6 is a schematic diagram of an adaptive sampling method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a fixed-interval sampling according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another adaptive sampling scheme provided in an embodiment of the present application;
fig. 9 is a schematic diagram illustrating a principle of registration and smoothing of adjacent frame images according to an embodiment of the present application;
FIG. 10 is a schematic view of another interface provided by an embodiment of the present application;
fig. 11 is a schematic flowchart of another time-lapse shooting method according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a time-lapse shooting device according to an embodiment of the present application;
fig. 13 is a schematic hardware structure diagram of a control device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
To describe the technical solutions of the embodiments of the present application clearly, words such as "first" and "second" are used in the embodiments to distinguish between identical or similar items that have substantially the same functions and effects. For example, a first value and a second value are only used to distinguish different values, and no order of the values is implied. Those skilled in the art will appreciate that the terms "first", "second", and the like do not limit quantity or execution order, nor do they denote any difference in importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
With the widespread use of terminal devices such as mobile phones and the popularization of recording methods such as short videos, more and more users begin to record what they see by means of videos. For example, the user can record contents such as a landscape or an event using a time-lapse photographing function in the terminal device.
Exemplarily, fig. 1 is a schematic view of a scenario provided in an embodiment of the present application. As shown in fig. 1, the scene may include a terminal device 101 having a time-lapse photography function, for example a mobile phone, and the picture 102 shot by a user with the terminal device 101 may include a night-blooming cereus (Epiphyllum oxypetalum).
In general, when the terminal device receives an operation in which the user triggers the control for starting time-lapse photography, the terminal device may acquire the preview stream data of the picture 102 captured by the camera and perform fixed-interval sampling (also called fixed-frame-rate sampling) on the preview stream data. For example, if the terminal device stores one frame out of every 120 frames of images, the sampling rate is 1/120 (which can also be understood as a frame-extraction interval of 120x), and at a frame rate of 30 frames per second (fps) the sampling interval is 4 seconds. The terminal device can then store the sampled data frames and play them in sequence to achieve fast playback. The sampling rate may be set by the user in the terminal device; for example, the terminal device may offer different sampling rates, and the user can choose a sampling rate according to the scene being recorded with time-lapse photography. For example, when recording a sunrise or sunset scene, the sampling rate may be set to 1/15 or the like; when recording a night scene, the sampling rate may be set to 1/120, 1/600, or the like; when recording a scene of alternating day and night, the sampling rate may be set to 1/1000 or the like. In general, the terminal device can support frame-extraction intervals between 15x and 1800x, that is, shooting rates between 15x and 1800x.
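The arithmetic in the preceding paragraph (frame-extraction interval, frame rate, and resulting durations) can be captured in a small helper; this is only a sketch of the fixed-interval case with the 30 fps figure from the example above.

```python
def fixed_interval_plan(record_seconds: float, capture_fps: float = 30.0,
                        frame_interval: int = 120) -> tuple:
    """For a frame-extraction interval of 'frame_interval' (e.g. 120x, i.e. a
    sampling rate of 1/120), return the number of stored frames and the
    playback duration when the video is played at capture_fps."""
    captured_frames = int(record_seconds * capture_fps)
    stored_frames = captured_frames // frame_interval
    return stored_frames, stored_frames / capture_fps

# Example: 8 minutes recorded at 30 fps with a 120x interval gives
# 14400 // 120 = 120 stored frames, i.e. a 4-second time-lapse video.
```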
However, in time-lapse photography based on fixed-interval sampling, the sampling cannot adapt to the different picture contents being shot, so scene changes are not well reflected in the time-lapse video; the resulting picture lacks depth and a sense of motion, which affects the final shooting effect. Moreover, fixed-interval sampling may leave a large content difference between two consecutively stored frames, so the playback feels jumpy. For example, when the picture contains large motion, such as branches swaying in the wind, or when the user shoots hand-held without a tripod, the captured picture may shake considerably, and such large motion or shake further aggravates the jumpiness of the picture during playback.
In addition, this time-lapse video is obtained by the terminal device from processing of the preview stream data, and the preview stream data generally provides only weak adjustment of the picture's dynamic range. The captured picture therefore often contains regions that are too dark or overexposed, the dynamic range of the picture is low, and the picture the user sees while shooting cannot be well reproduced, which affects the final video effect.
In view of this, an embodiment of the present application provides a time-lapse photography method in which a terminal device can perform adaptive sampling based on the difference between frames of the obtained preview stream data, so as to better capture the highlight moments during shooting. In addition, a high dynamic range (HDR) processing algorithm is introduced into time-lapse photography; through HDR processing, the terminal device can present a more realistic picture in the time-lapse video and thus obtain a better time-lapse effect.
It is understood that the terminal device may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), etc. The terminal device may be a mobile phone (mobile phone) having a delayed shooting function, etc., a smart tv, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical supply), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), etc. The specific technology and the specific device form adopted by the terminal device are not limited in the embodiment of the application.
Therefore, in order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application. Exemplarily, fig. 2 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, an indicator 192, a camera 193, a display screen 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the terminal device. In other embodiments of the present application, a terminal device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein, the different processing units may be independent devices or may be integrated in one or more processors. A memory may also be provided in processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device, and may also be used to transmit data between the terminal device and the peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. The power management module 141 is used for connecting the charging management module 140 and the processor 110.
The wireless communication function of the terminal device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in terminal devices may be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication applied to a terminal device, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), and the like.
The terminal device realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device may implement a photographing function through an Image Signal Processor (ISP), a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting the electric signal into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1. The camera 193 may be a front camera or a rear camera. In this embodiment, the terminal device may obtain preview stream data based on the camera 193, and obtain the time-lapse photography based on adaptive sampling and other processing on the preview stream data.
For example, fig. 3 is a schematic diagram of a shooting principle provided by an embodiment of the present application. As shown in fig. 3, the camera 193 may include a lens (lens) and a photosensitive element (sensor), which may be any photosensitive device such as a charge-coupled device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS).
As shown in fig. 3, during shooting, the reflected light of the object to be shot can generate an optical image through the lens, the optical image is projected onto the photosensitive element, the photosensitive element converts the received optical signal corresponding to the optical image into an electrical signal, and the camera 193 can send the obtained electrical signal to a Digital Signal Processing (DSP) module for digital signal processing, so as to finally obtain a frame of digital image.
Similarly, in the process of recording the video, the DSP can obtain continuous multi-frame digital images according to the shooting principle, and the continuous multi-frame digital images can form a section of video after being coded according to a certain frame rate. Due to the special physiological structure of the human eye, when the frame rate of the viewed pictures is higher than 16fps, the human eye considers the viewed pictures to be coherent, and this phenomenon is called visual retention. In order to ensure the consistency of video watching by the user, the terminal device can encode the multi-frame digital image output by the DSP according to a certain frame rate (for example, 24fps or 30 fps). For example, if the DSP acquires 300 frames of digital images through the camera 193, the terminal device may encode the 300 frames of digital images into a 10-second (300 frames/30fps = 10) video at a preset frame rate of 30 fps.
One or more frames of digital images output by the DSP may be output on the terminal device through the display screen 194, or the digital images may be stored in the internal memory 121 (or the external memory 120), which is not limited in this embodiment.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the terminal device selects the frequency point, the digital signal processor is used for performing fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device may support one or more video codecs. Thus, the terminal device can play or record videos in various encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize the intelligent cognition and other applications of the terminal equipment, such as: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area.
The terminal device can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The terminal device can listen to music through the speaker 170A, or listen to a handsfree call. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the terminal device answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear. The earphone interface 170D is used to connect a wired earphone. The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine the motion attitude of the terminal device. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device in various directions (generally, three axes). A distance sensor 180F for measuring a distance. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense the ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is used to detect temperature. The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The terminal device may receive a key input, and generate a key signal input related to user setting and function control of the terminal device. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, which will not be described herein again.
The words described in the embodiments of the present application are explained below. It is to be understood that the description is for the purpose of illustrating the embodiments of the present application more clearly and is not necessarily to be construed as limiting the embodiments of the present application.
The time-lapse photography described in the embodiments of the present application can be understood as follows: time-lapse photography, or time-lapse video recording, is a shooting technique that compresses time and can reproduce, within a short time, a process in which the subject changes slowly. In the embodiments of the present application, when the terminal device receives the user's operation of starting the time-lapse photography function, the terminal device may start to collect each frame of the shooting picture captured by the camera, and may extract M (M < N) frames from the N (N > 1) frames captured by the camera at a certain frame-extraction frequency to serve as the time-lapse video. Subsequently, when the terminal device receives the user's operation of opening the time-lapse video, the terminal device may play the extracted M frames at a certain frame rate, so that the scene changes in the N actually captured frames are reproduced through the M frames.
The preview stream data described in the embodiment of the present application may be understood as: and previewing data acquired in real time based on a camera of the terminal equipment. In the embodiment of the application, when the terminal device receives an operation of opening an application program related to photographing or video recording and the like by a user, the terminal device may collect each frame of picture captured by the camera and display the frame of picture in the interface of the application program in real time, and the preview data presented in the interface may be preview stream data.
The photographing stream data described in the embodiments of the present application may be understood as data acquired through the photographing control in the terminal device and obtained after related processing of that data. In the embodiments of the present application, when the terminal device receives an operation in which the user triggers the control corresponding to time-lapse photography, the terminal device can, in a high dynamic scene, acquire photographing stream data obtained through multi-frame processing and convert the photographing stream data into preview stream data.
The following describes the technical solution of the present application and how to solve the above technical problems in detail by specific embodiments. The following embodiments may be implemented independently or in combination, and details of the same or similar concepts or processes may not be repeated in some embodiments.
For example, a user can hold the terminal device by hand and record the blooming process of a night-blooming cereus using the time-lapse photography function of the terminal device. During shooting, even if the terminal device shakes because the user cannot hold it steadily, or other conditions affect the shot, such as another object suddenly flashing across the picture, the terminal device can still output a time-lapse video that completely records the blooming of the cereus based on the time-lapse photography method provided by the embodiments of the present application.
Fig. 4 is a schematic flowchart of a time-lapse shooting method according to an embodiment of the present disclosure. As shown in fig. 4, the time-lapse photographing method may include the steps of:
S401, the terminal device starts time-lapse shooting.
Fig. 5 is a schematic interface diagram provided in an embodiment of the present application. In the embodiment corresponding to fig. 5, a terminal device is taken as an example for illustration, and the example does not limit the embodiment of the present application.
When the mobile phone receives an operation corresponding to the user opening the time-lapse photography function, the mobile phone may display the interface shown in a in fig. 5. The interface includes a control 501 for starting time-lapse photography, a control 502 for adjusting the magnification speed of the picture, a control 503 for flipping the camera, a control for opening the gallery, and the like; a picture acquired in real time by the camera, for example a shooting picture 504, may also be displayed in the interface, and the shooting picture 504 may include the night-blooming cereus.
When the mobile phone receives an operation in which the user triggers the control 501 for starting time-lapse photography in the interface shown in a in fig. 5, the mobile phone may display the interface shown in b in fig. 5. The interface includes shooting duration information 505, a control 506 for stopping time-lapse photography, and the like, and a picture acquired in real time by the camera, for example a shooting picture 507, may be displayed in the interface; the shooting picture 507 may include the night-blooming cereus. The shooting duration information 505 may display: 00/00. This can be understood as follows: when the mobile phone shoots a 15-second video picture, the duration of the corresponding time-lapse video may be 1 second, where 15 seconds is the duration of the actually recorded video and 1 second is the duration of the time-lapse video generated after frame extraction. The mobile phone can then collect each frame of the shooting picture at the frame rate. For example, at a frame rate of 30 fps the mobile phone collects 30 frames of shooting pictures per second, and the number of collected frames accumulates as the recording time elapses; the mobile phone can then extract multiple frames from the collected pictures according to the sampling rate to form the time-lapse video. Illustratively, as shown in b in fig. 5, the sampling rate here is 1/15: the mobile phone collects 30 frames per second, keeps 2 frames per second after frame extraction, and after 15 seconds of actual shooting generates a time-lapse video containing 30 frames whose duration is 1 second.
It can be understood that, when the terminal device samples at a fixed sampling rate, the actual recording duration and the duration of the time-lapse video generated after frame extraction may be displayed as shown in b in fig. 5; when the terminal device uses adaptive sampling, the number of sampled frames may increase, so the displayed recording duration and video duration may differ from those obtained with fixed-rate sampling.
S402, the terminal device determines whether the current scene is an HDR scene.
In this embodiment of the application, when the terminal device determines that the current scene is not an HDR scene, the terminal device may execute the step shown in S403; or, when the terminal device determines that the current scene is an HDR scene, the terminal device may perform the step shown in S407.
For example, one possible implementation in which the terminal device determines whether the current scene is an HDR scene may be as follows: the terminal device performs 4× downsampling on the preview stream data to obtain a preview thumbnail, and determines whether the ratio of the number of highlight pixels in the preview thumbnail to the number of all pixels in the preview thumbnail is greater than a preset pixel threshold. The preview thumbnail can be obtained by keeping one row of pixel points out of every two rows in the picture corresponding to the preview stream data and keeping the pixel points of the retained rows at intervals; the highlight pixels can be determined based on a gray threshold of the pixel points, and the pixel threshold can be used to determine whether the current scene is a high-dynamic scene. The terminal device may make this determination based on one frame of data or on multiple frames of data.
Taking the determination based on multi-frame data as an example, the terminal device may take multiple frames of the preview thumbnail. When the ratio of the number of highlight pixels in the multi-frame data to the number of all pixels in the multi-frame data is greater than (or equal to) the preset pixel-number threshold, or when the proportion of frames whose number of highlight pixels exceeds the pixel threshold among all the frames is greater than (or equal to) a preset ratio threshold, the terminal device may determine that the scene corresponding to the current multi-frame data is a high-dynamic scene. Conversely, when the terminal device determines that the ratio of the number of highlight pixels in the multi-frame data to the number of all pixels is less than or equal to (or less than) the pixel threshold, or that the proportion of frames whose number of highlight pixels exceeds the pixel threshold is less than (or less than or equal to) the preset ratio threshold, the terminal device may determine that the scene corresponding to the current preview thumbnail is a common scene.
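A minimal sketch of this highlight-ratio test is given below, assuming grayscale preview frames as NumPy arrays. The 2× row/column decimation, the gray threshold of 200, and the two ratio thresholds are assumed values chosen for illustration only; the embodiment does not fix these numbers.

    import numpy as np

    GRAY_THRESHOLD = 200          # assumed gray threshold for a "highlight" pixel
    PIXEL_RATIO_THRESHOLD = 0.10  # assumed ratio of highlight pixels marking an HDR frame
    FRAME_RATIO_THRESHOLD = 0.50  # assumed ratio of HDR frames marking an HDR scene

    def downsample_preview(gray_frame):
        # Keep one row out of every two and one pixel out of every two in the kept rows,
        # giving roughly a 4x reduction of the preview frame.
        return gray_frame[::2, ::2]

    def is_hdr_frame(gray_frame):
        small = downsample_preview(gray_frame)
        highlight_ratio = np.mean(small > GRAY_THRESHOLD)
        return highlight_ratio > PIXEL_RATIO_THRESHOLD

    def is_hdr_scene(gray_frames):
        # Multi-frame decision: treat the scene as HDR when enough frames are HDR frames.
        hdr_flags = [is_hdr_frame(f) for f in gray_frames]
        return np.mean(hdr_flags) > FRAME_RATIO_THRESHOLD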
For example, when the terminal device photographs a scene with bright light, such as an outdoor scene at sunrise, sunset, or in the afternoon, the scene may be understood as an HDR scene. In general, the terminal device can obtain preview stream data and photo stream data corresponding to the scene. When taking a photo, the terminal device may generate two frames of images and perform operations such as two-frame image fusion, so that the HDR image obtained through this processing has a better picture effect. When performing time-lapse shooting, however, the stored data frames are obtained by interval sampling of the preview stream data; preview stream data that is over-bright or over-dark is stored directly without processing, so it is difficult for it to contain much image detail. In the embodiment of the application, whether the current scene is a high-dynamic scene is therefore judged based on the preview stream data; if the judgment result is a high-dynamic scene, the terminal device can process the photographing stream by using an HDR algorithm, so as to obtain a time-lapse video file that better reflects the visual effect of the real environment.
And S403, the terminal equipment acquires the difference value between frames of the preview stream data.
In this embodiment of the present application, the difference value between adjacent frames of the preview thumbnail (which may also be referred to as the degree of change between frames, or the degree of change of the picture content) may be understood as: the change value of the pixel points between adjacent frames of the preview thumbnail, the change value of the gray values between adjacent frames, the difference between the mean gray values of adjacent frames, or the like. Taking the terminal device shooting the blooming process of the epiphyllum as an example, when the difference value between adjacent frames of the preview thumbnail is large, the epiphyllum petals can be understood to be in the process of opening; when the difference value between adjacent frames of the preview thumbnail is small, the epiphyllum can be understood as not having entered the opening process, and its degree of change within a certain time is small.
It is understood that the method for determining the difference value between frames of the preview stream data may include other contents according to the actual scene, which is not limited in this embodiment of the application.
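The inter-frame difference measures listed above (pixel-wise change, gray-value change, or change of the mean gray value) could be computed on the preview thumbnails roughly as follows. This is only a sketch; the particular modes and the change threshold of 10 are assumptions, not values taken from the embodiment.

    import numpy as np

    def inter_frame_difference(prev_gray, curr_gray, mode="mean_abs"):
        # prev_gray / curr_gray: preview thumbnails of equal size, uint8 grayscale.
        diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
        if mode == "mean_abs":
            return float(diff.mean())                        # average per-pixel gray change
        if mode == "changed_ratio":
            return float(np.mean(diff > 10))                 # share of noticeably changed pixels (threshold assumed)
        if mode == "mean_gray_delta":
            return abs(float(curr_gray.mean()) - float(prev_gray.mean()))  # difference of mean gray values
        raise ValueError(mode)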
S404, the terminal equipment carries out self-adaptive sampling on the preview stream data.
In the embodiment of the present application, adaptive sampling may be understood as sampling reasonably according to the degree of change between adjacent frames of the preview thumbnail; for example, when the degree of change between adjacent frames is greater than a preset threshold, the preview stream data corresponding to the second frame of the adjacent frames may be taken.
Illustratively, take the change value of pixel points between frames as the parameter measuring the degree of change between frames. When the change value of the inter-frame pixel points in the preview thumbnail is greater than the preset threshold, the terminal device samples the preview stream data corresponding to the preview thumbnail, and the sampling time interval is adaptively reduced. When the change value of the inter-frame pixel points in the preview thumbnail is smaller than the preset threshold, the terminal device does not sample the preview stream data corresponding to the preview thumbnail, and the sampling time interval is adaptively increased so as to reduce the input of repeated pictures. When the change value of the inter-frame pixel points in the preview thumbnail remains continuously smaller than the preset threshold, in order to ensure that enough data is acquired over the recording duration, the terminal device may perform forced sampling on the preview stream corresponding to fixed frames of the preview thumbnail, so as to obtain an adaptively sampled sampling sequence. For example, while the change value of the inter-frame pixel points remains continuously below the preset threshold, sampling may be performed at a fixed rate such as 1/50 or 1/60.
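Putting these rules together, an adaptive sampler over the thumbnail sequence might be sketched as follows. The change threshold and the forced-sampling interval are assumed values, and always keeping the first frame is a simplification of this sketch rather than a requirement stated by the embodiment.

    import numpy as np

    def adaptive_sample(thumbnails, change_threshold=8.0, forced_interval=60):
        # thumbnails: list of grayscale preview thumbnails (uint8 arrays of equal size).
        # change_threshold and forced_interval are assumed values for illustration.
        kept = [0]                          # keep the first frame as the starting point
        quiet = 0                           # number of consecutive low-change frames
        for i in range(1, len(thumbnails)):
            prev = thumbnails[i - 1].astype(np.int16)
            curr = thumbnails[i].astype(np.int16)
            change = float(np.abs(curr - prev).mean())
            if change > change_threshold:
                kept.append(i)              # dense sampling: keep the second frame of the changing pair
                quiet = 0
            else:
                quiet += 1                  # widen the sampling interval while the picture is static
                if quiet >= forced_interval:
                    kept.append(i)          # forced sample so quiet periods are still represented
                    quiet = 0
        return kept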
For example, fig. 6 is a schematic diagram illustrating the principle of adaptive sampling according to an embodiment of the present application. In the embodiment corresponding to fig. 6, the terminal device may capture a video of the blooming process of the night-blooming cereus based on the delayed shooting function. For example, around 20., the epiphyllum has not yet changed noticeably, the degree of change between adjacent frames is small, and the terminal device may perform large-interval data sampling based on a fixed sampling rate; for example, a desirable range of sampling rates may be between 1/600 and 1/300.
At about 23., the degree of change of adjacent frames exceeds the preset threshold a plurality of times, so sampling is performed a plurality of times. It can be understood that the average sampling rate in this time period may therefore be greater than the above-mentioned sampling rate of 1/600-1/300 used around 20.
Around 02., the epiphyllum is close to full bloom. Because the blooming period in the epiphyllum blooming process begins at about 01., the degree of change of adjacent frames remains large during this period, and the terminal device continues to perform dense data sampling.
At around 04., and from about 05:00 to about 05:19, the terminal device can capture the night-blooming cereus with its petals completely retracted into a bud-like shape, as shown by f in fig. 6. At this stage, the degree of change of adjacent frames is small, for example, it may remain continuously below the preset threshold, so the terminal device can perform large-interval data sampling based on a fixed sampling rate. For example, a desirable range of sampling rates may be between 1/600 and 1/300.
Illustratively, fixed-interval sampling (the embodiment corresponding to fig. 7) is compared below with adaptive sampling (the embodiment corresponding to fig. 8).
Fig. 7 is a schematic diagram of the fixed-interval sampling principle provided in an embodiment of the present application. As shown in fig. 7, fig. 7 may include a coordinate axis and shot pictures at different sampling points on the coordinate axis. The horizontal axis may be a time axis, and the vertical axis may represent the degree of inter-frame change. It should be noted that the heights of the bars on the vertical axis indicate the degree of inter-frame change, and the number of bars is only an example and does not indicate the number of inter-frame change values obtained from the actual number of frames. When sampling is performed based on the degree of inter-frame change, the preview stream data corresponding to the second frame of the two frames corresponding to that degree of change can be acquired.
As shown in fig. 7, taking the process of photographing the blooming cereus between about 01. and 02. as an example, with fixed-interval sampling, 1 frame of picture is acquired every 20 seconds during this period.
The terminal device may acquire the picture shown as a in fig. 7 at about 01.; the picture shown as b in fig. 7 at 01.; the picture shown as c in fig. 7 at 01.; the picture shown as d in fig. 7 at 02.; and the picture shown as e in fig. 7 at 02., and so on.
It can be understood that, as shown in fig. 7, the blooming process of the cereus can be recorded based on fixed-interval sampling, but the most wonderful pictures at time points such as 01. may be missed; moreover, over the whole shooting process of about 01. to 02., many of the collected pictures change only slightly.
Fig. 8 is a schematic diagram of another adaptive sampling principle provided in an embodiment of the present application. As shown in fig. 8, fig. 8 may include a coordinate axis, a dotted line 801 indicating the preset threshold corresponding to the degree of inter-frame change, and shot pictures at different sampling points on the coordinate axis. The horizontal axis may be a time axis, and the vertical axis may represent the degree of inter-frame change. It should be noted that the heights of the bars on the vertical axis indicate the degree of inter-frame change, and the number of bars is only an example and does not indicate the number of inter-frame change values obtained from the actual number of frames. When sampling is performed based on the degree of inter-frame change, the preview stream data corresponding to the second frame of the two frames corresponding to that degree of change can be acquired.
As shown in fig. 8, the process of capturing the blooming cereus between about 01. and 02. is again taken as an example, this time with adaptive sampling based on the degree of inter-frame change.
Sampling is performed at positions around 01. where the degree of inter-frame change is greater than the preset threshold.
When the degree of inter-frame change remains continuously below the preset threshold, sampling is performed based on a fixed sampling rate such as 1/600, that is, 1 time every 20 seconds, for example around 01.
In this way, the corresponding shot pictures shown in fig. 8 can be acquired at around 01. and around 02., respectively.
Sampling is performed at positions around 02. where the degree of inter-frame change is greater than the preset threshold; and when the degree of inter-frame change around 02. falls back below the preset threshold, the terminal device may return to large-interval sampling at the fixed sampling rate.
It can be understood that the adaptive sampling method shown in fig. 8 can not only present the complete process of the epiphyllum from starting to open to full bloom, but also collect the wonderful pictures and detail changes around 02., when the epiphyllum blooms.
Based on this, the adaptive sampling described above performs dense data sampling when the degree of inter-frame change is large and large-interval data sampling in intervals where the degree of change is small. As a result, during video playback, segments with large changes are played with more frames, which slows down the apparent rate of change of the picture, while segments with little or no change are played with fewer frames and therefore appear accelerated. This adjusts the playback rhythm of the whole video and enhances its depth and sense of dynamics.
And S405, the terminal equipment performs registration/smoothing processing on the sampling sequence.
In the embodiment of the present application, the registration/smoothing may be understood as a process of rotating, translating, or matching two or more images acquired under different conditions.
It can be understood that due to the change of the content between frames or the jitter of the terminal device, there may be a problem of picture jump during video playing. Therefore, the terminal device can perform registration, smoothing and other processing on the sample sequence obtained through the adaptive sampling in S404, so as to ensure smooth picture during video playing.
In this embodiment of the application, the terminal device may perform registration/smoothing processing on the adaptively sampled preview stream data by using methods such as the Speeded-Up Robust Features (SURF) algorithm. For example, the SURF-based principle can be understood as extracting key points of adjacent frames and performing operations such as rotation or translation on the key points to align the registration frame to the reference frame. The reference frame may be the first frame of the two adjacent frames, and the registration frame may be the second frame of the two adjacent frames.
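A rough sketch of registering one sampled frame to its predecessor is shown below. OpenCV's SURF implementation lives in the opencv-contrib package, so ORB features are used here as a freely available stand-in for the SURF-based matching the embodiment describes; the feature count and match limit are assumptions. The black border produced by the warp corresponds to the unregistered region discussed with fig. 9 and can be cropped or filled afterwards.

    import cv2
    import numpy as np

    def register_to_reference(reference_bgr, frame_bgr):
        # Detect and match keypoints between the reference frame and the frame to be registered.
        orb = cv2.ORB_create(nfeatures=1000)
        ref_gray = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)
        cur_gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        kp_ref, des_ref = orb.detectAndCompute(ref_gray, None)
        kp_cur, des_cur = orb.detectAndCompute(cur_gray, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_cur, des_ref), key=lambda m: m.distance)[:200]

        src = np.float32([kp_cur[m.queryIdx].pt for m in matches])
        dst = np.float32([kp_ref[m.trainIdx].pt for m in matches])
        # Estimate a rotation/translation/scale that aligns the registration frame to the reference frame.
        matrix, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)

        h, w = reference_bgr.shape[:2]
        aligned = cv2.warpAffine(frame_bgr, matrix, (w, h))
        return aligned   # border areas that could not be registered come out black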
Fig. 9 is a schematic diagram illustrating a principle of registration and smoothing of adjacent frame images according to an embodiment of the present application. As shown in fig. 9, a in fig. 9 may be a reference frame in an adjacent frame, b in fig. 9 may be a registration frame in the adjacent frame, and c in fig. 9 may be a registration result after performing a registration operation on the registration frame with reference to the reference frame.
As shown in fig. 9, due to the change of the shooting angle, the registration frame may have a certain offset or rotation compared with the reference frame, so the registration frame may be translated or rotated until it is consistent with the reference frame. For example, c in fig. 9 may be the picture obtained when b in fig. 9 is rotated, taking a in fig. 9 as the reference, until it is consistent with the image of a in fig. 9. The region where a in fig. 9 coincides with the rotated b in fig. 9 may be the region corresponding to the dashed box 901 in c in fig. 9.
It can be understood that, when b in fig. 9 is rotated, certain regions may not be matched to a in fig. 9; therefore, the region outside the dashed box 901 shown in c in fig. 9 cannot be registered. The terminal device may crop the region that cannot be registered and recover the cropped region based on methods such as bilinear interpolation, so as to obtain a relatively smooth registered image. For example, the terminal device may restore the dashed box 901 shown in c in fig. 9 to the picture size shown in a in fig. 9: the terminal device may take the average of the pixel points outside the dashed box 901 in c in fig. 9 and the corresponding pixel points outside the dashed box 901 in a in fig. 9 as the pixel points outside the dashed box 901 in the restored image, thereby recovering an image whose picture size is the same as that of a in fig. 9. In this averaging, the pixel points of the region outside the dashed box 901 in c in fig. 9 may be taken as 0.
It can be understood that, as shown by the dashed box 901 in c in fig. 9, the above registration and smoothing processing may cause a certain loss of field of view (FOV). The terminal device may therefore use a wide-angle lens or an ultra-wide-angle lens, or apply a preview distortion algorithm after the step shown in S405, to ensure that data with a larger FOV is input, so that the FOV of the picture obtained after cropping a small area can be close to or better than the FOV of the uncropped picture. The FOV can be understood as the range covered by the lens.
Further, to ensure that the input size of the video playing data does not change, the terminal device may scale the data after the above registration and smoothing processing (which can be understood as data of the preview-thumbnail size) up to the standard picture size of the video. The standard picture size may be 720p or 1080p.
And S406, the terminal equipment adjusts the dynamic range of the data after the registration/smoothing processing.
In this embodiment of the application, since the data after the smoothing processing may contain overexposed regions or over-dark regions, the terminal device may adopt dynamic range adjustment to enhance the information of the image. For example, the terminal device may optimize the brightness of a single-frame image by using an artificial intelligence (AI) HDR method or a tone mapping method, thereby implementing dynamic range adjustment of the picture.
For example, an overexposed region may be determined by whether the ratio of the number of highlight pixels in the image to the number of all pixels in the image is greater than a preset pixel-number threshold; an over-dark region may be determined by whether the ratio of the number of over-dark pixels in the image to all pixels in the image is greater than another preset pixel-number threshold. The over-dark pixels may be determined based on another gray threshold of the pixel points, and that other pixel threshold may be used to determine whether the current region is an over-dark region. The method for determining the highlight pixels and the pixel threshold is the same as that in the step shown in S402, and is not described here again.
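The over-/under-exposure test and a single-frame brightness adjustment along these lines could be approximated as follows. The gray thresholds, the area ratio, and the use of CLAHE plus gamma correction are assumptions standing in for the AI HDR or tone-mapping step the embodiment mentions; they are not the embodiment's actual algorithm.

    import cv2
    import numpy as np

    BRIGHT_GRAY, DARK_GRAY = 220, 35      # assumed gray thresholds for highlight / over-dark pixels
    AREA_RATIO = 0.10                     # assumed ratio marking an over- or under-exposed frame

    def adjust_dynamic_range(frame_bgr):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        over = np.mean(gray > BRIGHT_GRAY) > AREA_RATIO
        under = np.mean(gray < DARK_GRAY) > AREA_RATIO
        if not (over or under):
            return frame_bgr

        # Local contrast enhancement on the luminance channel.
        lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
        l, a, b = cv2.split(lab)
        l = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(l)
        out = cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)

        # Simple global gamma: brighten under-exposed frames, darken over-exposed ones.
        gamma = 0.8 if under else 1.25
        table = np.array([(i / 255.0) ** gamma * 255 for i in range(256)]).astype(np.uint8)
        return cv2.LUT(out, table)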
S407, the terminal device obtains two HDR images.
In the embodiment of the present application, of the two frames of images in an HDR scene, one frame may be normal frame data acquired by the terminal device based on delayed photography, and the other frame may be a data frame used for adjusting the brightness of the normal frame data; for example, that data frame may be a short frame used for suppressing the overexposed areas in the normal frame, or a long frame used for brightening the over-dark areas in the normal frame.
For example, in a high-dynamic scene, the terminal device obtains the photo stream data based on time-lapse photography and downsamples it to the size of the preview stream data; for example, photo stream data of about 1,000,000 pixels may be downsampled to a preview-stream size such as 720p. Further, the terminal device generates two frames of HDR images with different exposure degrees based on the exposure degree of the preview-sized data obtained by this downsampling.
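How the two differently exposed frames are derived is not spelled out here beyond being based on the exposure degree of the downsampled data, so the sketch below simply resizes the photographing-stream frame to the preview size and simulates a short or long exposure with digital gain. The gain factors, the 1280x720 target size, and the mean-brightness switch are assumptions for illustration only.

    import cv2
    import numpy as np

    def make_exposure_pair(photo_bgr, preview_size=(1280, 720), short_gain=0.6, long_gain=1.6):
        # Downsample the photographing-stream frame to the preview-stream picture size.
        normal = cv2.resize(photo_bgr, preview_size, interpolation=cv2.INTER_AREA)

        mean_gray = cv2.cvtColor(normal, cv2.COLOR_BGR2GRAY).mean()
        if mean_gray > 128:
            # Bright frame: pair it with a simulated short (darker) frame to suppress overexposure.
            partner = np.clip(normal.astype(np.float32) * short_gain, 0, 255).astype(np.uint8)
        else:
            # Dark frame: pair it with a simulated long (brighter) frame to lift over-dark regions.
            partner = np.clip(normal.astype(np.float32) * long_gain, 0, 255).astype(np.uint8)
        return normal, partner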
And S408, the terminal equipment performs exposure fusion on the two frames of HDR images.
In the embodiment of the present application, exposure fusion may be understood as a method of fusing two frames of HDR images obtained with different exposure parameters into one frame of HDR image. For example, the terminal device may perform exposure fusion based on algorithms such as a brightness gradient method, a bilateral filtering method, or a Laplacian pyramid; alternatively, the terminal device may implement image exposure fusion based on a trained neural network model, which is not limited in the embodiment of the present application.
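As one concrete option of the pyramid-style methods mentioned above, OpenCV ships a Mertens exposure-fusion implementation that can merge the two frames from the previous step without exposure times or a camera response curve. This is only one possible realization, not necessarily the one used by the embodiment.

    import cv2
    import numpy as np

    def fuse_exposures(frames_bgr):
        # Mertens fusion weighs each pixel by contrast, saturation, and well-exposedness.
        merger = cv2.createMergeMertens()
        fused = merger.process(frames_bgr)                 # input: list of uint8 BGR frames of equal size
        return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

    # e.g. fused_frame = fuse_exposures([normal_frame, partner_frame])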
In a possible implementation manner, the terminal device may perform fixed-interval sampling on the data after the exposure fusion processing, or may perform adaptive sampling on the data after the exposure fusion processing based on the method shown in S404, to obtain a sampling sequence. It can be understood that, in a high-dynamic scene, performing the sampling processing with the fixed-interval sampling method can reduce the complexity of the algorithm in the high-dynamic scene.
And S409, the terminal equipment performs registration/smoothing processing on the data after the fusion processing.
In this embodiment of the application, the process of performing the registration/smoothing processing on the data after the fusion processing in the step shown in S409 is similar to the process of performing the registration/smoothing processing in the step shown in S405, and is not described herein again.
It can be understood that the video processing method based on the high dynamic scene shown in S407-S409 may be applied to the delayed shooting function, and may also be applied to other video recording functions for obtaining a better recorded picture, which is not limited in the embodiment of the present application.
And S410, the terminal equipment stores the delayed shooting data.
For example, as shown in b in fig. 5, when the terminal device receives an operation in which the user triggers the stop delayed photography control 506, the terminal device may encode the multi-frame shot pictures obtained in the steps shown in S403-S406 in a common scene, or the multi-frame shot pictures obtained in the steps shown in S407-S409 in a high-dynamic scene, into the time-lapse video in time order.
Based on the method, the terminal equipment can perform processing processes such as self-adaptive sampling and the like based on the difference between adjacent frames in the preview stream data, so as to better embody the wonderful segments in the shooting process; in addition, HDR is introduced into time-lapse photography, and based on the processing of HDR images, more real pictures can be displayed in time-lapse photography, so that a better time-lapse video effect is obtained.
On the basis of the embodiment corresponding to fig. 4, in a possible implementation manner, when the user looks for a delayed shooting video in the gallery, the terminal device may display an identifier for the delayed shooting video obtained by the high-dynamic-scene processing method.
For example, fig. 10 is another interface schematic diagram provided in the embodiment of the present application. In the embodiment corresponding to fig. 10, a terminal device is taken as an example for illustration, and the example does not limit the embodiment of the present application.
When the terminal device receives an operation of the user opening the gallery function, the terminal device may display the interface shown in fig. 10, which may include a control 1001 for opening more functions, and a plurality of photos and videos, such as video 1 taken today and photo 1, photo 2, and photo 3 taken yesterday. Around video 1, a play control corresponding to video 1 and an HDR identifier 1002 indicating that video 1 was obtained in a high-dynamic scene are displayed.
Based on this, the terminal device can provide an identifier for identifying whether the video is processed in the high-dynamic scene, and then the user can clearly determine which video of the plurality of videos is obtained based on the processing of the high-dynamic scene.
It should be understood that the interface of the terminal device provided in the embodiment of the present application is only an example, and is not limited to the embodiment of the present application.
Based on the content described in the foregoing embodiments, to better understand the embodiments of the present application, fig. 11 is a schematic flowchart of another time-lapse shooting method provided in the embodiments of the present application.
As shown in fig. 11, the time-lapse photographing method may include the steps of:
S1101, the terminal device displays a first interface; the first interface includes a first control, and the first interface is used for recording the delayed shooting video.
In this embodiment of the application, the first interface may be an interface as shown in a in fig. 5, and the first control may be an open time-lapse photography control 501 in the interface as shown in a in fig. 5.
S1102, the terminal device receives a first operation aiming at the first control.
In the embodiment of the present application, the first operation may be a click operation or a long-press operation.
S1103, in response to the first operation, the terminal equipment acquires preview stream data and photo stream data acquired by the camera.
In the embodiment of the application, the preview stream data can be used for generating a delayed shooting video in a non-HDR scene; the photo stream data can be used to generate a time-lapse video in an HDR scene.
S1104, the terminal device judges whether the preview stream data belongs to the HDR scene with the high dynamic range.
The HDR scene may be understood as a scene in which the proportion of HDR images in the multi-frame data acquired based on the preview thumbnail exceeds a first threshold.
S1105, when the terminal device determines that the preview stream data does not belong to the HDR scene, the terminal device extracts the first image frame from the preview stream data based on the degree of interframe change, and generates an image frame sequence including a plurality of first image frames.
Optionally, S1104 includes: terminal equipment carries out multiple down sampling on preview stream data to obtain a preview thumbnail; the terminal equipment acquires multi-frame preview data in the preview thumbnail; the terminal equipment judges whether preview stream data belong to an HDR scene or not based on the proportion of a first image in multi-frame preview data; the first image is an image with the proportion of the highlight pixels exceeding a first threshold, and the highlight pixels are pixel points with the gray value larger than a gray threshold.
In the embodiment of the present application, the multiple sampling may be 4 times sampling, and the first image may be understood as an HDR image.
Optionally, S1105 includes: s11051, the terminal device calculates the interframe change degree of the first adjacent frame in the preview thumbnail. The first adjacent frame includes a first frame and a second frame. S11052, the terminal device extracts the first image frame from the preview stream data based on the degree of interframe variation of the first adjacent frame, and generates an image frame sequence including a plurality of first image frames.
In this embodiment, the first adjacent frame may be an adjacent frame in the preview thumbnail.
Optionally, S11052 includes: s110521, the terminal device extracts the first image frame from the preview stream data based on the interframe change degree of the first adjacent frame, and obtains the preview stream data after sampling processing. And S110522, the terminal equipment performs registration processing on the preview stream data after sampling processing to obtain the preview stream data after registration processing. S110523, the terminal device performs brightness adjustment on the preview stream data after the registration processing, and generates an image frame sequence including a plurality of first image frames.
Optionally, S110521 includes: when the interframe change degree is larger than a preset threshold value, the terminal equipment extracts preview stream data corresponding to a second frame to obtain the preview stream data after sampling processing; or when the interframe change degree is smaller than a preset threshold value, the terminal equipment extracts the first image frame from the preview stream data according to a fixed sampling rate to obtain the preview stream data after sampling processing.
It is to be understood that the plurality of frames of data in the preview thumbnail correspond to the plurality of frames of preview stream data, and the preview stream data corresponding to the second frame may be understood as the preview stream data corresponding to the second frame of image in the first adjacent frame in the plurality of frames of preview stream data when the degree of inter-frame variation in the first adjacent frame of the preview thumbnail is greater than the preset threshold.
Optionally, the degree of interframe variation includes at least one of: a change value of a pixel point between first adjacent frames, a change value of a gray value between the first adjacent frames, or a change value of an average value of gray values between the first adjacent frames.
Optionally, S110522 includes: the terminal equipment acquires a second adjacent frame in the sampled and processed preview stream data; the second adjacent frame comprises a third frame and a fourth frame; the terminal equipment aligns the fourth frame to the third frame, cuts out the unalignable area in the fourth frame, and obtains the cut fourth frame; and the terminal equipment performs interpolation processing on the cut fourth frame based on the third frame to obtain the preview stream data after registration processing.
In the embodiment of the present application, the third frame may be understood as a reference frame, the fourth frame may be understood as a registration frame, and the interpolation process may be understood as restoring the clipped fourth frame to the picture size corresponding to the third frame by using an interpolation algorithm.
Optionally, the method further includes: and S1106, when the terminal device determines that the preview stream data belongs to the HDR scene, the terminal device processes the shot stream data to generate an image frame sequence containing a plurality of second image frames.
Optionally, S1106 includes: s11061, the terminal equipment carries out image fusion processing on the photographing stream data to obtain the photographing stream data after the fusion processing; s11062, terminal equipment
Extracting a second image frame from the photo stream data after the fusion processing to obtain the photo stream data after the sampling processing; and S11063, the terminal equipment performs registration processing on the sampled photographing stream data to obtain an image frame sequence containing a plurality of second image frames.
Optionally, S11061 includes: the terminal equipment processes the photo stream data into data corresponding to the size of the picture of the preview stream data to obtain the photo stream data after the picture size processing; when the terminal equipment determines that the picture brightness of the shot stream data after picture size processing exceeds a first brightness threshold value, the terminal equipment generates a first data frame and a second data frame; and the terminal equipment fuses the first data frame and the second data frame to obtain the photo stream data after fusion processing.
Optionally, S11061 further includes: when the terminal equipment determines that the picture brightness of the shot stream data after picture size processing does not exceed a second brightness threshold value, the terminal equipment generates a first data frame and a third data frame; the first brightness threshold is greater than the second brightness threshold; and the terminal equipment fuses the first data frame and the third data frame to obtain the photo stream data after fusion processing.
Optionally, S11063 includes: the terminal equipment acquires a third adjacent frame in the sampled and processed photographing stream data; the third adjacent frame comprises a fifth frame and a sixth frame; the terminal equipment aligns the sixth frame to the fifth frame, cuts out the unaligned area in the sixth frame, and obtains the cut sixth frame; and the terminal equipment performs interpolation processing on the cut sixth frame based on the fifth frame to obtain an image frame sequence containing a plurality of second image frames.
In the embodiment of the present application, the fifth frame may be understood as a reference frame, the sixth frame may be understood as a registration frame, and the interpolation process may be understood as restoring the cropped sixth frame to the picture size corresponding to the fifth frame by using an interpolation algorithm.
Optionally, the method further includes: the terminal equipment displays a second interface; the second interface comprises a second control; the terminal equipment receives a second operation aiming at the second control; in response to the second operation, the terminal device encodes an image frame sequence containing a plurality of first image frames and/or an image frame sequence containing a plurality of second image frames into the time-lapse photography video.
In this embodiment, the second interface may be an interface shown in b in fig. 5, and the second control may be a stop delay shooting control 506; the second operation may be a click operation, a long press operation, or the like.
Optionally, the method further includes: receiving a third operation of opening the time-delay shooting video file by the user; the terminal equipment displays a third interface; the third interface comprises a delayed shooting file and a first identifier corresponding to the delayed shooting file; the first identifier is used to indicate a video type of the delayed photographic video file.
In the embodiment of the application, the third operation may be a click operation, a long-press operation, or the like; the third interface may be an interface such as that shown in fig. 10, and the first identifier may be the HDR identifier of a high-dynamic scene.
The method provided by the embodiment of the present application is described above with reference to fig. 3 to fig. 11, and the apparatus provided by the embodiment of the present application for performing the method is described below. As shown in fig. 12, fig. 12 is a schematic structural diagram of a delay shooting device provided in this embodiment of the present application, where the delay shooting device may be a terminal device in this embodiment of the present application, and may also be a chip or a chip system in the terminal device.
As shown in fig. 12, the time-lapse photographing apparatus 120 may be used in a communication device, a circuit, a hardware component, or a chip, and includes: a display unit 1201, and a processing unit 1202, and the like. The display unit 1201 is used for supporting the display step executed by the delayed shooting method; the processing unit 1202 is configured to support a step of the time-lapse photographing apparatus performing information processing.
The processing unit 1202 and the display unit 1201 may be integrated, and communication may occur between the processing unit 1202 and the display unit 1201.
In a possible implementation manner, the time-lapse photographing apparatus may further include: a storage unit 1203. The storage unit 1203 may include one or more memories, and the memories may be devices in one or more devices or circuits for storing programs or data.
The storage unit 1203 may be separate and connected to the processing unit 1202 by a communication bus. The storage unit 1203 may also be integrated with the processing unit 1202.
Taking the example that the time-lapse photographing apparatus may be a chip or a chip system of the terminal device in the embodiment of the present application, the storage unit 1203 may store a computer-executable instruction of the method of the terminal device, so that the processing unit 1202 executes the method of the terminal device in the above embodiment. The storage unit 1203 may be a register, a cache memory, a Random Access Memory (RAM), or the like, and the storage unit 1203 may be integrated with the processing unit 1202. The storage unit 1203 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and the storage unit 1203 may be separate from the processing unit 1202.
In one possible implementation, the time-lapse photographing apparatus may further include: a communication unit 1204. The communication unit 1204 is used to support the time-lapse photographing apparatus to interact with other devices. Illustratively, when the time-lapse photographing apparatus is a terminal device, the communication unit 1204 may be a communication interface or an interface circuit. When the delay time photographing device is a chip or a chip system in a terminal apparatus, the communication unit 1204 may be a communication interface. For example, the communication interface may be an input/output interface, a pin or a circuit, etc.
The apparatus of this embodiment may be correspondingly used to perform the steps performed in the method embodiments, and the implementation principle and technical effects are similar, which are not described herein again.
Fig. 13 is a schematic diagram of a hardware structure of a control device according to an embodiment of the present application. As shown in fig. 13, the control device includes a processor 1301, a communication line 1304, and at least one communication interface (the communication interface 1303 in fig. 13 is taken as an example for description).
The processor 1301 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of the present application.
Communication lines 1304 may include circuitry to transfer information between the above-described components.
Communication interface 1303 may be implemented using any transceiver or the like for communicating with other devices or communication networks, such as ethernet, wireless Local Area Networks (WLANs), etc.
Possibly, the control device may also comprise a memory 1302.
The memory 1302 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disk read-only memory (CD-ROM) or other optical disk storage, optical disk storage (including compact disk, laser disk, optical disk, digital versatile disk, blu-ray disk, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via communication line 1304. The memory may also be integral to the processor.
The memory 1302 is used for storing computer-executable instructions for executing the present invention, and is controlled by the processor 1301 to execute the instructions. The processor 1301 is configured to execute the computer executable instructions stored in the memory 1302, thereby implementing the methods provided by the embodiments of the present application.
Possibly, the computer executed instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
In particular implementations, processor 1301 may include one or more CPUs, such as CPU0 and CPU1 in fig. 13, as one embodiment.
In particular implementations, for one embodiment, the control device may include multiple processors, such as processor 1301 and processor 1305 in fig. 13. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores that process data (e.g., computer program instructions).
Exemplarily, fig. 14 is a schematic structural diagram of a chip provided in an embodiment of the present application. Chip 140 includes one or more (including two) processors 1420 and a communication interface 1430.
In some embodiments, memory 1440 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In the illustrated embodiment, memory 1440 may include both read-only memory and random-access memory, and may provide instructions and data to processor 1420. A portion of the memory 1440 may also include non-volatile random access memory (NVRAM).
In the illustrated embodiment, memory 1440, communication interface 1430, and processor 1420 are coupled together by bus system 1410. The bus system 1410 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are identified in FIG. 14 as bus system 1410.
The method described in the embodiment of the present application may be applied to the processor 1420, or implemented by the processor 1420. Processor 1420 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 1420. The processor 1420 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an FPGA (field-programmable gate array) or other programmable logic device, discrete gate, transistor logic device or discrete hardware component, and the processor 1420 may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present invention.
The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium mature in the field, such as a random access memory, a read-only memory, a programmable read-only memory, or a charged erasable programmable memory (EEPROM). The storage medium is located in the memory 1440, and the processor 1420 reads the information in the memory 1440 and performs the steps of the above-described method in conjunction with the hardware thereof.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are generated in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, microwave, etc.).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can communicate a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disk read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk storage; the computer readable medium may include a disk memory or other disk storage device. Also, any connecting line may also be properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, includes Compact Disc (CD), laser disc, optical disc, digital Versatile Disc (DVD), floppy disk and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The above description is only of specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any person skilled in the art can easily conceive of changes or substitutions within the technical scope disclosed by the present invention, which shall all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (13)

1. A method of time-lapse photography, the method comprising:
the terminal equipment displays a first interface; the first interface comprises a first control, and the first interface is used for recording the delayed photography video;
the terminal equipment receives a first operation aiming at the first control;
responding to the first operation, the terminal equipment acquires preview stream data and photographing stream data acquired by a camera;
the terminal equipment performs multiple downsampling on the preview stream data to obtain a preview thumbnail;
the terminal equipment acquires multi-frame preview data in the preview small picture;
the terminal equipment judges whether the preview stream data belongs to an HDR scene or not based on the proportion of the first image in the multi-frame preview data; the first image is an image with the proportion of highlight pixels exceeding a first threshold, and the highlight pixels are pixel points with gray values larger than a gray threshold;
when the terminal device determines that the preview stream data does not belong to the HDR scene, the terminal device calculates the interframe change degree of a first adjacent frame in the preview small picture; the first adjacent frame comprises a first frame and a second frame;
when the interframe change degree of the first adjacent frame is greater than a preset threshold value, the terminal equipment extracts preview stream data corresponding to the second frame to obtain the preview stream data after sampling processing;
when the interframe change degree of the first adjacent frame is smaller than a preset threshold value, the terminal equipment extracts a first image frame from the preview stream data according to a fixed sampling rate to obtain the preview stream data after sampling processing;
and generating an image frame sequence containing a plurality of first image frames according to the sampled preview stream data.
2. The method of claim 1, wherein said generating the image frame sequence comprising the first plurality of image frames from the sample processed preview stream data comprises:
the terminal equipment performs registration processing on the sampled preview stream data to obtain the preview stream data after the registration processing;
and the terminal equipment performs brightness adjustment on the preview stream data after the registration processing to generate the image frame sequence containing the plurality of first image frames.
3. The method of claim 1, wherein the degree of interframe variation comprises at least one of: a change value of a pixel point between the first adjacent frames, a change value of a gray value between the first adjacent frames, or a change value of an average value of gray values between the first adjacent frames.
4. The method according to claim 3, wherein the terminal device performs registration processing on the sampled preview stream data to obtain the registered preview stream data, and the registration processing includes:
the terminal equipment acquires a second adjacent frame in the sampled preview stream data; the second adjacent frame comprises a third frame and a fourth frame;
the terminal equipment aligns the fourth frame to the third frame, cuts out the unaligned area in the fourth frame, and obtains a cut fourth frame;
and the terminal equipment performs interpolation processing on the cut fourth frame based on the third frame to obtain preview stream data after registration processing.
5. The method of claim 1, further comprising:
when the terminal device determines that the preview stream data belongs to the HDR scene, the terminal device processes the photo stream data to generate an image frame sequence containing a plurality of second image frames.
6. The method of claim 5, wherein the processing of the camera stream data by the terminal device to generate an image frame sequence comprising a plurality of second image frames comprises:
the terminal equipment performs image fusion processing on the photographing stream data to obtain the photographing stream data after the fusion processing;
the terminal equipment extracts the second image frame from the photo taking stream data after the fusion processing to obtain the photo taking stream data after the sampling processing;
and the terminal equipment performs registration processing on the sampled photographing stream data to obtain the image frame sequence containing a plurality of second image frames.
7. The method according to claim 5, wherein the step of the terminal device performing image fusion processing on the photo stream data to obtain the photo stream data after the fusion processing includes:
the terminal equipment processes the photo stream data into data corresponding to the size of the picture of the preview stream data to obtain the photo stream data after the picture size processing;
when the terminal device determines that the picture brightness of the photo stream data after the picture size processing exceeds a first brightness threshold, the terminal device generates a first data frame and a second data frame;
and the terminal equipment fuses the first data frame and the second data frame to obtain the photo stream data after fusion processing.
8. The method of claim 7, further comprising:
when the terminal equipment determines that the picture brightness of the shot stream data after the picture size processing does not exceed a second brightness threshold value, the terminal equipment generates the first data frame and a third data frame; the first brightness threshold is greater than the second brightness threshold;
and the terminal equipment fuses the first data frame and the third data frame to obtain the photo stream data after fusion processing.
9. The method according to claim 6, wherein the terminal device performs registration processing on the sampled camera stream data to obtain the image frame sequence including a plurality of second image frames, and includes:
the terminal equipment acquires a third adjacent frame in the sampled and processed photographing stream data; the third adjacent frame comprises a fifth frame and a sixth frame;
the terminal equipment aligns the sixth frame to the fifth frame, cuts out the unaligned area in the sixth frame, and obtains a cut sixth frame;
and the terminal equipment performs interpolation processing on the cut sixth frame based on the fifth frame to obtain the image frame sequence containing a plurality of second image frames.
10. The method of claim 5, further comprising:
the terminal equipment displays a second interface; the second interface comprises a second control;
the terminal equipment receives a second operation aiming at the second control;
in response to the second operation, the terminal device encodes the image frame sequence containing a plurality of the first image frames and/or the image frame sequence containing a plurality of the second image frames as a time-lapse photography video.
11. The method of claim 1, further comprising:
receiving a third operation of opening the time-delay shooting video file by the user;
the terminal equipment displays a third interface; the third interface comprises the delayed shooting file and a first identifier corresponding to the delayed shooting file; the first identifier is used for indicating the video type of the delayed shooting video file.
12. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the electronic device to perform the method of any of claims 1 to 11.
13. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, causes a computer to carry out the method according to any one of claims 1 to 11.
CN202110849860.2A 2021-07-27 2021-07-27 Time-delay shooting method and device Active CN113810596B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110849860.2A CN113810596B (en) 2021-07-27 2021-07-27 Time-delay shooting method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110849860.2A CN113810596B (en) 2021-07-27 2021-07-27 Time-delay shooting method and device

Publications (2)

Publication Number Publication Date
CN113810596A CN113810596A (en) 2021-12-17
CN113810596B true CN113810596B (en) 2023-01-31

Family

ID=78893147

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110849860.2A Active CN113810596B (en) 2021-07-27 2021-07-27 Time-delay shooting method and device

Country Status (1)

Country Link
CN (1) CN113810596B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115564659B (en) * 2022-02-28 2024-04-05 荣耀终端有限公司 Video processing method and device
CN115526788A (en) * 2022-03-18 2022-12-27 荣耀终端有限公司 Image processing method and device
CN116708753B (en) * 2022-12-19 2024-04-12 荣耀终端有限公司 Method, device and storage medium for determining preview blocking reason
CN117082225B (en) * 2023-10-12 2024-02-09 山东海量信息技术研究院 Virtual delay video generation method, device, equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104253941A (en) * 2013-06-26 2014-12-31 卡西欧计算机株式会社 Moving image generating apparatus and moving image generating method
CN104349060A (en) * 2013-08-06 2015-02-11 卡西欧计算机株式会社 Image processing apparatus for time-lapse moving image, image processing method, and storage medium
JP2015053670A (en) * 2013-08-06 2015-03-19 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
CN106161967A (en) * 2016-09-13 2016-11-23 维沃移动通信有限公司 A kind of backlight scene panorama shooting method and mobile terminal
CN106713778A (en) * 2016-12-28 2017-05-24 上海兴芯微电子科技有限公司 Exposure control method and device
CN107277387A (en) * 2017-07-26 2017-10-20 维沃移动通信有限公司 High dynamic range images image pickup method, terminal and computer-readable recording medium
CN110086985A (en) * 2019-03-25 2019-08-02 华为技术有限公司 Time-lapse photography recording method and electronic equipment
JP2020077964A (en) * 2018-11-07 2020-05-21 キヤノン株式会社 Imaging apparatus and control method thereof
CN112532859A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Video acquisition method and electronic equipment
CN112532857A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Shooting method and equipment for delayed photography

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002016876A (en) * 2000-06-27 2002-01-18 Toshiba Corp Time lapse recording video encoder and method for generating data
US9992443B2 (en) * 2014-05-30 2018-06-05 Apple Inc. System and methods for time lapse video acquisition and compression
US10170157B2 (en) * 2015-06-07 2019-01-01 Apple Inc. Method and apparatus for finding and using video portions that are relevant to adjacent still images
KR102527811B1 (en) * 2015-12-22 2023-05-03 삼성전자주식회사 Apparatus and method for generating time lapse image
US10771712B2 (en) * 2017-09-25 2020-09-08 Gopro, Inc. Optimized exposure temporal smoothing for time-lapse mode

Also Published As

Publication number Publication date
CN113810596A (en) 2021-12-17

Similar Documents

Publication Publication Date Title
CN110086985B (en) Recording method for delayed photography and electronic equipment
CN113810596B (en) Time-delay shooting method and device
JP7403551B2 (en) Recording frame rate control method and related equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN115086567B (en) Time delay photographing method and device
CN113810604B (en) Document shooting method, electronic device and storage medium
CN115526787B (en) Video processing method and device
CN112954251B (en) Video processing method, video processing device, storage medium and electronic equipment
CN113747058B (en) Image content shielding method and device based on multiple cameras
EP4318383A1 (en) Video processing method and apparatus
CN113572948B (en) Video processing method and video processing device
CN114429495B (en) Three-dimensional scene reconstruction method and electronic equipment
CN113593567A (en) Method for converting video and sound into text and related equipment
CN116055894B (en) Image stroboscopic removing method and device based on neural network
CN115426449B (en) Photographing method and terminal
CN115529411A (en) Video blurring method and device
CN116703995A (en) Video blurring processing method and device
CN112348738B (en) Image optimization method, image optimization device, storage medium and electronic equipment
CN115022526B (en) Full depth image generation method and device
CN115526788A (en) Image processing method and device
CN117278855B (en) Video anti-shake method and related equipment
CN115297269B (en) Exposure parameter determination method and electronic equipment
WO2023077938A1 (en) Video frame generation method and apparatus, electronic device, and storage medium
EP4246955A1 (en) Image processing method and electronic device
CN117911299A (en) Video processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant