CN109547697B - Dynamic image shooting method and terminal equipment

Info

Publication number
CN109547697B
Authority
CN
China
Prior art keywords
input
shooting
sub
speed
user
Prior art date
Legal status
Active
Application number
CN201811554763.5A
Other languages
Chinese (zh)
Other versions
CN109547697A (en)
Inventor
刘易家
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811554763.5A
Publication of CN109547697A
Application granted
Publication of CN109547697B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces

Abstract

The embodiment of the invention provides a dynamic image shooting method and a terminal device, relates to the field of communication technology, and aims to solve the problem that dynamic images shot by existing terminal devices present poor dynamic effects. The method comprises the following steps: receiving a first input of a user on a shooting preview interface; in response to the first input, performing a shooting operation at a target shooting speed corresponding to an input parameter of the first input, and outputting target data; wherein the input parameter comprises at least one of input direction, input distance, input track and input times. The method can be applied to shooting scenarios on terminal devices.

Description

Dynamic image shooting method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a dynamic image shooting method and terminal equipment.
Background
As terminal devices are applied in an ever wider range of scenarios, users can use them to shoot dynamic images, which may include videos or dynamic pictures. A dynamic image is an image in which a group of specific static images is displayed at a specified frequency to produce a dynamic display effect.
At present, in the process of shooting a dynamic image, a user can tap a shooting control to trigger the terminal device to shoot a video or dynamic picture at the normal speed, and the shot video or dynamic picture is stored in the album of the terminal device.
However, with this shooting method, a video or dynamic picture shot at the normal speed can only be played back at the normal speed, which results in a poor dynamic effect for the dynamic images shot by the terminal device.
Disclosure of Invention
The embodiment of the invention provides a dynamic image shooting method and terminal equipment, and aims to solve the problem that dynamic effects of dynamic images shot by the existing terminal equipment are poor.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a dynamic image shooting method, which is applied to a terminal device, and the method includes: receiving a first input of a user on a shooting preview interface; responding to the first input, executing shooting operation according to the target shooting speed corresponding to the input parameter of the first input, and outputting target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times.
In a second aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes a receiving module and a shooting module. The receiving module is used for receiving a first input of a user on the shooting preview interface. The shooting module is used for responding to the first input received by the receiving module, executing shooting operation according to the target shooting speed corresponding to the input parameter of the first input and outputting target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times.
In a third aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and the computer program, when executed by the processor, implements the steps of the moving image shooting method in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the moving image capturing method in the first aspect described above.
In the embodiment of the invention, a first input of a user on a shooting preview interface can be received; in response to the first input, a shooting operation is performed at a target shooting speed corresponding to an input parameter of the first input, and target data is output, wherein the input parameter comprises at least one of input direction, input distance, input track and input times. With this scheme, the target shooting speed can be determined from a preset correspondence between shooting speeds and input parameters, and different input parameters lead to different target shooting speeds. The shooting operation can therefore be performed at different shooting speeds (for example a fast speed, the normal speed or a slow speed), the shooting speed can be set flexibly, and the dynamic images shot at different speeds present rich dynamic effects, which improves the dynamic effect of the dynamic images shot by the terminal device.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a dynamic image capturing method according to an embodiment of the present invention;
fig. 3 is one of schematic interfaces of an application of a dynamic image shooting method according to an embodiment of the present invention;
FIG. 4 is a second schematic interface diagram of an application of the dynamic image capturing method according to the embodiment of the present invention;
fig. 5 is a third schematic interface diagram of an application of the dynamic image shooting method according to the embodiment of the present invention;
fig. 6 is a second schematic diagram of a dynamic image capturing method according to an embodiment of the invention;
FIG. 7 is a fourth schematic interface diagram of an application of the dynamic image capturing method according to the embodiment of the present invention;
FIG. 8 is a fifth schematic view of an interface applied to the dynamic image capturing method according to the embodiment of the present invention;
FIG. 9 is a sixth schematic view of an interface applied to a dynamic image capturing method according to an embodiment of the present invention;
fig. 10 is a third schematic diagram of a dynamic image shooting method according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 12 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 13 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein is an association relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. The symbol "/" herein denotes a relationship in which the associated object is or, for example, a/B denotes a or B.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input and the second input, etc. are for distinguishing different inputs, rather than for describing a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, the words "exemplary" and "for example" are intended to present a concept in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of processing units means two or more processing units, and the like.
The embodiment of the invention provides a dynamic image shooting method and terminal equipment, which can receive a first input of a user on a shooting preview interface, respond to the first input, execute shooting operation according to a target shooting speed corresponding to an input parameter of the first input, and output target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times. Through the scheme, the target shooting speed can be determined according to the corresponding relation between the preset shooting speed and the input parameters, and if the input parameters input by the user are different, the target shooting speed is different, so that the shooting operation can be executed according to different shooting speeds (such as a fast shooting speed, a normal shooting speed and a slow shooting speed), and dynamic effects of dynamic images shot at different shooting speeds are rich, and the dynamic effects of the dynamic images shot by the terminal equipment can be improved.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present invention are not specifically limited.
Next, a software environment applied to the dynamic image shooting method provided by the embodiment of the present invention is described by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the dynamic image shooting method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the dynamic image shooting method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the dynamic image shooting method provided by the embodiment of the invention by running the software program in the android operating system.
The terminal equipment in the embodiment of the invention can be a mobile terminal or a non-mobile terminal. For example, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile terminal may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiment of the present invention is not limited in particular.
The execution subject of the dynamic image shooting method provided in the embodiment of the present invention may be the terminal device, or may also be a functional module and/or a functional entity capable of implementing the dynamic image shooting method in the terminal device, and may specifically be determined according to actual use requirements, which is not limited in the embodiment of the present invention. The following describes an exemplary moving image capturing method according to an embodiment of the present invention, taking a terminal device as an example.
As shown in fig. 2, an embodiment of the present invention provides a moving image photographing method, which may include steps 200 and 201 described below.
Step 200, the terminal equipment receives a first input of a user on a shooting preview interface.
Wherein the first input is used for triggering the shooting of the dynamic image.
In the embodiment of the invention, if a user needs to shoot a dynamic image (including a video or a dynamic photo), the user can trigger the terminal device to run the camera application, then trigger the terminal device to start the dynamic image shooting function of the camera and display the shooting preview interface for dynamic images, and then trigger the terminal device to shoot the dynamic image through the first input on the shooting preview interface.
Optionally, in the embodiment of the present invention, the first input of the user may be a sliding input, or may be an input in any other possible form, which may be determined specifically according to an actual use requirement, and the embodiment of the present invention is not limited.
For example, the first input may be a sliding input along a certain direction on the shooting preview interface, a sliding input over a specific distance, a sliding input tracing a specific graphic track, multiple sliding inputs, any other possible input, or any combination of the above, which may be determined according to actual use requirements; the embodiment of the present invention is not limited.
And step 201, the terminal device responds to the first input, executes shooting operation according to the target shooting double speed corresponding to the input parameter of the first input, and outputs target data.
Wherein the input parameter may include at least one of an input direction, an input distance, an input trajectory, and an input number.
In the embodiment of the present invention, if the user performs an input on the shooting preview interface, the terminal device may, in response to the input, determine the target shooting speed corresponding to the input parameter of the first input and shoot the dynamic image at that target shooting speed.
In the embodiment of the invention, the target shooting speed (shooting double speed) refers to the number of image frames that the terminal device acquires per unit time while shooting the dynamic image. Specifically, assume that a specific shooting speed (hereinafter referred to as the normal shooting speed) means that the terminal device acquires a specific number of image frames per unit time when shooting a dynamic image. If the terminal device acquires more image frames per unit time than this specific number while shooting the dynamic image, the target shooting speed is higher than the normal shooting speed and is called a fast shooting speed; if the terminal device acquires fewer image frames per unit time than this specific number, the target shooting speed is lower than the normal shooting speed and is called a slow shooting speed.
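As an illustration of this definition, the following Kotlin sketch expresses a shooting speed multiple as a frame-acquisition rate relative to an assumed normal rate; the 30 fps value and the helper names are illustrative assumptions, not figures taken from the patent.

```kotlin
// Minimal sketch of the shooting-speed definition above. The 30 fps normal
// rate and these helper names are assumptions for illustration only.
const val NORMAL_FPS = 30.0

// Frames acquired per second at a given shooting speed multiple:
// a fast speed (> 1.0) acquires more frames per unit time than normal,
// a slow speed (< 1.0) acquires fewer.
fun captureFps(speedMultiple: Double): Double = NORMAL_FPS * speedMultiple

// Interval between two acquired frames, in milliseconds.
fun frameIntervalMs(speedMultiple: Double): Double = 1000.0 / captureFps(speedMultiple)

fun main() {
    println(frameIntervalMs(1.5))  // about 22.2 ms per frame at ×1.5 speed
    println(frameIntervalMs(0.5))  // about 66.7 ms per frame at ×0.5 speed
}
```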
A specific implementation of the moving image capturing method according to the embodiment of the present invention will be described in detail below with respect to a case where the input parameter is an input trajectory (first embodiment below), and a case where the input parameter is an input direction and an input number (second embodiment and third embodiment below).
First embodiment
In the first embodiment, it is assumed that the input parameter of the first input is an input trajectory, in this case, the step 201 may be specifically realized by the step 201a described below.
Step 201a, the terminal device responds to the first input, and executes shooting operation according to the target shooting speed corresponding to the input track of the first input; wherein, different input trajectories correspond to different shooting speeds.
In the embodiment of the invention, the corresponding relation between the shooting speed and the input track can be stored in the terminal equipment in advance. Specifically, if the user inputs on the shooting preview interface, the terminal device may determine, in response to the input, a target shooting speed corresponding to the input trajectory according to the correspondence, and perform a shooting operation according to the shooting speed.
In the embodiment of the present invention, it is assumed that the input trajectory is a graphical trajectory input (for example, drawn) by a user on the shooting preview interface, and different graphical trajectories correspond to different shooting speeds. Illustratively, a triangular trajectory may correspond to ×1.5 speed and a circular trajectory may correspond to ×0.5 speed.
The following describes a specific implementation manner of the step 201a in detail with reference to fig. 3.
Assume that a triangular track corresponds to ×1.5 speed and a circular track corresponds to ×0.5 speed. The user can drag the shooting control on the shooting preview interface and input (for example, draw) different graphics on the shooting preview interface, so as to trigger the terminal device to perform the shooting operation at the shooting speed corresponding to the input graphic track.
As shown in (a) of fig. 3, if the user drags the shooting control to input a triangular track, the terminal device shoots the dynamic image at ×1.5 speed while the user is inputting the triangle. After the user completes the input and releases the shooting control, the shooting control automatically returns to its initial position before shooting.
Optionally, as shown in (a) of fig. 3, when the user inputs the triangular track, the terminal device may display the input track in the area where the user inputs it and indicate that the current shooting speed is ×1.5, so as to prompt the user that the dynamic image is currently shot at ×1.5 speed.
As shown in (b) of fig. 3, if the user drags the shooting control to input a circular track, the terminal device shoots the dynamic image at ×0.5 speed while the user is inputting the circle. After the user completes the input and releases the shooting control, the shooting control automatically returns to its initial position before shooting.
Optionally, as shown in (b) of fig. 3, when the user inputs the circular track, the terminal device may display the input track in the area where the user inputs it and indicate that the current shooting speed is ×0.5, so as to prompt the user that the dynamic image is currently shot at ×0.5 speed.
It should be noted that the triangular track and the circular track above are merely exemplary graphic tracks; it should be understood that the embodiment of the present invention includes but is not limited to these tracks, which may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
It should be noted that the correspondence of the triangular track to ×1.5 speed and of the circular track to ×0.5 speed is merely exemplary; it should be understood that the embodiment of the present invention includes but is not limited to this correspondence, which may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
In the embodiment of the invention, when a user shoots a dynamic image through the terminal equipment, the user can input the graphic track on the shooting preview interface so as to trigger the terminal equipment to execute shooting operation according to the shooting speed (such as a fast shooting speed, a normal shooting speed and a slow shooting speed) corresponding to the input graphic track. Therefore, the dynamic effect of the dynamic image obtained by shooting at different shooting speeds is rich, the dynamic effect of the dynamic image obtained by the terminal equipment can be improved, the shooting speed can be flexibly set, and the interest of shooting the dynamic image is increased.
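A possible shape of the track-to-speed lookup described in this first embodiment is sketched below in Kotlin. The Shape type, the recognizer stub and the ×1.5/×0.5 values are illustrative assumptions; the patent only requires that different input tracks correspond to different shooting speeds.

```kotlin
// Hedged sketch of the first embodiment's correspondence between input
// tracks and shooting speeds. Shape, recognizeTrack and the default values
// are assumptions, not part of the patent.
enum class Shape { TRIANGLE, CIRCLE, UNKNOWN }

// Pre-stored correspondence, e.g. triangular track -> ×1.5, circular track -> ×0.5.
val trackSpeedTable: Map<Shape, Double> = mapOf(
    Shape.TRIANGLE to 1.5,
    Shape.CIRCLE to 0.5
)

// Placeholder for a gesture classifier over the points the user drags the
// shooting control through; a real implementation would do shape recognition.
fun recognizeTrack(points: List<Pair<Float, Float>>): Shape = Shape.UNKNOWN

// Target shooting speed for the recognized track; falls back to the normal
// speed (×1.0) when the track matches no stored entry.
fun targetShootingSpeed(points: List<Pair<Float, Float>>): Double =
    trackSpeedTable[recognizeTrack(points)] ?: 1.0
```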
Second embodiment
In the second embodiment, it is assumed that the input parameters of the first input include an input direction and an input number, and the first input includes: a first sub-input entered in a first direction and a second sub-input entered in a second direction, in which case the above step 201 may be specifically realized by the steps 201b and 201c described below.
And step 201b, the terminal equipment responds to the first sub-input and executes shooting operation according to the first shooting double speed corresponding to the first sub-input.
The first shooting double speed is in direct proportion to the input times corresponding to the first sub-input.
And step 201c, the terminal equipment responds to the second sub-input and executes shooting operation according to the second shooting double speed corresponding to the second sub-input.
And the second shooting double speed is inversely proportional to the input times corresponding to the second sub-input.
In an embodiment of the present invention, the first direction and the second direction may be opposite, the number of inputs (e.g. sliding) corresponding to the first sub-input is proportional to the shooting speed, and the number of inputs (e.g. sliding) corresponding to the second sub-input is inversely proportional to the shooting speed. That is, the greater the number of inputs (e.g., sliding) corresponding to the first sub-input, the greater the shooting speed; the larger the number of inputs (e.g., sliding) corresponding to the second sub-input, the smaller the shooting speed.
The correspondence relationship between each sub-input and the shooting double speed in the first input will be described in detail below with reference to fig. 4. Assume that the first sub-input is a slide-up input on the capture preview interface and the second sub-input is a slide-down input on the capture preview interface.
As shown in (a) and (b) of fig. 4, an upward arrow indicates that the shooting speed can be increased, and a downward arrow indicates that the shooting speed can be decreased; one arrow represents one slide, and two arrows represent two consecutive slides. An upward slide of the user's finger means that the current shooting speed is higher than the normal speed: one upward slide increases the current shooting speed to 1.5 times the normal speed, and two consecutive upward slides increase it to 2 times the normal speed. A downward slide means that the current shooting speed is lower than the normal speed: one downward slide reduces the current shooting speed to 0.5 times the normal speed, and two consecutive downward slides reduce it to 0.25 times the normal speed.
As shown in (a) of fig. 4, assuming that the normal shooting speed is 1, if the user slides up once on the shooting preview interface, the current shooting speed increases to 1.5 times the normal shooting speed, denoted ×1.5, and the terminal device shoots the dynamic image at ×1.5 speed. If the user slides up twice consecutively on the shooting preview interface, the current shooting speed increases to 2 times the normal shooting speed, denoted ×2, and the terminal device shoots the dynamic image at ×2 speed. By analogy, if the user slides up m times (m greater than 1) on the shooting preview interface, the current shooting speed increases to m times the normal shooting speed, and the terminal device shoots the dynamic image at ×m speed.
Optionally, as shown in (a) of fig. 4, when the user slides up on the shooting preview interface, the terminal device may display the current shooting speed, such as ×1.5 or ×2, on the shooting preview interface to prompt the user that the dynamic image is currently shot at ×1.5 or ×2 speed.
Accordingly, as shown in (b) of fig. 4, assuming that the normal shooting speed is 1, if the user slides down once on the shooting preview interface, the current shooting speed is reduced to 0.5 times the normal shooting speed, denoted ×0.5, and the terminal device shoots the dynamic image at ×0.5 speed. If the user slides down twice consecutively on the shooting preview interface, the current shooting speed is reduced to 0.25 times the normal shooting speed, denoted ×0.25, and the terminal device shoots the dynamic image at ×0.25 speed. By analogy, the more times the user slides down on the shooting preview interface, the further the current shooting speed is reduced to a fraction n of the normal shooting speed (n greater than 0 and less than 1), and the terminal device shoots the dynamic image at ×n speed.
Optionally, as shown in (b) of fig. 4, when the user slides down on the shooting preview interface, the terminal device may display the current shooting speed, such as ×0.5 or ×0.25, on the shooting preview interface to prompt the user that the dynamic image is currently shot at ×0.5 or ×0.25 speed.
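The mapping walked through above for fig. 4 could be expressed as in the following sketch; the concrete table (one upward slide giving ×1.5, two giving ×2, one downward slide giving ×0.5, two giving ×0.25) mirrors the figure and is an assumption, as are the type and function names.

```kotlin
// Hedged sketch of the second embodiment: the shooting speed grows with the
// number of upward slides and shrinks with the number of downward slides.
enum class SlideDirection { UP, DOWN }

fun shootingSpeed(direction: SlideDirection, slideCount: Int): Double =
    when (direction) {
        // First sub-input: more upward slides -> higher speed (×1.5, ×2, ... ×m).
        SlideDirection.UP -> if (slideCount == 1) 1.5 else slideCount.toDouble()
        // Second sub-input: more downward slides -> lower speed (×0.5, ×0.25, ...).
        SlideDirection.DOWN -> 1.0 / (2.0 * slideCount)
    }

fun main() {
    println(shootingSpeed(SlideDirection.UP, 2))    // 2.0
    println(shootingSpeed(SlideDirection.DOWN, 2))  // 0.25
}
```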
It should be noted that, the sliding directions of the first sub input and the second sub input are upward sliding and downward sliding, respectively, which are taken as an exemplary illustration, and it is understood that the embodiment of the present invention includes but is not limited to the sliding directions, which can be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
It should be noted that, the above description is made by taking the case where the shooting speed is increased to 1.5 times, 2 times, etc. of the normal shooting speed or the case where the shooting speed is decreased to 0.5 times, 0.25 times, etc. of the normal shooting speed as an example, it is understood that the embodiment of the present invention includes, but is not limited to, the relationship between the shooting speed and the normal shooting speed, and may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
In the embodiment of the invention, when a user shoots a dynamic image through the terminal device, the user can slide along the upward direction or the downward direction on the shooting preview interface so as to trigger the terminal device to execute shooting operation according to the corresponding preset multiple speed (such as a fast shooting multiple speed, a normal shooting multiple speed and a slow shooting multiple speed). Therefore, the dynamic effect of the dynamic image obtained by shooting at different shooting speeds is rich, the dynamic effect of the dynamic image obtained by the terminal equipment can be improved, the interestingness of shooting the dynamic image is increased, and the operation is convenient and fast.
Third embodiment
In the third embodiment, it is assumed that the input parameters include an input direction and an input number, and the first input is a third sub-input in a third direction and a fourth sub-input in a fourth direction, in this case, the step 201 described above may be specifically implemented by the step 201d and the step 201e described below.
Step 201d, the terminal device, in response to the third sub-input, shoots for a first duration at a third shooting speed corresponding to the third sub-input and outputs first sub-data.
Step 201e, the terminal device, in response to the fourth sub-input, shoots for a second duration at a fourth shooting speed corresponding to the fourth sub-input and outputs second sub-data.
The target data comprises the first sub-data and the second sub-data. The target shooting speed includes the third shooting speed corresponding to the third sub-input and the fourth shooting speed corresponding to the fourth sub-input.
In this embodiment of the present invention, a first ratio of the first duration to the second duration is equal to a second ratio of the number of inputs (e.g., slides) corresponding to the third sub-input to the number of inputs (e.g., slides) corresponding to the fourth sub-input. That is, the more inputs (e.g., slides) a sub-input comprises, the longer the corresponding shooting duration.
In the embodiment of the present invention, the number of times the user's finger inputs (e.g., slides) in the third direction and the number of times it inputs (e.g., slides) in the fourth direction represent the proportion of time each speed occupies in the whole dynamic image, and the order of the finger inputs (e.g., slides) represents the order in which the different speeds are used when shooting the dynamic image.
In the embodiment of the present invention, the third shooting speed corresponding to the third sub-input and the fourth shooting speed corresponding to the fourth sub-input are determined. Assume that the third shooting speed is × 0.5 speed and the fourth shooting speed is × 1.5 speed. Assuming that the third sub-input may be a slide-to-left input on the capture preview interface and the fourth sub-input may be a slide-to-right input on the capture preview interface, the correspondence relationship between each sub-input and the capture time length in the first input will be described in detail below with reference to fig. 5.
As shown in fig. 5, the user slides twice consecutively to the right and then slides once to the left, indicating that, in the whole dynamic image, the shooting duration at the fourth shooting speed (×1.5, corresponding to the rightward slide input) occupies two thirds of the total shooting duration and the shooting duration at the third shooting speed (×0.5, corresponding to the leftward slide input) occupies one third of the total shooting duration. Because the user performs the rightward slide input first and the leftward slide input afterwards, the first two thirds of the dynamic image are shot at ×1.5 speed and the last third is shot at ×0.5 speed.
Alternatively, as shown in fig. 5, when the user performs a leftward slide input or a rightward slide input on the shooting preview interface, the terminal device may display the current shooting speed at × 0.5 speed or × 1.5 speed in the area where the user inputs, so as to prompt the user that the moving image is currently shot at × 0.5 speed or × 1.5 speed.
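Following the fig. 5 example, the split of the total shooting time between the two speeds could be computed as sketched below; the Segment type, the six-second total duration and the ordering convention of the slide list are assumptions for illustration.

```kotlin
// Hedged sketch of the third embodiment's mixed-speed timing: each speed's
// share of the total shooting duration equals its share of the slide inputs,
// and segments follow the order in which the slides were made.
data class Segment(val speedMultiple: Double, val durationMs: Long)

// slides: ordered (speedMultiple, slideCount) pairs, e.g. two rightward
// slides at ×1.5 followed by one leftward slide at ×0.5.
fun planSegments(slides: List<Pair<Double, Int>>, totalDurationMs: Long): List<Segment> {
    val totalSlides = slides.sumOf { it.second }
    return slides.map { (speed, count) ->
        Segment(speed, totalDurationMs * count / totalSlides)
    }
}

fun main() {
    // Fig. 5 case over an assumed 6-second capture: 4 s at ×1.5, then 2 s at ×0.5.
    println(planSegments(listOf(1.5 to 2, 0.5 to 1), totalDurationMs = 6_000))
}
```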
It should be noted that, the sliding directions of the third sub input and the fourth sub input are sliding leftward and sliding rightward, respectively, which are taken as an exemplary illustration, and it is understood that the embodiment of the present invention includes but is not limited to the sliding directions, which may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
It should be noted that, the third shooting speed is × 0.5 speed, and the fourth shooting speed is × 1.5 speed, which are exemplary descriptions, and it is understood that the embodiments of the present invention include, but are not limited to, the shooting speeds described above, which may be determined according to actual use requirements, and the embodiments of the present invention are not limited thereto.
In the embodiment of the invention, when shooting the dynamic image through the terminal equipment, the user can slide in the left and right directions to trigger the terminal equipment to shoot the mixed multiple speed dynamic image, namely, the shot dynamic image has both the fast multiple speed image and the slow multiple speed image. Therefore, dynamic effects presented by dynamic images obtained by shooting at different shooting speeds are rich, and the dynamic effects presented by the dynamic images acquired by the terminal equipment can be improved.
Optionally, in the embodiment of the present invention, the step of outputting the target data in step 201 may be specifically implemented by step 201f described below.
Step 201f, the terminal device outputs the target data after a preset duration has elapsed since the first input was received.
Optionally, in the embodiment of the present invention, the preset time period may be 5 seconds, may also be 8 seconds, and may also be any other possible time period, which may be determined specifically according to an actual use requirement, and the embodiment of the present invention is not limited.
In the embodiment of the present invention, after receiving the first input of the user, the terminal device may, in response to the first input, perform the shooting operation at the target shooting speed corresponding to the input parameter of the first input, and after the preset duration has elapsed since the first input was received, automatically save the dynamic image (i.e., the target data) and output it. This improves the convenience of operating the terminal device when shooting a dynamic image.
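A minimal sketch of this auto-output behaviour follows, using a plain JVM timer rather than an Android-specific mechanism; the 5-second preset and the callback name are assumptions, and an Android implementation would more likely use a Handler or a coroutine bound to the capture session.

```kotlin
// Hedged sketch of step 201f: output the target data automatically once a
// preset duration has elapsed after the first input.
import java.util.Timer
import kotlin.concurrent.schedule

const val PRESET_DURATION_MS = 5_000L  // e.g. 5 seconds after the first input

fun startCapture(onOutput: () -> Unit) {
    // ... shooting at the target shooting speed would start here ...
    Timer().schedule(PRESET_DURATION_MS) {
        // Preset duration elapsed: save the dynamic image (target data) and output it.
        onOutput()
    }
}
```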
In the above embodiments of the present invention, when shooting a dynamic image, the terminal device may shoot the dynamic image at multiple speeds in response to the input of the user. The shooting speed can be faster or slower than the normal speed. If the shooting speed is greater than ×1 (for example ×1.5 or ×2), the shot dynamic image plays back faster than a normal dynamic image when previewed; if the shooting speed is less than ×1 (for example ×0.5 or ×0.25), the shot dynamic image plays back slower than a normal dynamic image when previewed. Therefore, a multi-speed shooting effect is added to the process of shooting dynamic images, which increases the interest of shooting dynamic images.
The dynamic image shooting method provided by the embodiment of the invention can receive a first input of a user on a shooting preview interface, respond to the first input, execute shooting operation according to a target shooting double speed corresponding to an input parameter of the first input, and output target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times. Through the scheme, the target shooting speed can be determined according to the corresponding relation between the preset shooting speed and the input parameters, and if the input parameters input by the user are different, the target shooting speed is different, so that the shooting operation can be executed according to different shooting speeds (such as a fast shooting speed, a normal shooting speed and a slow shooting speed), and dynamic effects of dynamic images shot at different shooting speeds are rich, and the dynamic effects of the dynamic images shot by the terminal equipment can be improved.
Optionally, with reference to fig. 2, as shown in fig. 6, before the step 200, the method for capturing a dynamic image according to the embodiment of the present invention further includes the following steps 202 to 206.
Step 202, the terminal device receives a second input of the user on the shooting preview interface.
And the second input is used for calling out a first sub-interface in which different shooting speeds are displayed.
And step 203, the terminal device responds to the second input, and displays a first sub-interface on the shooting preview interface, wherein the first sub-interface comprises at least one area, and each area indicates different shooting speed.
And step 204, the terminal equipment receives a third input of the user on the shooting preview interface.
Wherein the third input is used for setting the corresponding relation between the input parameters of the second input and the shooting speed.
And step 205, the terminal device responds to the third input and displays the input parameters of the second input in the target area in the first sub-interface.
And step 206, the terminal device responds to the third input and stores the corresponding relation between the input parameters and the shooting speed indicated by the target area.
It should be noted that the embodiment of the present invention does not limit the execution order of step 205 and step 206. That is, in the embodiment of the present invention, step 205 may be executed first and then step 206; step 206 may be executed first and then step 205; or step 205 and step 206 may be executed simultaneously. Fig. 6 illustrates the case in which step 205 is executed first and then step 206.
The following describes a specific implementation process of the above steps 202 to 206 exemplarily with reference to fig. 7, 8 and 9.
As shown in (a) of fig. 7, on the shooting preview interface for shooting a dynamic image, the user pulls or slides a finger downward from the top of the terminal device. As shown in (b) of fig. 7, when the finger slides down to a certain position, the terminal device may display a pull-down menu 70 for custom graphics, in which "+" indicates a shooting speed greater than the normal speed, "-" indicates a shooting speed less than the normal speed, and the numerals 2.0, 1.5, 0.25 and 0.5 indicate the multiple of the normal speed.
As shown in fig. 8 (a), if the user clicks an area where the speed is +1.5 times in the pull-down menu 70, the terminal device may display a custom drawing interface 71 on the preview screen. The user may enter (e.g., draw) graphics, such as triangles, on the custom drawing interface 71. As shown in (b) in fig. 8, after the input is completed and the finger leaves the screen, the terminal device displays a triangle in the area where the +1.5 times speed is located in the pull-down menu 70, which indicates that the user has successfully set the corresponding relationship between the +1.5 times speed and the triangle track.
Similarly, if the user clicks the area of the pull-down menu 70 where the speed is -0.5 times, the terminal device may display the custom drawing interface 71 on the preview screen. The user may input (e.g., draw) a graphic, such as a circle, on the custom drawing interface 71. As shown in (b) of fig. 8, after the input is completed and the finger leaves the screen, the terminal device displays a circle in the area of the pull-down menu 70 where the -0.5 times speed is located, indicating that the user has successfully set the corresponding relationship between the -0.5 times speed and the circular track.
As shown in (a) of fig. 9, when the user drags the photographing control to input a triangular track on the photographing preview interface, the terminal device may display the input triangular track in an area where the user input is located and display that the current photographing speed is × 1.5 speed, and the terminal device highlights a first area 72 (including +1.5 and triangles) in the pull-down menu 70 to prompt the user that the dynamic image is currently photographed at × 1.5 speed through the triangular track input.
As shown in (b) of fig. 9, when the user drags the photographing control to input a circular trajectory on the photographing preview interface, the terminal device may display the input circular trajectory in an area where the user input is located and display that the current photographing speed is x 0.5 speed, and the terminal device highlights a second area 73 (including-0.5 and a circle) in the pull-down menu 70 to prompt the user that the moving image is currently photographed at x 0.5 speed through the circular trajectory input.
In the above embodiment of the present invention, before shooting the dynamic image, the terminal device may set the corresponding relationship between the shooting speed and the preset input in response to the input of the user, and further, the user may trigger the terminal device to shoot the dynamic image at different speeds with different inputs. Therefore, the dynamic effect presented by the dynamic image obtained by shooting at different shooting speeds is rich, and the dynamic effect presented by the dynamic image obtained by the terminal equipment can be improved.
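One way the correspondence set up in steps 202 to 206 might be persisted is sketched below using Android SharedPreferences; the preference file name, the key format and the use of a string track identifier are assumptions, not details from the patent.

```kotlin
// Hedged sketch of persisting a user-defined track-to-speed correspondence.
// File name, key format and the track identifier are illustrative assumptions.
import android.content.Context

fun saveTrackSpeedMapping(context: Context, trackId: String, speedMultiple: Float) {
    context.getSharedPreferences("shooting_speed_tracks", Context.MODE_PRIVATE)
        .edit()
        .putFloat("track_$trackId", speedMultiple)  // e.g. "track_triangle" -> 1.5f
        .apply()
}

fun loadTrackSpeedMapping(context: Context, trackId: String): Float =
    context.getSharedPreferences("shooting_speed_tracks", Context.MODE_PRIVATE)
        .getFloat("track_$trackId", 1.0f)           // default: normal shooting speed
```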
Optionally, as shown in fig. 10 in conjunction with fig. 6, after the step 206 and before the step 200, the method for capturing a dynamic image according to the embodiment of the present invention further includes the following steps 207 and 208.
And step 207, the terminal equipment receives a fourth input of the user.
Wherein the fourth input is for displaying a prompt message.
Optionally, in the embodiment of the present invention, the fourth input of the user may be a sliding input, a click input (for example, a single-click or double-click input), or an input in any other possible form, which may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto.
For example, in the moving image shooting interface, a user may start a downward sliding input of a finger from a top edge of the terminal device, and when the finger slides down to a certain position, the terminal device may display a pull-down menu, which may include a shooting double speed and preset input information corresponding to the shooting double speed.
And step 208, the terminal device responds to the fourth input, and displays prompt information on the shooting preview interface, wherein the prompt information comprises at least one shooting speed and input parameters corresponding to each shooting speed.
For example, in an embodiment of the present invention, the prompt information may include at least one shooting speed and a preset input track (e.g., a graphic track) corresponding to each shooting speed. The user can refer to the prompt information and input one of the preset input tracks on the shooting preview interface to trigger the terminal equipment to shoot the dynamic image at the shooting double-speed corresponding to the preset input track.
Further exemplarily, in an embodiment of the present invention, the prompt information may include at least one shooting double speed and an input direction (e.g., a sliding direction) corresponding to each shooting double speed. The user can refer to the prompt information, and the terminal equipment is triggered to shoot the dynamic image at the shooting double-speed corresponding to the sliding input by sliding input in the preset direction on the shooting preview interface.
In the embodiment of the invention, when the dynamic image is shot, the terminal equipment can respond to the input of the user and display the prompt information on the shooting preview interface so as to prompt the user to trigger the terminal equipment to shoot the dynamic image according to different speeds through different inputs, thereby improving the flexibility and convenience of shooting the dynamic image by the user. Therefore, the dynamic effect presented by the dynamic image obtained by shooting at different shooting speeds is rich, and the dynamic effect presented by the dynamic image obtained by the terminal equipment can be improved.
For convenience of description, the embodiments of the present invention take the shooting of a dynamic photo as an example to describe the dynamic image shooting method provided in the embodiments of the present invention. For a description of shooting a video, reference may be made to the description of shooting a dynamic photo, and details are not repeated here.
As shown in fig. 11, an embodiment of the present invention provides a terminal device, where the terminal device 700 may include a receiving module 701 and a shooting module 702. The receiving module 701 is configured to receive a first input of a user on a shooting preview interface. The shooting module 702 is configured to, in response to the first input received by the receiving module 701, execute a shooting operation according to the target shooting double speed corresponding to the input parameter of the first input, and output target data. Wherein the input parameter may include at least one of an input direction, an input distance, an input trajectory, and an input number.
Optionally, with reference to fig. 11, as shown in fig. 12, the terminal device provided in the embodiment of the present invention may further include a display module 703 and a storage module 704. The receiving module 701 is further configured to receive a second input of the user on the shooting preview interface before receiving the first input of the user on the shooting preview interface. The display module 703 is configured to display a first sub-interface on the shooting preview interface in response to the second input received by the receiving module 701, where the first sub-interface includes at least one region, and each region indicates a different shooting speed. The receiving module 701 is further configured to receive a third input of the user on the shooting preview interface, where the third input is used to set a corresponding relationship between an input parameter of the second input and a shooting speed. The display module 703 is further configured to display the input parameters of the second input in the target area in the first sub-interface in response to the third input received by the receiving module 701. The storage module 704 is configured to store a corresponding relationship between the input parameter and the shooting double speed indicated by the target area in response to the third input received by the receiving module 701.
Optionally, in this embodiment of the present invention, the receiving module 701 is further configured to receive a fourth input of the user after the storage module 704 stores the corresponding relationship between the input parameter and the shooting double speed indicated by the target area, and before the receiving module 701 receives the first input of the user on the shooting preview interface, where the fourth input is used to display the prompt information. The display module 703 is further configured to display the prompt information on the shooting preview interface in response to the fourth input received by the receiving module 701, where the prompt information may include at least one shooting double speed and an input parameter corresponding to each shooting double speed.
Optionally, in an embodiment of the present invention, the input parameter is an input trajectory. In this case, the shooting module 702 is specifically configured to, in response to the first input, execute a shooting operation at a target shooting double speed corresponding to the input trajectory of the first input. Wherein, different input trajectories correspond to different shooting speeds.
Optionally, in this embodiment of the present invention, it is assumed that the input parameters include an input direction and an input number, and the first input includes a first sub-input along a first direction and a second sub-input along a second direction, in this case, the capturing module 702 is specifically configured to respond to the first sub-input and perform a capturing operation at a first capturing double speed corresponding to the first sub-input, where the first capturing double speed is proportional to the input number corresponding to the first sub-input. The shooting module 702 is further specifically configured to, in response to the second sub-input, execute a shooting operation at a second shooting double speed corresponding to the second sub-input, where the second shooting double speed is inversely proportional to the number of inputs corresponding to the second sub-input.
Optionally, in this embodiment of the present invention, it is assumed that the input parameters include an input direction and an input number of times, and the first input includes a third sub-input along a third direction and a fourth sub-input along a fourth direction. In this case, the shooting module 702 is specifically configured to, in response to the third sub-input, shoot for a first duration at a third shooting speed corresponding to the third sub-input and output first sub-data. The shooting module 702 is further specifically configured to, in response to the fourth sub-input, shoot for a second duration at a fourth shooting speed corresponding to the fourth sub-input and output second sub-data. The target data comprises the first sub-data and the second sub-data; a first ratio of the first duration to the second duration is equal to a second ratio of the input times corresponding to the third sub-input to the input times corresponding to the fourth sub-input.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and is not described here again to avoid repetition.
The terminal device provided by the embodiment of the invention can receive a first input of a user on a shooting preview interface, respond to the first input, execute shooting operation according to a target shooting double speed corresponding to an input parameter of the first input, and output target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times. Through the scheme, the target shooting speed can be determined according to the corresponding relation between the preset shooting speed and the input parameters, and if the input parameters input by the user are different, the target shooting speed is different, so that the shooting operation can be executed according to different shooting speeds (such as a fast shooting speed, a normal shooting speed and a slow shooting speed), and dynamic effects of dynamic images shot at different shooting speeds are rich, and the dynamic effects of the dynamic images shot by the terminal equipment can be improved.
Fig. 13 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention. As shown in fig. 13, the terminal device 800 includes but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 13 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A user input unit 807 for receiving a first input of a user on the shooting preview interface; a processor 810 for performing a photographing operation at a target photographing double speed corresponding to an input parameter of the first input in response to the first input received by the user input unit 807, and outputting target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times.
The embodiment of the invention provides terminal equipment, which can receive a first input of a user on a shooting preview interface, respond to the first input, execute shooting operation according to a target shooting double speed corresponding to an input parameter of the first input, and output target data; wherein the input parameters comprise at least one of input direction, input distance, input track and input times. Through the scheme, the target shooting speed can be determined according to the corresponding relation between the preset shooting speed and the input parameters, and if the input parameters input by the user are different, the target shooting speed is different, so that the shooting operation can be executed according to different shooting speeds (such as a fast shooting speed, a normal shooting speed and a slow shooting speed), and dynamic effects of dynamic images shot at different shooting speeds are rich, and the dynamic effects of the dynamic images shot by the terminal equipment can be improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used to receive and send signals during message transmission and reception or during a call. Specifically, it receives downlink data from a base station and forwards the data to the processor 810 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The terminal device 800 provides the user with wireless broadband internet access through the network module 802, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the terminal apparatus 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used to receive audio or video signals. The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042. The graphics processor 8041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or another storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data; in a phone call mode, the processed audio data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 801 and then output.
The terminal device 800 also includes at least one sensor 805, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 8061 and/or the backlight when the terminal device 800 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used for identifying the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer and tapping) and the like; the sensors 805 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
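As an illustration of how the accelerometer readings described above are typically obtained on an Android-style terminal, the sketch below registers a sensor listener and compares the gravity components on the x and y axes to distinguish portrait from landscape. This is ordinary platform usage with illustrative class names and thresholds, not a feature specific to this embodiment.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

/** Sketch of reading the three-axis accelerometer to infer screen orientation. */
class OrientationWatcher(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        val accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]   // gravity component along the x axis
        val y = event.values[1]   // gravity component along the y axis
        val landscape = abs(x) > abs(y)
        // switch the preview layout here if the orientation has changed
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```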
The display unit 806 is used to display information input by the user or information provided to the user. The display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 807 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, can collect touch operations performed by a user on or near it (for example, operations performed with a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 810, and receives and executes commands from the processor 810. In addition, the touch panel 8071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 8071, the user input unit 807 may also include other input devices 8072. In particular, the other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 8071 can be overlaid on the display panel 8061, and when the touch panel 8071 detects a touch operation on or near the touch panel 8071, the touch operation is transmitted to the processor 810 to determine the type of the touch event, and then the processor 810 provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 13, the touch panel 8071 and the display panel 8061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the terminal device, and this is not limited herein.
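The flow described above (touch detection, coordinate conversion, delivery to the processor) is what ultimately yields the input direction and number of inputs used to select the shooting speed. The following Kotlin sketch shows one hypothetical way to reduce raw touch events to swipe directions; the threshold value and the callback are illustrative assumptions, and a production implementation would more likely rely on the platform gesture detector.

```kotlin
import android.view.MotionEvent
import kotlin.math.abs

/**
 * Sketch of turning raw touch events into swipe directions, which can then
 * be counted and fed into the speed-correspondence lookup sketched earlier.
 */
class SwipeCollector(private val onSwipe: (direction: String) -> Unit) {
    private var downX = 0f
    private var downY = 0f
    private val threshold = 100f   // minimum travel in pixels to count as a swipe

    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { downX = event.x; downY = event.y }
            MotionEvent.ACTION_UP -> {
                val dx = event.x - downX
                val dy = event.y - downY
                when {
                    dx >  threshold && abs(dx) > abs(dy) -> onSwipe("RIGHT")
                    dx < -threshold && abs(dx) > abs(dy) -> onSwipe("LEFT")
                    dy >  threshold                      -> onSwipe("DOWN")
                    dy < -threshold                      -> onSwipe("UP")
                }
            }
        }
        return true
    }
}
```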
The interface unit 808 is an interface for connecting an external device to the terminal apparatus 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 800 or may be used to transmit data between the terminal apparatus 800 and an external device.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the terminal device (such as audio data and a phonebook). In addition, the memory 809 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 810 is a control center of the terminal device, connects various parts of the whole terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby performing overall monitoring of the terminal device. Processor 810 may include one or more processing units; optionally, the processor 810 may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 810.
The terminal device 800 may also include a power supply 811 (such as a battery) for supplying power to the components. Optionally, the power supply 811 may be logically connected to the processor 810 through a power management system, so that charging, discharging, and power consumption are managed through the power management system.
In addition, the terminal device 800 includes some functional modules that are not shown, and are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, which includes a processor 810 as shown in fig. 13, a memory 809, and a computer program that is stored in the memory 809 and is executable on the processor 810, and when the computer program is executed by the processor 810, the computer program implements each process of the above-mentioned embodiment of the dynamic image shooting method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above dynamic image shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may include a read-only memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method disclosed in the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. A dynamic image shooting method, applied to a terminal device, characterized by comprising:
receiving a first input of a user on a shooting preview interface;
in response to the first input, performing a shooting operation at a target shooting speed corresponding to an input parameter of the first input, and outputting target data;
wherein the input parameter comprises at least one of an input direction, an input distance, an input trajectory and a number of inputs;
in a case where the input parameter comprises an input direction and a number of inputs, the first input comprises: a third sub-input entered in a third direction and a fourth sub-input entered in a fourth direction;
the responding to the first input, performing a shooting operation at the target shooting speed corresponding to the input parameter of the first input, and outputting target data comprises:
in response to the third sub-input, shooting for a first duration at a third shooting speed corresponding to the third sub-input, and outputting first sub-data;
in response to the fourth sub-input, shooting for a second duration at a fourth shooting speed corresponding to the fourth sub-input, and outputting second sub-data;
wherein the target data comprises the first sub-data and the second sub-data; a first ratio is the ratio of the first duration to the second duration, and a second ratio is the ratio of the number of inputs of the third sub-input to the number of inputs of the fourth sub-input.
2. The method of claim 1, wherein before receiving the first input of the user on the shooting preview interface, the method further comprises:
receiving a second input of the user on the shooting preview interface;
displaying a first sub-interface on the shooting preview interface in response to the second input, the first sub-interface comprising at least one region, each region indicating a different shooting speed;
receiving a third input of the user on the shooting preview interface, wherein the third input is used for setting a correspondence between an input parameter of the second input and a shooting speed;
and in response to the third input, displaying the input parameter of the second input in a target area in the first sub-interface, and storing the correspondence between the input parameter and the shooting speed indicated by the target area.
3. The method of claim 2, wherein after storing the correspondence between the input parameter and the shooting speed indicated by the target area and before receiving the first input of the user on the shooting preview interface, the method further comprises:
receiving a fourth input of the user, wherein the fourth input is used for displaying prompt information;
and in response to the fourth input, displaying the prompt information on the shooting preview interface, wherein the prompt information comprises at least one shooting speed and an input parameter corresponding to each shooting speed.
4. The method of any one of claims 1 to 3, wherein the input parameter is an input trajectory;
the responding to the first input and performing a shooting operation at the target shooting speed corresponding to the input parameter of the first input comprises:
in response to the first input, performing a shooting operation at the target shooting speed corresponding to the input trajectory of the first input;
wherein different input trajectories correspond to different shooting speeds.
5. The method according to any one of claims 1 to 3, wherein the input parameter comprises an input direction and a number of inputs;
the first input comprises: a first sub-input entered in a first direction and a second sub-input entered in a second direction;
the responding to the first input and performing a shooting operation at the target shooting speed corresponding to the input parameter of the first input comprises:
in response to the first sub-input, performing a shooting operation at a first shooting speed corresponding to the first sub-input, wherein the first shooting speed is proportional to the number of inputs of the first sub-input;
and in response to the second sub-input, performing a shooting operation at a second shooting speed corresponding to the second sub-input, wherein the second shooting speed is inversely proportional to the number of inputs of the second sub-input.
6. A terminal device, characterized by comprising a receiving module and a shooting module;
the receiving module is used for receiving a first input of a user on a shooting preview interface;
the shooting module is configured to, in response to the first input received by the receiving module, perform a shooting operation at a target shooting speed corresponding to an input parameter of the first input, and output target data;
wherein the input parameter comprises at least one of an input direction, an input distance, an input trajectory and a number of inputs;
in a case where the input parameter comprises an input direction and a number of inputs, the first input comprises: a third sub-input entered in a third direction and a fourth sub-input entered in a fourth direction;
the shooting module is specifically configured to, in response to the third sub-input, shoot for a first duration at a third shooting speed corresponding to the third sub-input, and output first sub-data;
the shooting module is further specifically configured to, in response to the fourth sub-input, shoot for a second duration at a fourth shooting speed corresponding to the fourth sub-input, and output second sub-data;
wherein the target data comprises the first sub-data and the second sub-data; a first ratio is the ratio of the first duration to the second duration, and a second ratio is the ratio of the number of inputs of the third sub-input to the number of inputs of the fourth sub-input.
7. The terminal device according to claim 6, wherein the terminal device further comprises a display module and a storage module;
the receiving module is further used for receiving a second input of the user on the shooting preview interface before receiving a first input of the user on the shooting preview interface;
the display module is configured to display a first sub-interface on the shooting preview interface in response to the second input received by the receiving module, where the first sub-interface includes at least one region, and each region indicates a different shooting speed;
the receiving module is further configured to receive a third input of the user on the shooting preview interface, wherein the third input is used for setting a correspondence between an input parameter of the second input and a shooting speed;
the display module is further configured to display the input parameter of the second input in a target area in the first sub-interface in response to the third input received by the receiving module;
the storage module is configured to store, in response to the third input received by the receiving module, the correspondence between the input parameter and the shooting speed indicated by the target area.
8. The terminal device according to claim 6, wherein the receiving module is further configured to receive a fourth input of the user after the storage module stores the correspondence between the input parameter and the shooting speed indicated by the target area and before the receiving module receives the first input of the user on the shooting preview interface, wherein the fourth input is used for displaying prompt information;
the display module is further configured to display the prompt information on the shooting preview interface in response to the fourth input received by the receiving module, where the prompt information includes at least one shooting speed and an input parameter corresponding to each shooting speed.
9. The terminal device according to any one of claims 6 to 8, wherein the input parameter is an input trajectory;
the shooting module is specifically configured to, in response to the first input, perform a shooting operation at the target shooting speed corresponding to the input trajectory of the first input;
wherein different input trajectories correspond to different shooting speeds.
10. The terminal device according to any one of claims 6 to 8, wherein the input parameter comprises an input direction and a number of inputs;
the first input comprises: a first sub-input entered in a first direction and a second sub-input entered in a second direction;
the shooting module is specifically configured to, in response to the first sub-input, perform a shooting operation at a first shooting speed corresponding to the first sub-input, wherein the first shooting speed is proportional to the number of inputs of the first sub-input;
the shooting module is further specifically configured to, in response to the second sub-input, perform a shooting operation at a second shooting speed corresponding to the second sub-input, wherein the second shooting speed is inversely proportional to the number of inputs of the second sub-input.
11. A terminal device, characterized by comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the dynamic image shooting method according to any one of claims 1 to 5.
12. A computer-readable storage medium, characterized in that a computer program is stored thereon, wherein the computer program, when executed by a processor, implements the steps of the dynamic image shooting method according to any one of claims 1 to 5.
GR01 Patent grant