CN108260013B - Video playing control method and terminal

Info

Publication number: CN108260013B
Application number: CN201810264914.7A
Authority: CN (China)
Prior art keywords: control, input, video, target, display area
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN108260013A
Inventor: 崔晓东
Current Assignee: Vivo Mobile Communication Co Ltd
Original Assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810264914.7A
Publication of CN108260013A
Application granted; publication of CN108260013B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Abstract

The invention provides a video playing control method and a terminal. The video playing control method comprises the following steps: receiving a first input of a user; displaying a target control in response to the first input; receiving a second input of the user on the target control; and, in response to the second input, executing a video playing control operation corresponding to the second input on a currently played target video. According to the scheme of the invention, when play control is performed on the currently played video, the user only needs to operate the target control that is triggered and displayed by the first input, so that the currently played video can be controlled conveniently and quickly within a small area, which simplifies the operation and saves operation time.

Description

Video playing control method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a video playing control method and a terminal.
Background
With the development of communication technology, terminals such as smart phones have become essential tools in people's lives, and such terminals offer more and more functions, such as social chat, music playing and video playing. During idle time, such as commuting or leisure and entertainment time, people usually spend a large portion of their time watching various videos, and while watching a video the user often holds the terminal with one hand.
Currently, control of video playing, such as volume control and fast-forward control, is mainly implemented by manipulating physical buttons on the side of the terminal or virtual buttons on the video playing interface. Therefore, when the user is holding the terminal with one hand, he or she generally has to use the other hand to control video playing, or move the hand holding the terminal over a large displacement to do so, which makes the operation inconvenient.
Disclosure of Invention
The embodiment of the invention provides a video playing control method and a terminal, aiming at solving the problem that the existing video playing control method is inconvenient to operate.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a video playing control method, which is applied to a terminal, and includes:
receiving a first input of a user;
displaying a target control in response to the first input;
receiving a second input of the user on the target control;
and responding to the second input, and executing video playing control operation corresponding to the second input on the currently played target video.
In a second aspect, an embodiment of the present invention further provides a terminal, including:
the first receiving module is used for receiving a first input of a user;
a first display module to display a target control in response to the first input;
the second receiving module is used for receiving a second input of the user on the target control;
and the execution module is used for responding to the second input and executing video playing control operation corresponding to the second input on the currently played target video.
In a third aspect, an embodiment of the present invention further provides a terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the video playback control method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the video playback control method.
In the embodiment of the invention, a first input of a user is received; a target control is displayed in response to the first input; a second input of the user on the target control is received; and, in response to the second input, a video playing control operation corresponding to the second input is executed on the currently played target video. When play control is performed on the currently played video, the user only needs to operate the target control triggered and displayed by the first input, so that the currently played video can be controlled conveniently and quickly within a small area, which simplifies the operation and saves operation time.
Drawings
Fig. 1 is a flowchart of a video playing control method according to an embodiment of the present invention;
FIG. 2 is one of the schematic diagrams of generating a target control based on a first input from a user according to an embodiment of the present invention;
FIG. 3 is a second illustration of a target control generated based on a first input from a user according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating control of fast forwarding of video playing according to an embodiment of the present invention;
Fig. 5 is a schematic diagram illustrating controlling the video playback volume according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating controlling video playback speed according to an embodiment of the present invention;
fig. 7 is a schematic diagram illustrating controlling video playing brightness according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of controlling video capture according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 10 is a second schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required by the embodiments of the present invention are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
Referring to fig. 1, an embodiment of the present invention provides a video playing control method, which is applied to a terminal, and includes the following steps:
step 101: a first input is received from a user.
In the embodiment of the present invention, the first input may be a voice input, a gesture input, or a touch input on a display screen of the terminal. Specifically, step 101 may include:
a first input of a user in a target display area of a terminal display screen is received.
The target display area can be at least one of a first display area, a second display area and a third display area of the terminal display screen. The specific positions of the first display area, the second display area and the third display area can be preset according to actual conditions. In a specific implementation, the terminal display screen can be a special-shaped screen; correspondingly, the first display area and the second display area are two areas, separated from each other, in the top end area of the special-shaped screen, and the third display area is the display area of the special-shaped screen other than the first display area and the second display area. In this way, by performing inputs in different display areas of the special-shaped screen, the user's input operations can be diversified and simplified.
For example, the structure of a terminal's special-shaped screen can be as shown in fig. 2, but is not limited thereto. In fig. 2, the special-shaped screen of the terminal, that is, of the mobile phone, may be referred to as a screen with a bang (notch) area. The bang area is a non-display area recessed into the top end of the screen and used for accommodating a camera and the like. The two areas on either side of the bang area, that is, display area A and display area B in fig. 2, correspond to the first display area and the second display area and may generally be referred to as ear areas; display area C in fig. 2 corresponds to the third display area.
In particular implementations, the first input may include (but is not limited to) at least one of:
multi-finger sliding operation of a user in a target display area; for example, the user can simultaneously press the first display area and the second display area and then simultaneously slide downwards; or, the user can simultaneously press the first display area and the second display area for a preset time and then simultaneously slide downwards, and the like;
pressing operation of a user in the target display area; for example, a user presses the first display area and the second display area at the same time;
sliding and hovering operations of a user in the target display area; for example, referring to fig. 2 and 3, the user performs a first sliding operation of sliding a first preset distance in the third display area, i.e., from the display area near the bottom end of the bang area, a hovering operation of staying at an operation end position of the first sliding operation for a preset time, and a second sliding operation of sliding a second preset distance from the operation end position of the first sliding operation;
a sliding operation of the user from the first display area to the second display area; and so on.
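To make the listed first-input types concrete, the following Kotlin sketch classifies a simplified gesture description and decides whether it should trigger display of the target control. It is only an illustration of the idea; the type names, the gesture model and the threshold values are assumptions, not taken from the patent.

```kotlin
// Hypothetical model of the display areas of a notched ("bang area") screen:
// the two ear areas (first, second) and the main area (third) below them.
enum class DisplayArea { FIRST, SECOND, THIRD }

// Simplified description of a completed gesture; a real terminal would derive
// this from its touch framework events.
sealed class Gesture {
    data class MultiFingerSlide(val areas: Set<DisplayArea>, val downward: Boolean) : Gesture()
    data class Press(val areas: Set<DisplayArea>) : Gesture()
    data class SlideAndHover(val area: DisplayArea, val slidePx: Float, val hoverMs: Long) : Gesture()
    data class CrossAreaSlide(val from: DisplayArea, val to: DisplayArea) : Gesture()
}

// Returns true when the gesture matches one of the first-input examples listed
// above and should therefore trigger display of the target control.
fun isFirstInput(g: Gesture, minSlidePx: Float = 80f, minHoverMs: Long = 500): Boolean = when (g) {
    is Gesture.MultiFingerSlide -> g.areas.containsAll(setOf(DisplayArea.FIRST, DisplayArea.SECOND)) && g.downward
    is Gesture.Press -> g.areas.containsAll(setOf(DisplayArea.FIRST, DisplayArea.SECOND))
    is Gesture.SlideAndHover -> g.area == DisplayArea.THIRD && g.slidePx >= minSlidePx && g.hoverMs >= minHoverMs
    is Gesture.CrossAreaSlide -> g.from == DisplayArea.FIRST && g.to == DisplayArea.SECOND
}
```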
Step 102: in response to the first input, a target control is displayed.
In the embodiment of the present invention, the target control may be a preset control, or may be a control generated based on the first input. The target control can be displayed on any side of the terminal display screen and further can be determined by the side where the operation position of the first input is located.
In a specific implementation, the target control may be a linear control, which may be named, for example but not limited to, Touch-line or Touchline, and may be interpreted as, for example but not limited to, a touch line, a manipulation line or a manipulation control. The linear control may be, for example, a linear arc-shaped track or a straight-line track, and the line width of the linear control may range, for example, from 1 to 5 cm. For example, referring to FIG. 3, the target control may be only the linear sub-control 1, or only the linear sub-control 2.
In addition, the target control may also include two linear sub-controls; for example, as shown in fig. 3, the target control may include the linear sub-control 1 and the linear sub-control 2. The target control may further include two linear sub-controls and one contact-type sub-control (as shown in fig. 3, where 1 and 2 represent the linear sub-controls and 3 represents the contact-type sub-control). In this case, the contact-type sub-control may be used to hover and fix one of the linear sub-controls; that is, the initial position of that linear sub-control is the position defined by the contact-type sub-control, and the linear sub-control returns to this initial position after an operation such as pulling or dragging is performed on it. In the case where the target control includes two linear sub-controls, the shapes of the two linear sub-controls may be the same or different. In this way, the various composition modes of the target control allow the user's input operations on the target control to be diversified.
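The possible compositions of the target control described above can be summed up in a small data model. The sketch below is a hypothetical Kotlin representation (names and fields are assumptions, not the patent's implementation) of a target control consisting of one linear sub-control, two linear sub-controls, or two linear sub-controls plus a contact-type sub-control that anchors one of them.

```kotlin
// One linear sub-control: either a straight line or a curved line.
data class LineSubControl(val id: Int, val curved: Boolean)

// The contact-type sub-control: a fixed point that the anchored line returns to.
data class ContactSubControl(val id: Int, val x: Float, val y: Float)

// The three compositions of the target control described above.
sealed class TargetControl {
    data class SingleLine(val line: LineSubControl) : TargetControl()
    data class TwoLines(val first: LineSubControl, val second: LineSubControl) : TargetControl()
    data class TwoLinesWithContact(
        val first: LineSubControl,
        val second: LineSubControl,   // the line anchored by the contact sub-control
        val anchor: ContactSubControl
    ) : TargetControl()
}
```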
Step 103: a second input by the user on the target control is received.
In this embodiment of the present invention, since the target control in the above step may include a plurality of sub-controls, step 103 may specifically include:
a second input by the user on the target child control of the target control is received.
In the case where the target control includes two linear sub-controls, the target sub-control can be at least one of the two linear sub-controls. In the case where the target control includes two linear sub-controls and one contact-type sub-control, the target sub-control may be at least one of the two linear sub-controls and the contact-type sub-control. It should be understood that in the case where the target control is a single control, i.e., has no sub-controls, the second input is simply an input on the target control itself.
In particular implementations, the second input may include (but is not limited to) at least one of:
sliding operation of a user on the target child control;
dragging operation of the target child control by the user;
pressing operation of a user on the target child control;
a pulling operation of the user on the target sub-control; and so on.
Step 104: and responding to the second input, and executing video playing control operation corresponding to the second input on the currently played target video.
In an embodiment of the present invention, the video playing control operation may include at least one of the following:
the control operation of fast forward or fast backward of video playing, the control operation of video playing volume, the control operation of video playing speed, the control operation of video capturing, the control operation of selecting video clip playing, the control operation of video playing brightness and the like.
It should be noted that after the video playing control is completed, the terminal may exit the video playing control mode according to the user input, that is, the target control displayed on the terminal display interface is eliminated.
According to the video playing control method provided by the embodiment of the invention, a first input of the user is received, a target control is displayed in response to the first input, a second input of the user on the target control is received, and, in response to the second input, a video playing control operation corresponding to the second input is executed on the currently played target video. When play control is performed on the currently played video, the user only needs to operate the target control triggered and displayed by the first input, so that the currently played video can be controlled conveniently and quickly within a small area, which simplifies the operation and saves operation time.
For example, referring to fig. 2 and 3, fig. 2 and 3 are schematic diagrams illustrating generation of a target control based on a first input of a user according to an embodiment of the present invention. In this embodiment, the display screen of the mobile phone is a special-shaped screen, and the target control includes the linear sub-control 1, the linear sub-control 2 and the contact-type sub-control 3. In the video playing mode, when a first sliding operation of the user starting from display area D of display area C (display area D being the display area near the bottom end of the bang area) and sliding laterally by a preset distance s1, followed by a hovering operation of staying at the end position of the first sliding operation for a preset time t, is received, the mobile phone can trigger display of the linear sub-control 1, that is, the straight-line Touchline, as shown in fig. 2. Further, when a continued lateral sliding operation of the user from the end position of the first sliding operation, that is, a second sliding operation of sliding a preset distance s2, is received, the mobile phone can trigger display of the linear sub-control 2, that is, the curved Touchline, and the contact-type sub-control 3, as shown in fig. 3. The contact-type sub-control 3 is used to hover and fix the linear sub-control 2; the position of the contact-type sub-control 3 does not change during subsequent operations, and a display area E is formed between the straight-line Touchline and the curved Touchline.
In the embodiment of the present invention, in the process of executing step 104, the method may further include:
and displaying a video playing control result corresponding to the video playing control operation in a target display area of the terminal display screen.
The target display area can be preset as an area convenient for a user to watch according to actual conditions.
Further, in the case that the target control includes two linear type sub-controls, in the process of executing step 104, the method may further include:
and displaying a video playing control result corresponding to the video playing control operation in a display area between the two linear sub-controls.
In this case, the display area between the two linear sub-controls is the target display area. Therefore, during the execution of the video playing control operation on the target video, the corresponding video playing control result is displayed in the target display area, making it convenient for the user to view the result and to keep track of the video playing control.
In this embodiment of the present invention, in order to accurately perform corresponding video playing control, step 104 may include:
acquiring the operation characteristics of the second input;
and executing, on the currently played target video, a video playing control operation corresponding to the operation characteristic of the second input.
Wherein, the operation characteristic of the second input may include at least one of the following (but not limited to):
the operation direction of the second input, the operation starting position of the second input, the operation ending position of the second input, the magnitude of the pressing pressure of the second input, the change characteristic of the pressing pressure of the second input and the sub-control of the target control corresponding to the second input.
Therefore, by executing on the target video the video playing control operation corresponding to the operation characteristic of the second input, video playing control that meets the user's requirements can be realized accurately.
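As an illustration of how the operation characteristics of the second input could select a control operation, the sketch below dispatches on the touched sub-control and the dominant movement direction, roughly following the examples in fig. 4 to 7. The thresholds, field names and the particular mapping are assumptions made for illustration, not the patent's own logic.

```kotlin
import kotlin.math.abs

// Hypothetical operation characteristics of a second input (see the list above).
data class SecondInput(
    val subControlId: Int,        // 1: straight Touchline, 2: curved Touchline, 3: contact sub-control
    val horizontalDeltaPx: Float, // signed horizontal displacement of the input
    val verticalDeltaPx: Float,   // signed vertical displacement of the input
    val pressure: Float           // normalized pressing pressure, 0..1
)

enum class PlaybackControl { FAST_FORWARD, FAST_BACKWARD, VOLUME, SPEED, BRIGHTNESS }

// One possible dispatch: pressing the contact sub-control controls speed, sliding on
// Touchline 1 controls brightness, and on Touchline 2 a horizontal pull seeks while a
// vertical slide changes the volume.
fun dispatch(input: SecondInput): PlaybackControl = when {
    input.subControlId == 3 -> PlaybackControl.SPEED
    input.subControlId == 1 -> PlaybackControl.BRIGHTNESS
    abs(input.horizontalDeltaPx) >= abs(input.verticalDeltaPx) ->
        if (input.horizontalDeltaPx >= 0f) PlaybackControl.FAST_FORWARD else PlaybackControl.FAST_BACKWARD
    else -> PlaybackControl.VOLUME
}
```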
Some video playing control processes that can be realized by the embodiments of the present invention are described below with reference to fig. 4 to 7.
For example, referring to fig. 4, fig. 4 is a schematic diagram illustrating controlling fast forward of video playing according to an embodiment of the present invention. In fig. 4, the display screen of the mobile phone is a special-shaped screen; the target control includes a straight-line Touchline 1 (i.e., the linear sub-control 1), a curved Touchline 2 (i.e., the linear sub-control 2) and a displayed virtual button 3 (i.e., the contact-type sub-control 3), and a display area E is formed between the straight-line Touchline 1 and the curved Touchline 2. In the video playing mode, when an input of the user pressing and pulling the curved Touchline 2 to slide a distance s to the right (in the direction of the arrow in fig. 4), that is, sliding the distance s in the direction away from the virtual button 3 located in the middle of the display screen, is received, the mobile phone can select the picture at the corresponding fast-forward node from the currently played video, display a video fast-forward preview window F in the display area E, and show the picture at the fast-forward node in the window F in reduced video form. The position of the fast-forward node is related to the size of the distance s: the larger the distance s, the later the position of the fast-forward node in the currently played video; at this time, the display screen of the mobile phone is in a fast-forward preview state. If the user holds the curved Touchline 2 in the pulled position without moving during the input, the video at the corresponding fast-forward node is simultaneously played in the video fast-forward preview window F. After the user finishes the input operation, the curved Touchline 2 returns to its initial position, and the video is played on the display screen of the mobile phone from the determined fast-forward node.
When video playing fast-backward control is performed, the user can press and pull the curved Touchline 2 to slide leftward; correspondingly, the mobile phone can select the picture at the fast-backward node from the currently played video, display the video preview window F in the display area E, and show the picture at the fast-backward node in the window F in reduced video form. The position of the fast-backward node is related to the leftward sliding distance: the larger the sliding distance, the earlier the position of the fast-backward node in the currently played video.
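A minimal sketch of the seek mapping described above, assuming a simple proportional relation between the pull distance s and the offset of the fast-forward (or fast-backward) node; the scale factor and function names are assumptions for illustration.

```kotlin
// Maps a signed horizontal pull distance on the curved Touchline 2 to a new playback
// position: a rightward pull (positive distance) moves the node later, a leftward pull
// (negative distance) moves it earlier, clamped to the video duration.
fun seekNodeMs(currentMs: Long, durationMs: Long, pullPx: Float, msPerPx: Long = 200): Long {
    val offsetMs = (pullPx * msPerPx).toLong()
    return (currentMs + offsetMs).coerceIn(0L, durationMs)
}

fun main() {
    // Pulling 150 px to the right from 60 s into a 10-minute video previews a node at 90 s.
    println(seekNodeMs(currentMs = 60_000, durationMs = 600_000, pullPx = 150f))  // 90000
}
```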
For another example, referring to fig. 5, fig. 5 is a schematic diagram illustrating controlling the video playing volume according to an embodiment of the present invention. In fig. 5, the display screen of the mobile phone is a special-shaped screen; the target control includes a straight-line Touchline 1 (i.e., the linear sub-control 1), a curved Touchline 2 (i.e., the linear sub-control 2) and a displayed virtual button 3 (i.e., the contact-type sub-control 3), and a display area E is formed between the straight-line Touchline 1 and the curved Touchline 2. In the video playing mode, when an input of the user sliding upward (in the direction of the arrow in fig. 5) on the curved Touchline 2 is received, the mobile phone may increase the volume of the currently played video and simultaneously display the increased volume in the display area E in the form of a volume bar G, as shown in fig. 5. Correspondingly, when an input of the user sliding downward on the curved Touchline 2 is received, the mobile phone may decrease the volume of the currently played video and simultaneously display the decreased volume in the display area E.
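The volume adjustment above amounts to mapping the vertical slide distance on the curved Touchline 2 to a volume change. A minimal sketch under an assumed step size (the function and parameter names are hypothetical):

```kotlin
// A positive slideUpPx (sliding up) raises the volume, a negative one lowers it;
// the result is clamped to the valid volume range.
fun adjustVolume(currentVolume: Int, slideUpPx: Float, pxPerStep: Float = 20f, maxVolume: Int = 100): Int {
    val steps = (slideUpPx / pxPerStep).toInt()
    return (currentVolume + steps).coerceIn(0, maxVolume)
}
```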
For another example, referring to fig. 6, fig. 6 is a schematic diagram illustrating controlling the video playing speed according to an embodiment of the present invention. In fig. 6, the display screen of the mobile phone is a special-shaped screen; the target control includes a straight-line Touchline 1 (i.e., the linear sub-control 1), a curved Touchline 2 (i.e., the linear sub-control 2) and a displayed virtual button 3 (i.e., the contact-type sub-control 3), and a display area E is formed between the straight-line Touchline 1 and the curved Touchline 2. In the video playing mode, when an input of the user pressing the virtual button 3 and increasing the pressing force is received, the mobile phone may increase the playing speed of the currently played video and simultaneously display the increased playing speed in the display area E in the form of a rotating arrow H; the increased playing speed shown in fig. 6 is 3x. When the desired speed multiple is reached, the user can end the input operation, i.e., stop pressing the virtual button 3, and after the input operation ends the mobile phone plays the video at the increased speed. Correspondingly, when an input of the user pressing the virtual button 3 with reduced force is received, the mobile phone may reduce the playing speed of the currently played video (a lighter press corresponds to a slower playing speed) and simultaneously display the reduced playing speed in the display area E in the form of a rotating arrow (opposite in direction to the arrow in fig. 6); after the user ends the input operation, the mobile phone plays the video at the reduced speed.
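The speed control above maps the pressing force on the virtual button 3 to a playback rate: pressing harder raises the rate toward a maximum multiple (3x in fig. 6), pressing lightly lowers it below normal. The linear mapping, the neutral pressure value and the rate limits in the sketch below are assumptions.

```kotlin
// Maps a normalized pressing pressure (0..1) to a playback rate. A press at the
// neutral pressure keeps the normal 1x rate; firmer presses approach maxRate and
// lighter presses approach minRate.
fun playbackRate(pressure: Float, neutral: Float = 0.5f, minRate: Float = 0.5f, maxRate: Float = 3.0f): Float {
    val p = pressure.coerceIn(0f, 1f)
    return if (p >= neutral) 1.0f + (p - neutral) / (1.0f - neutral) * (maxRate - 1.0f)
    else minRate + (p / neutral) * (1.0f - minRate)
}
```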
For another example, referring to fig. 7, fig. 7 is a schematic diagram illustrating controlling the video playing brightness according to an embodiment of the present invention. In fig. 7, the display screen of the mobile phone is a special-shaped screen; the target control includes a straight-line Touchline 1 (i.e., the linear sub-control 1), a curved Touchline 2 (i.e., the linear sub-control 2) and a displayed virtual button 3 (i.e., the contact-type sub-control 3), and a display area E is formed between the straight-line Touchline 1 and the curved Touchline 2. In the video playing mode, when an input of the user sliding upward (in the direction of the arrow in fig. 7) on Touchline 1 is received, the mobile phone may increase the brightness of the currently played video and simultaneously display the increased brightness in the display area E in the form of an arrow K, shown as 70% in fig. 7; when the slide reaches the top, the adjusted brightness is the maximum brightness. Correspondingly, when an input of the user sliding downward on Touchline 1 is received, the mobile phone may reduce the brightness of the currently played video and simultaneously display the reduced brightness in the display area E; when the slide reaches the bottom, the adjusted brightness is the minimum brightness.
In the embodiment of the present invention, the video playing control operation executed by the terminal may include selecting a video clip to play. Specifically, step 103 may include:
under the condition that the target control comprises a first linear sub-control and a second linear sub-control, receiving second input of a user for selecting a first position point on the first linear sub-control and selecting a second position point on the second linear sub-control respectively;
and receiving a second input of selecting the first position point and the second position point on the target control by the user under the condition that the target control is a linear control.
Correspondingly, step 104 may include:
determining a first video time point corresponding to the first position point as a video playing starting time point, and determining a second video time point corresponding to the second position point as a video playing ending time point;
and playing the target video according to the video playing starting time point and the video playing ending time point.
Therefore, by selecting the first position point and the second position point on the target control and determining the video time points corresponding to the first position point and the second position point as the starting time point and the ending time point of video playing, a user can conveniently select a required video clip to play, and the operation is simplified.
When the first position point and the second position point are selected, the position points corresponding to the video start point and the video end point of the target video, and the position distance corresponding to the duration of the target video, are taken into account, so that the first position point and the second position point are selected accurately, the start time point and the end time point of the video clip are in turn determined accurately, and the required video clip is accurately selected for playing. For example, if the target control is a straight-line Touchline whose first endpoint corresponds to the video start point of the target video, whose second endpoint corresponds to the video end point of the target video, and whose length corresponds to the duration of the target video, then when the user wants to select the latter half of the target video for playing, the position points to select are the middle point and the second endpoint of the straight-line Touchline.
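Because the Touchline's length corresponds to the target video's duration, a selected position point maps to a video time point by simple proportion. A minimal sketch of this mapping follows; the function name and the pixel values in the example are assumptions.

```kotlin
// Maps a position along a linear Touchline to a video time point: the first endpoint
// corresponds to the video start, the second endpoint to the video end, and the line
// length to the full duration.
fun timePointMs(positionPx: Float, lineLengthPx: Float, durationMs: Long): Long =
    ((positionPx / lineLengthPx).coerceIn(0f, 1f) * durationMs).toLong()

fun main() {
    // Selecting the middle point and the second endpoint of a 1000 px line picks out
    // the second half of a 10-minute video as the playing (or clipping) range.
    val startMs = timePointMs(positionPx = 500f, lineLengthPx = 1000f, durationMs = 600_000)
    val endMs = timePointMs(positionPx = 1000f, lineLengthPx = 1000f, durationMs = 600_000)
    println("$startMs..$endMs")  // 300000..600000
}
```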
Optionally, after selecting the first location point and the second location point, step 104 may further include:
and intercepting video content between a first video time point corresponding to the first position point and a second video time point corresponding to the second position point in the target video, and outputting the target sub-video.
Therefore, by selecting the first position point and the second position point on the target control and intercepting the video content between the first video time point corresponding to the first position point and the second video time point corresponding to the second position point in the target video, a user can conveniently intercept a required video clip, and the operation is simplified.
For example, referring to fig. 8, fig. 8 is a schematic diagram illustrating controlling video capture according to an embodiment of the present invention. In fig. 8, the display screen of the mobile phone is a special-shaped screen; the target control includes a straight-line Touchline 1 (i.e., the linear sub-control 1), a curved Touchline 2 (i.e., the linear sub-control 2) and a displayed virtual button 3 (i.e., the contact-type sub-control 3), and a display area E is formed between the straight-line Touchline 1 and the curved Touchline 2. The starting position point M of the center point of the straight-line Touchline 1 corresponds to the video start point of the target video P, the starting position point N of the center point of the curved Touchline 2 corresponds to the video end point of the target video P, the distance between starting position point M and starting position point N corresponds to the duration of the target video P, and the video content of the target video P corresponds to the line segment Z. While the target video P is being played, when an input of the user pulling the straight-line Touchline 1 to the right (arrow direction in fig. 8) and pulling the curved Touchline 2 to the left (arrow direction in fig. 8) is received, as shown in fig. 8, the end position point of the center point of the pulled straight-line Touchline 1 is M' and the end position point of the center point of the pulled curved Touchline 2 is N'; that is, the position point M' on Touchline 1 and the position point N' on the curved Touchline 2 are selected. The mobile phone can then intercept the video content of the target video P between the video time point corresponding to position point M' and the video time point corresponding to position point N', such as the video content corresponding to the line segment I in fig. 8, and output the intercepted video, while displaying in the display area E a line segment J that connects position points M' and N', corresponds to the line segment I and represents the duration of the intercepted video. During video capture, as shown in fig. 8, the straight-line Touchline 1 and the curved Touchline 2 in their positions before being pulled may be displayed in low-luminance form, and the pulled straight-line Touchline 1 and curved Touchline 2 may be displayed in high-luminance form, so that the two can be distinguished.
It should be noted that the above embodiments are described taking landscape-orientation video playing as an example; the terminal may also play video in portrait orientation, in which case the corresponding video playing control is similar to that for landscape playing and is not described again here.
According to the video playing control method described above, the currently played target video is controlled through the user's input on the target control, so that play control of the currently played video can be performed conveniently and quickly within a small area, which simplifies the operation and saves operation time. Meanwhile, the corresponding video playing control result is displayed in the target display area, making it convenient for the user to view the result and keep track of the video playing control.
The above-mentioned embodiment describes the video playback control method of the present invention, and a terminal corresponding to the video playback control method of the present invention will be described with reference to the embodiment and the drawings.
Referring to fig. 9, an embodiment of the present invention further provides a terminal, which includes a first receiving module 91, a first displaying module 92, a second receiving module 93, and an executing module 94.
The first receiving module 91 is configured to receive a first input of a user.
The first display module 92 is configured to display a target control in response to the first input.
The second receiving module 93 is configured to receive a second input of the user on the target control.
The executing module 94 is configured to, in response to the second input, execute a video playing control operation corresponding to the second input on the currently played target video.
According to the terminal provided by the embodiment of the invention, a first input of the user is received, a target control is displayed in response to the first input, a second input of the user on the target control is received, and, in response to the second input, a video playing control operation corresponding to the second input is executed on the currently played target video. When play control is performed on the currently played video, the user only needs to operate the target control triggered and displayed by the first input, so that the currently played video can be controlled conveniently and quickly within a small area, which simplifies the operation and saves operation time.
Optionally, the terminal further includes:
and the second display module is used for displaying a video playing control result corresponding to the video playing control operation in a target display area of a terminal display screen.
Optionally, the terminal further includes at least one of:
the target control is a linear control;
the target control comprises two linear sub-controls;
the target control comprises two linear sub-controls and a contact type sub-control;
wherein, in the case that the target control comprises two linear sub-controls, the shapes of the two linear sub-controls are the same or different.
Optionally, the second receiving module is specifically configured to:
receiving a second input of a user on a target sub-control of the target control;
under the condition that the target control comprises two linear sub-controls, the target sub-control is at least one sub-control in the two linear sub-controls;
and under the condition that the target control comprises two line type sub-controls and one contact type sub-control, the target sub-control is at least one of the two line type sub-controls and the contact type sub-control.
Optionally, the target control includes two linear sub-controls;
the terminal further comprises:
and the third display module is used for displaying a video playing control result corresponding to the video playing control operation in a display area between the two linear sub-controls.
Optionally, the executing module includes:
an acquisition unit configured to acquire an operation characteristic of the second input;
the execution unit is used for executing video playing control operation corresponding to the second input operation characteristic on the currently played target video;
wherein the operational characteristics of the second input include at least one of:
the operation direction of the second input, the operation starting position of the second input, the operation ending position of the second input, the magnitude of the pressing pressure of the second input, the change characteristic of the pressing pressure of the second input, and the sub-control of the target control corresponding to the second input.
Optionally, the video playing control operation includes at least one of:
the control operation of fast forward or fast backward of video playing, the control operation of video playing volume, the control operation of video playing speed, the control operation of video capturing, the control operation of selecting video clip playing and the control operation of video playing brightness.
Optionally, the second receiving module is specifically configured to:
under the condition that the target control comprises a first linear type sub-control and a second linear type sub-control, receiving second input of a user for selecting a first position point on the first linear type sub-control and selecting a second position point on the second linear type sub-control;
receiving a second input of a user selecting a first position point and a second position point on the target control under the condition that the target control is a linear control;
the execution module comprises:
a determining unit, configured to determine a first video time point corresponding to the first position point as a video playing start time point, and determine a second video time point corresponding to the second position point as a video playing end time point;
and the playing unit is used for playing the target video according to the video playing starting time point and the video playing ending time point.
Optionally, the terminal further includes:
and the intercepting module is used for intercepting video content between a first video time point corresponding to the first position point and a second video time point corresponding to the second position point in the target video and outputting a target sub-video.
Optionally, the first receiving module is specifically configured to:
receiving a first input of a user in a target display area of a terminal display screen;
the target display area is at least one of a first display area, a second display area and a third display area of the terminal display screen.
Optionally, the terminal display screen is a special-shaped screen, the first display area and the second display area are two areas separated from each other in a top end area of the special-shaped screen, and the third display area is another display area of the special-shaped screen except the first display area and the second display area.
An embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements each process of the video playback control method embodiment, and can achieve the same technical effect, and is not described herein again to avoid repetition.
Specifically, fig. 10 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention, where the terminal 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. Those skilled in the art will appreciate that the terminal configuration shown in fig. 10 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable terminal, a pedometer, and the like.
The user input unit 1007 is used for receiving a first input of a user;
a display unit 1006, configured to display a target control in response to the first input;
the user input unit 1007 is also used to: receiving a second input of the user on the target control;
a processor 1010 configured to: and responding to the second input, and executing video playing control operation corresponding to the second input on the currently played target video.
According to the terminal provided by the embodiment of the invention, a first input of the user is received, a target control is displayed in response to the first input, a second input of the user on the target control is received, and, in response to the second input, a video playing control operation corresponding to the second input is executed on the currently played target video. When play control is performed on the currently played video, the user only needs to operate the target control triggered and displayed by the first input, so that the currently played video can be controlled conveniently and quickly within a small area, which simplifies the operation and saves operation time.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used for receiving and sending signals during message transmission or a call; specifically, it receives downlink data from a base station and delivers it to the processor 1010 for processing, and sends uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 1001 may also communicate with a network and other terminals through a wireless communication system.
The terminal provides the user with wireless broadband internet access through the network module 1002, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 can provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the terminal 1000. The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive audio or video signals. The input unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042; the graphics processor 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphics processor 10041 may be stored in the memory 1009 (or another storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 1001.
Terminal 1000 can also include at least one sensor 1005 such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 10061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 10061 and/or a backlight when the terminal 1000 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may collect touch operations by the user on or near it (e.g., operations by the user on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 10071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 10071, the user input unit 1007 may include other input devices 10072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a track ball, a mouse, and a joystick; these are not described in detail here.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the terminal, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated for implementing the input and output functions of the terminal, which is not limited herein.
Interface unit 1008 is an interface for connecting an external device to terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1008 can be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within terminal 1000 or can be used to transmit data between terminal 1000 and external devices.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1010 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby integrally monitoring the terminal. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
Terminal 1000 can also include a power supply 1011 (e.g., a battery) for powering the various components, and preferably, power supply 1011 can be logically coupled to processor 1010 through a power management system that provides management of charging, discharging, and power consumption.
In addition, terminal 1000 can also include some functional modules not shown, which are not described herein.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the video playing control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium is, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network terminal) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (16)

1. A video playing control method is applied to a terminal and is characterized by comprising the following steps:
receiving a first input of a user;
displaying a target control in response to the first input;
receiving a second input of the user on the target control;
responding to the second input, and executing video playing control operation corresponding to the second input on the currently played target video;
the target control comprises two linear sub-controls and a contact type sub-control;
the terminal display screen is a special-shaped screen and comprises a first display area, a second display area and a third display area, wherein the first display area and the second display area are two areas that are separated from each other in the top end area of the special-shaped screen, and the third display area is the display area of the special-shaped screen other than the first display area and the second display area;
the step of displaying a target control in response to the first input comprises:
receiving a first sliding operation of a user starting from the third display area and sliding laterally for a preset distance and a hovering operation of staying at an operation ending position of the first sliding operation for a preset time, and triggering and displaying a first linear sub-control in response to the first sliding operation and the hovering operation;
receiving a second sliding operation of a preset distance of lateral sliding of the user from the operation ending position of the first sliding operation, and triggering and displaying a second linear type sub-control and a contact type sub-control in response to the second sliding operation;
the contact type sub-control is used for hovering and fixing a second line type sub-control;
the receiving a second input of the user on the target control comprises:
receiving a second input of a user on a target sub-control of the target control;
the target sub-control is at least one of the two line type sub-controls and the contact type sub-control.
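
For illustration only, the two-stage gesture in claim 1 can be modelled as a small state machine: a lateral slide of a preset distance starting in the third display area, a hover at the slide's end position for a preset time, and a further lateral slide of the same distance. The Kotlin sketch below is one non-authoritative reading of that sequence; the class name, callbacks, polling approach and threshold values are all assumptions, not taken from the patent.

    import kotlin.math.abs

    // Slide laterally from the third display area by a preset distance, hover at the end position for
    // a preset time (first linear sub-control appears), then slide laterally again by the preset
    // distance (second linear sub-control and contact sub-control appear).
    class TargetControlTrigger(
        private val isInThirdArea: (x: Float, y: Float) -> Boolean,
        private val showFirstLinearSubControl: () -> Unit,
        private val showSecondLinearAndContactSubControls: () -> Unit,
        private val presetDistancePx: Float = 120f,   // assumed preset slide distance
        private val presetHoverMillis: Long = 500L,   // assumed preset hover duration
        private val hoverJitterPx: Float = 10f        // assumed tolerance for "staying" in place
    ) {
        private enum class Stage { IDLE, SLIDING_FIRST, HOVERING, SLIDING_SECOND, DONE }

        private var stage = Stage.IDLE
        private var anchorX = 0f
        private var hoverStartMillis = 0L

        fun onDown(x: Float, y: Float) {
            if (isInThirdArea(x, y)) { stage = Stage.SLIDING_FIRST; anchorX = x }
        }

        fun onMove(x: Float, nowMillis: Long) {
            when (stage) {
                Stage.SLIDING_FIRST -> if (x - anchorX >= presetDistancePx) {
                    stage = Stage.HOVERING; anchorX = x; hoverStartMillis = nowMillis
                }
                Stage.HOVERING -> if (abs(x - anchorX) > hoverJitterPx) {
                    anchorX = x; hoverStartMillis = nowMillis   // finger moved: restart the hover timer
                }
                Stage.SLIDING_SECOND -> if (x - anchorX >= presetDistancePx) {
                    showSecondLinearAndContactSubControls(); stage = Stage.DONE
                }
                else -> Unit
            }
        }

        // Polled (e.g. once per frame) so the hover can complete while the finger is held still.
        fun onTick(nowMillis: Long) {
            if (stage == Stage.HOVERING && nowMillis - hoverStartMillis >= presetHoverMillis) {
                showFirstLinearSubControl()
                stage = Stage.SLIDING_SECOND   // anchorX already holds the end position of the first slide
            }
        }
    }

A host view would feed onDown/onMove with touch coordinates and call onTick once per frame so that the hover can be detected while the finger is stationary.
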
2. The method according to claim 1, wherein, in the process of performing the video playing control operation corresponding to the second input on the currently played target video, the method further comprises:
displaying, in a target display area of the terminal display screen, a video playing control result corresponding to the video playing control operation.
3. The method according to claim 2, wherein, in the process of performing the video playing control operation corresponding to the second input on the currently played target video, the method further comprises:
displaying, in the display area between the two linear sub-controls, a video playing control result corresponding to the video playing control operation.
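
Claim 3 places the control result in the area between the two linear sub-controls. Below is a minimal geometric sketch of that area, assuming the sub-controls are horizontal bars stacked vertically (an assumption; the patent does not fix their orientation), with invented type and function names.

    data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float)

    // The strip between the lower edge of the upper bar and the upper edge of the lower bar,
    // limited to their horizontal overlap; the playback control result would be rendered here.
    fun areaBetween(upperBar: Box, lowerBar: Box): Box = Box(
        left = maxOf(upperBar.left, lowerBar.left),
        top = upperBar.bottom,
        right = minOf(upperBar.right, lowerBar.right),
        bottom = lowerBar.top
    )
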
4. The method according to claim 1, wherein the performing, in response to the second input, a video playing control operation corresponding to the second input on the currently played target video comprises:
acquiring an operation characteristic of the second input;
performing, on the currently played target video, a video playing control operation corresponding to the operation characteristic of the second input;
wherein the operation characteristic of the second input includes at least one of:
the operation direction of the second input, the operation start position of the second input, the operation end position of the second input, the magnitude of the pressing pressure of the second input, the change characteristic of the pressing pressure of the second input, and the sub-control of the target control corresponding to the second input.
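
Claim 4 leaves open how an operation characteristic maps to a concrete playback action. The sketch below shows one possible dispatch over a few of the listed characteristics; the enum values, thresholds and the mapping itself are invented for illustration and are not prescribed by the claim.

    enum class PlaybackAction { FAST_FORWARD, REWIND, VOLUME_UP, VOLUME_DOWN, SPEED_UP, CAPTURE_FRAME }

    // A small subset of the operation characteristics named in claim 4, bundled for dispatch.
    data class SecondInput(
        val directionDx: Float,          // operation direction (sign of lateral movement)
        val onFirstLinear: Boolean,      // which sub-control received the input
        val pressure: Float,             // magnitude of the pressing pressure
        val pressureRising: Boolean      // change characteristic of the pressing pressure
    )

    fun resolveAction(input: SecondInput): PlaybackAction = when {
        input.pressure > 0.8f && input.pressureRising  -> PlaybackAction.CAPTURE_FRAME
        input.onFirstLinear && input.directionDx > 0f  -> PlaybackAction.FAST_FORWARD
        input.onFirstLinear && input.directionDx < 0f  -> PlaybackAction.REWIND
        !input.onFirstLinear && input.directionDx > 0f -> PlaybackAction.VOLUME_UP
        !input.onFirstLinear && input.directionDx < 0f -> PlaybackAction.VOLUME_DOWN
        else                                           -> PlaybackAction.SPEED_UP
    }
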
5. The method according to claim 1, wherein the video playing control operation comprises at least one of:
a control operation of fast-forwarding or rewinding video playing, a control operation of video playing volume, a control operation of video playing speed, a control operation of video capturing, a control operation of selecting a video clip to play, and a control operation of video playing brightness.
6. The method according to claim 1, wherein the receiving a second input of the user on the target control comprises:
receiving a second input in which the user selects a first position point on the first linear sub-control and a second position point on the second linear sub-control, respectively;
and the performing, in response to the second input, a video playing control operation corresponding to the second input on the currently played target video comprises:
determining a first video time point corresponding to the first position point as a video playing start time point, and determining a second video time point corresponding to the second position point as a video playing end time point;
playing the target video according to the video playing start time point and the video playing end time point.
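
Claim 6 treats each linear sub-control as a selectable timeline. Assuming a simple linear scaling from touch position to playback time (the claim only requires that each position point correspond to a video time point), the sketch below maps the two selected points to time points and plays the resulting segment; all names, including the player callback, are invented for the example.

    data class LinearSubControl(val startX: Float, val lengthPx: Float)

    // Maps a touch position on a linear sub-control to a playback time, clamped to the video bounds.
    fun positionToTimeMs(control: LinearSubControl, touchX: Float, videoDurationMs: Long): Long {
        val fraction = ((touchX - control.startX) / control.lengthPx).coerceIn(0f, 1f)
        return (fraction * videoDurationMs).toLong()
    }

    // Plays from the earlier of the two selected time points to the later one.
    fun playSegment(
        play: (startMs: Long, endMs: Long) -> Unit,   // assumed player callback
        first: LinearSubControl, firstTouchX: Float,
        second: LinearSubControl, secondTouchX: Float,
        videoDurationMs: Long
    ) {
        val startMs = positionToTimeMs(first, firstTouchX, videoDurationMs)
        val endMs = positionToTimeMs(second, secondTouchX, videoDurationMs)
        play(minOf(startMs, endMs), maxOf(startMs, endMs))
    }
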
7. The method according to claim 6, wherein after the receiving a second input of the user on the target control, the method further comprises:
clipping, from the target video, the video content between the first video time point corresponding to the first position point and the second video time point corresponding to the second position point, and outputting a target sub-video.
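
Claim 7 does not state how the target sub-video is produced. Purely as an assumption, a device with an ffmpeg binary available could stream-copy the selected range; the sketch below illustrates that option only, and nothing in the patent requires ffmpeg.

    import java.io.File

    // Stream-copies the [startMs, endMs] range of the source video into a new file and returns the
    // ffmpeg exit code (0 on success). Assumes an ffmpeg binary is on the PATH.
    fun clipSubVideo(source: File, startMs: Long, endMs: Long, output: File): Int {
        val process = ProcessBuilder(
            "ffmpeg",
            "-i", source.absolutePath,
            "-ss", (startMs / 1000.0).toString(),   // clip start, in seconds
            "-to", (endMs / 1000.0).toString(),     // clip end, in seconds
            "-c", "copy",                           // copy streams without re-encoding
            output.absolutePath
        ).inheritIO().start()
        return process.waitFor()
    }
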
8. The method according to claim 1, wherein the receiving a first input of a user comprises:
receiving a first input of the user in a target display area of the terminal display screen;
wherein the target display area is at least one of the first display area, the second display area and the third display area of the terminal display screen.
9. A terminal, comprising:
a first receiving module, configured to receive a first input of a user;
a first display module, configured to display a target control in response to the first input;
a second receiving module, configured to receive a second input of the user on the target control;
an execution module, configured to perform, in response to the second input, a video playing control operation corresponding to the second input on a currently played target video;
wherein the target control comprises two linear sub-controls and one contact sub-control;
the display screen of the terminal is an irregular-shaped screen comprising a first display area, a second display area and a third display area, wherein the first display area and the second display area are two mutually separated areas in the top region of the irregular-shaped screen, and the third display area is the display area of the irregular-shaped screen other than the first display area and the second display area;
the first display module is specifically configured to:
receive a first sliding operation in which the user slides laterally by a preset distance starting from the third display area and a hovering operation in which the user stays at the end position of the first sliding operation for a preset time, and display a first linear sub-control in response to the first sliding operation and the hovering operation;
receive a second sliding operation in which the user slides laterally by a preset distance starting from the end position of the first sliding operation, and display a second linear sub-control and the contact sub-control in response to the second sliding operation;
wherein the contact sub-control is used for fixing the second linear sub-control in a hovering state;
the second receiving module is specifically configured to:
receive a second input of the user on a target sub-control of the target control;
wherein the target sub-control is at least one of the two linear sub-controls and the contact sub-control.
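
The module split in claim 9 mirrors the method steps of claim 1. Below is a rough wiring sketch, with interface names invented here solely to show how the four modules could be composed.

    interface FirstReceivingModule { fun setOnFirstInput(listener: () -> Unit) }
    interface FirstDisplayModule { fun showTargetControl() }
    interface SecondReceivingModule { fun setOnSecondInput(listener: (targetSubControlId: Int) -> Unit) }
    interface ExecutionModule { fun performControlOperation(targetSubControlId: Int) }

    class TerminalSketch(
        private val firstReceiver: FirstReceivingModule,
        private val display: FirstDisplayModule,
        private val secondReceiver: SecondReceivingModule,
        private val executor: ExecutionModule
    ) {
        fun wire() {
            // First input -> display the target control; second input -> perform the matching operation.
            firstReceiver.setOnFirstInput { display.showTargetControl() }
            secondReceiver.setOnSecondInput { subControlId -> executor.performControlOperation(subControlId) }
        }
    }
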
10. The terminal according to claim 9, further comprising:
a second display module, configured to display, in a target display area of the terminal display screen, a video playing control result corresponding to the video playing control operation.
11. The terminal according to claim 9, further comprising:
a third display module, configured to display, in the display area between the two linear sub-controls, a video playing control result corresponding to the video playing control operation.
12. The terminal according to claim 9, wherein the execution module comprises:
an acquisition unit, configured to acquire an operation characteristic of the second input;
an execution unit, configured to perform, on the currently played target video, a video playing control operation corresponding to the operation characteristic of the second input;
wherein the operation characteristic of the second input includes at least one of:
the operation direction of the second input, the operation start position of the second input, the operation end position of the second input, the magnitude of the pressing pressure of the second input, the change characteristic of the pressing pressure of the second input, and the sub-control of the target control corresponding to the second input.
13. The terminal according to claim 9, wherein the video playing control operation comprises at least one of:
a control operation of fast-forwarding or rewinding video playing, a control operation of video playing volume, a control operation of video playing speed, a control operation of video capturing, a control operation of selecting a video clip to play, and a control operation of video playing brightness.
14. The terminal according to claim 9, wherein the second receiving module is specifically configured to:
receive a second input in which the user selects a first position point on the first linear sub-control and a second position point on the second linear sub-control, respectively;
and the execution module comprises:
a determining unit, configured to determine a first video time point corresponding to the first position point as a video playing start time point, and determine a second video time point corresponding to the second position point as a video playing end time point;
a playing unit, configured to play the target video according to the video playing start time point and the video playing end time point.
15. The terminal according to claim 14, further comprising:
a clipping module, configured to clip, from the target video, the video content between the first video time point corresponding to the first position point and the second video time point corresponding to the second position point, and to output a target sub-video.
16. The terminal according to claim 9, wherein the first receiving module is specifically configured to:
receive a first input of the user in a target display area of the terminal display screen;
wherein the target display area is at least one of the first display area, the second display area and the third display area of the terminal display screen.
CN201810264914.7A 2018-03-28 2018-03-28 Video playing control method and terminal Active CN108260013B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810264914.7A CN108260013B (en) 2018-03-28 2018-03-28 Video playing control method and terminal

Publications (2)

Publication Number Publication Date
CN108260013A (en) 2018-07-06
CN108260013B (en) 2021-02-09

Family

ID=62747482

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810264914.7A Active CN108260013B (en) 2018-03-28 2018-03-28 Video playing control method and terminal

Country Status (1)

Country Link
CN (1) CN108260013B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109120979B (en) * 2018-08-23 2020-08-21 Oppo广东移动通信有限公司 Video enhancement control method and device and electronic equipment
CN109151546A (en) * 2018-08-28 2019-01-04 维沃移动通信有限公司 A kind of method for processing video frequency, terminal and computer readable storage medium
CN109144394A (en) * 2018-08-28 2019-01-04 维沃移动通信有限公司 A kind of dynamic image processing method and terminal
CN109257636B (en) * 2018-09-28 2021-05-11 Oppo广东移动通信有限公司 Switching method and device for video enhancement, electronic equipment and storage medium
CN109710779A (en) * 2018-12-24 2019-05-03 北京金山安全软件有限公司 Multimedia file intercepting method, device, equipment and storage medium
CN110460907B (en) * 2019-08-16 2021-04-13 维沃移动通信有限公司 Video playing control method and terminal
CN111372140A (en) * 2020-03-04 2020-07-03 网易(杭州)网络有限公司 Barrage adjusting method and device, computer readable storage medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101382867A (en) * 2008-10-28 2009-03-11 深圳市迅雷网络技术有限公司 Video playing interaction method and terminal
CN104516637A (en) * 2013-09-26 2015-04-15 腾讯科技(深圳)有限公司 Display control method and device for button in media playing page
CN104777999A (en) * 2015-03-20 2015-07-15 广东欧珀移动通信有限公司 Touch position display method and touch position display system
CN106293410A (en) * 2016-08-22 2017-01-04 维沃移动通信有限公司 A kind of video progress control method and mobile terminal
CN107124656A (en) * 2017-04-24 2017-09-01 维沃移动通信有限公司 The player method and mobile terminal of a kind of multimedia file

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102042265B1 (en) * 2012-03-30 2019-11-08 엘지전자 주식회사 Mobile terminal

Also Published As

Publication number Publication date
CN108260013A (en) 2018-07-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant