CN111669618A - Picture playing control method, device, equipment and storage medium - Google Patents
- Publication number: CN111669618A (application CN201910174937.3A)
- Authority: CN (China)
- Prior art keywords: control, picture, time point, control parameter, picture data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2387—Stream processing in response to a playback request from an end-user, e.g. for trick-play
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/231—Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The invention discloses a picture playing control method, apparatus, device and storage medium, belonging to the technical field of multimedia. The method comprises: acquiring a first control parameter according to a first control instruction, and caching the picture data of a first time period determined according to the first control parameter, the first control instruction being used to start control of the picture playing progress; acquiring a second control parameter according to a second control instruction used to control the picture playing progress, and determining a control rate according to the first control parameter and the second control parameter; determining a current positioning time point according to the control rate; and, if the current positioning time point falls within the first time period, acquiring the picture data corresponding to the current positioning time point from the cached picture data of the first time period and playing it. The invention can improve positioning efficiency and the user experience.
Description
Technical Field
The present invention relates to the field of multimedia technologies, and in particular, to a method, an apparatus, a device, and a storage medium for controlling picture playing.
Background
With the rapid development of multimedia technology, users can watch videos through various playing terminals. In the process of watching the video, the playing terminal can perform picture playing control according to the feedback of the user so as to meet the playing requirement of the user.
In the related art, when a picture is played, if a user performs a positioning operation such as dragging a progress bar, the picture playing control process is: acquire a positioning position from the positioning operation; determine the positioning timestamp and positioning control parameter corresponding to the positioning position; acquire the positioning video data corresponding to the positioning timestamp according to the positioning control parameter; and display the positioning video data.
However, in the picture playing control method provided by the related art, because the positioning video data corresponding to the positioning timestamp must be fetched anew for each positioning operation, the positioning efficiency is low and the user experience is poor.
Disclosure of Invention
The embodiment of the invention provides a picture playing control method, a picture playing control device, picture playing control equipment and a storage medium, and aims to solve the problems of low positioning efficiency and poor user experience in the related art. The technical scheme is as follows:
in one aspect, a method for controlling picture playing is provided, where the method includes:
acquiring a first control parameter according to a first control instruction, and caching picture data of a first time period determined according to the first control parameter, wherein the first control instruction is used for starting the control of the picture playing progress;
acquiring a second control parameter according to a second control instruction, wherein the second control instruction is used for controlling the picture playing progress, and determining a control rate according to the first control parameter and the second control parameter;
determining a current positioning time point according to the control rate;
and if the current positioning time point is located in the first time period range, acquiring the picture data corresponding to the current positioning time point from the picture data in the first time period to play.
Optionally, after determining the current positioning time point according to the control rate, the method further includes:
if the current positioning time point exceeds the range of the first time period, caching the picture data of a second time period determined according to the second control parameter, and controlling picture playing according to the picture data of the second time period.
Optionally, the method further comprises:
acquiring an ending time point corresponding to a third control instruction, wherein the third control instruction is used for ending the control of the picture playing progress;
and controlling to play the picture data corresponding to the ending time point.
Optionally, before obtaining the first control parameter according to the first control instruction, the method further includes:
if the previous mouse event is detected to be a click operation on the picture progress bar and the next mouse event is detected to be a movement operation on the picture progress bar, acquiring the first control instruction.
Optionally, before obtaining the second control parameter according to the second control instruction, the method further includes:
if the previous mouse event is detected to be a moving operation on the picture progress bar and the next mouse event is detected to be a moving operation on the picture progress bar, acquiring the second control instruction.
Optionally, before obtaining the ending time point corresponding to the third control instruction, the method further includes:
if the previous mouse event is detected to be a moving operation on the picture progress bar and the next mouse event is detected to be a releasing operation on the picture progress bar, acquiring the third control instruction.
Optionally, the first control parameter comprises a first time point and a first position, and the second control parameter comprises a second time point and a second position;
the determining a control rate according to the first control parameter and the second control parameter includes:
acquiring a control duration according to the first time point and the second time point;
acquiring a control distance according to the first position and the second position;
and taking the quotient of the control distance and the control duration as the determined control rate.
Optionally, if the code stream of the cached picture data is a first-type code stream, the cached picture data consists of intra-frame coded frames; if the code stream of the cached picture data is a second-type code stream, the cached picture data consists of intra-frame coded frames and refresh P frames.
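The frame-selection rule above can be sketched as a small filter. This is only an illustrative reading: the tagged `(frame_type, payload)` representation and the `"first"`/`"second"` stream labels are assumptions, not the patent's data format.

```python
def frames_to_cache(stream_type, frames):
    """Pick the frames to cache: a first-type code stream keeps only
    intra-frame coded (I) frames; a second-type code stream keeps I
    frames plus refresh P frames.

    frames are assumed to be (frame_type, payload) pairs.
    """
    if stream_type == "first":
        keep = {"I"}
    elif stream_type == "second":
        keep = {"I", "refresh-P"}
    else:
        raise ValueError("unknown stream type: %s" % stream_type)
    return [f for f in frames if f[0] in keep]
```

Caching fewer frame types for the first-type stream keeps the cached window small; the second-type stream trades memory for smoother trick-play.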
There is also provided a picture playback control apparatus, the apparatus including:
the acquisition module is used for acquiring a first control parameter according to a first control instruction, and the first control instruction is used for starting the control of the picture playing progress;
the cache module is used for caching the picture data of a first time period determined according to the first control parameter;
the acquisition module is further used for acquiring a second control parameter according to a second control instruction, and the second control instruction is used for controlling the playing progress of the picture;
the determining module is used for determining a control rate according to the first control parameter and the second control parameter and determining a current positioning time point according to the control rate;
and the control module is used for acquiring the picture data corresponding to the current positioning time point from the picture data in the first time period to play if the current positioning time point is located in the first time period range.
Optionally, the caching module is further configured to cache the picture data of a second time period determined according to the second control parameter if the current positioning time point exceeds the range of the first time period;
and the control module is also used for carrying out picture playing control according to the picture data of the second time period.
Optionally, the obtaining module is further configured to obtain an ending time point corresponding to a third control instruction, where the third control instruction is used to end the control of the picture playing progress;
and the control module is also used for controlling the playing of the picture data corresponding to the end time point.
Optionally, the obtaining module is further configured to obtain the first control instruction if it is detected that the previous mouse event is a click operation for the screen progress bar and it is detected that the next mouse event is a movement operation for the screen progress bar.
Optionally, the obtaining module is further configured to obtain the second control instruction if it is detected that the previous mouse event is a moving operation for the screen progress bar and it is detected that the next mouse event is a moving operation for the screen progress bar.
Optionally, the obtaining module is further configured to obtain a third control instruction if it is detected that the previous mouse event is a moving operation for the screen progress bar and the next mouse event is a releasing operation for the screen progress bar.
Optionally, the first control parameter comprises a first time point and a first position, and the second control parameter comprises a second time point and a second position;
the acquisition module is used for acquiring a control duration according to the first time point and the second time point; acquiring a control distance according to the first position and the second position; and taking the quotient of the control distance and the control duration as the determined control rate.
Optionally, if the code stream of the picture data cached by the caching module is a first-type code stream, the cached picture data consists of intra-frame coded frames; if the code stream of the picture data cached by the caching module is a second-type code stream, the cached picture data consists of intra-frame coded frames and refresh P frames.
There is also provided a picture play control system, the system comprising: a client and a server;
the client is used for acquiring a first control parameter according to a first control instruction and sending the first control parameter to the server, wherein the first control instruction is used for starting the control of the picture playing progress;
the server is used for caching the picture data of a first time period determined according to the first control parameter;
the client is further used for acquiring a second control parameter according to a second control instruction, the second control instruction is used for controlling the picture playing progress, and the control rate is determined according to the first control parameter and the second control parameter; determining a current positioning time point according to the control rate, and sending the current positioning time point to the server;
the server is further configured to receive the current positioning time point, and return, to the client, picture data corresponding to the current positioning time point, which is obtained from the picture data in the first time period, if the current positioning time point is within the first time period range;
and the client is also used for playing the picture data corresponding to the current positioning time point fed back by the server.
Optionally, the client is further configured to return a second control parameter to the server if the current positioning time point exceeds the first time period range;
the server is further configured to cache picture data of a second time period determined according to the second control parameter, where the picture data of the second time period is used for the client to perform picture playing control.
Optionally, the client is further configured to obtain an end time point corresponding to a third control instruction, where the third control instruction is used to end the control of the picture playing progress; sending the end time point to the server;
the server is further configured to acquire the picture data of the ending time point and send the picture data of the ending time point to the client;
and the client is also used for controlling the playing of the picture data corresponding to the end time point.
There is provided a picture playing control device, comprising a processor and a memory, the memory storing at least one instruction that is loaded and executed by the processor to implement any of the picture playing control methods described above.
There is provided a computer readable storage medium storing at least one instruction that is loaded and executed by a processor to implement any of the picture playing control methods described above.
The technical scheme provided by the embodiment of the invention has the beneficial effects that at least:
after the first control parameter is obtained according to the first control instruction, the picture data of the first time period determined according to the first control parameter is cached; the second control parameter is then obtained based on the second control instruction, and the control rate is determined from the control parameters corresponding to the two control instructions, so that the picture playing progress is controlled according to the control rate. This improves positioning efficiency and the user experience. In addition, caching the picture data allows picture playing to respond quickly, making playback smoother and further improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention; those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the invention;
fig. 2 is a flowchart of a method for controlling image playing according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a screen playing interface according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of mouse event processing according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a screen playing interface according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of mouse event processing according to an embodiment of the present invention;
fig. 7 is a flowchart of a method for controlling image playback according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating mouse event processing according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a picture playing control apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The embodiment of the invention provides a picture playing control method, which can be applied to the implementation environment shown in fig. 1. In fig. 1, at least one terminal 11 and a server 12 are included, and the terminal 11 may be communicatively connected to the server 12 to download a video file from the server 12.
The terminal 11 may be any electronic product that can perform human-computer interaction with a user through one or more modes such as a mouse, a touch pad, and a touch screen, for example, a PC (Personal Computer), a smart phone, a wearable terminal, a pocket PC, a tablet computer, a smart car, a smart television, and the like.
The server 12 may be a server, a server cluster composed of a plurality of servers, or a cloud computing service center.
It should be understood by those skilled in the art that the above-mentioned terminal 11 and server 12 are only examples, and other existing or future terminals or servers may be suitable for the present application and are included within the scope of the present application and are herein incorporated by reference.
Based on the implementation environment shown in fig. 1, referring to fig. 2, an embodiment of the present invention provides a picture playing control method, which is applicable to the terminal shown in fig. 1, for example, to a client such as a video player installed on the terminal. As shown in fig. 2, the method includes:
When a video is played, the method provided by the embodiment of the invention can detect in real time whether operations such as mouse clicks are performed on the playing picture, that is, mouse events are detected; different mouse events trigger different control operations. For example, the method takes the instruction for starting control of the picture playing progress as the first control instruction, and determines whether the first control instruction is triggered by detecting mouse events. Optionally, detecting the first control instruction includes: if the previous mouse event is detected to be a click operation on the picture progress bar and the next mouse event is detected to be a movement operation on the picture progress bar, the first control instruction is detected.
After the first control instruction is detected, because the first control instruction is used for starting the control of the picture playing progress, the method provided by the embodiment of the invention acquires the first control parameter according to the first control instruction. Optionally, the first control parameter comprises a first point in time and a first position. The first time point is a time point when the first control instruction is detected, that is, a time point when control of the picture playing progress is started, and the time point is system time. For example, when the first control instruction is acquired by detecting a mouse event, the first time point is a time point when the mouse event is detected as a click operation on the screen progress bar. The first position is an operation position for the screen progress bar, and for example, a position of a click operation with respect to the screen progress bar is taken as the first position.
For example, when a user wants to watch a certain video, the user starts a video player to play the video and drags the progress bar to browse the video quickly and position to a desired time period. The progress bar may be dragged with a mouse click or by other means, such as a touch operation. When the mouse clicks the progress bar to start dragging, the video player on the terminal receives a start-drag message, that is, a mouse event that is a click operation on the picture progress bar is detected, and the time and position corresponding to the start of dragging are recorded. These correspond to the first control parameter of the first control instruction: the time at which dragging starts is the first time point, and the position at which dragging starts is the first position.
For ease of understanding, take the playing screen shown in fig. 3 as an example. When a video is currently being played and it is detected that the user clicks the screen progress bar with the mouse, a mouse event is detected; this mouse event is a click operation, that is, the mouse is pressed down. Then, when the user starts to perform a moving operation with the mouse, another mouse event is detected; this mouse event is a moving operation, that is, mouse movement. It can therefore be confirmed that the user wants to drag the screen progress bar to control the picture playing progress, and the first control instruction is detected. As shown in fig. 3, the position of the mouse cursor is the position corresponding to the playing time "01:01:08"; that position is the first position, "01:01:08" is the time at which the first position corresponds on the screen progress bar, and the system time of the mouse press is the first time point.
Further, after the first control parameter is acquired, the picture data in the first time period determined according to the first control parameter is cached. By caching the picture data in the first time period, the picture corresponding to the progress can be quickly responded and played according to the determined progress. Optionally, a first time period is set by taking the time of the first position corresponding to the picture progress bar as a center, and video data in the first time period, that is, picture data of the first time period, is cached.
Optionally, taking the time at which the first position corresponds on the picture progress bar, that is, the time point T of the progress bar when dragging starts, as the center, extend a duration t both backward and forward to form the time period [T - t, T + t], and the terminal caches the video data within this period. The embodiment of the present invention does not limit the size of t; for example, t may be adjusted according to the configuration of the terminal. By extending a duration t on each side of the center T to form the period [T - t, T + t], the positioning time point is guaranteed to be within the first time period regardless of whether the drag is forward or backward.
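The [T - t, T + t] window can be written as a small helper. A minimal sketch: clamping to the video bounds is an assumption added here, since the text leaves edge handling unstated.

```python
def cache_window(center_s, half_width_s, duration_s):
    """Return the caching window [T - t, T + t] around the drag-start
    time T, clamped to [0, video duration].

    center_s:     time T on the progress bar where dragging started (s)
    half_width_s: the terminal-configurable half-width t (s)
    duration_s:   total video duration (s)
    """
    start = max(0.0, center_s - half_width_s)
    end = min(duration_s, center_s + half_width_s)
    return start, end
```

For example, with T at 01:01:08 (3668 s) and t = 30 s on a two-hour video, the cached window is [3638 s, 3698 s], covering drags in either direction.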
It should be noted that the video player may be a video playing application installed in the terminal, or may be an online video player that is started by accessing a web page on the terminal, which is not limited in this embodiment of the present invention. No matter which type of video player is started, after the video player is started to play the video, if the progress bar clicking operation is obtained, at this moment, the terminal can judge whether the operation is an effective dragging start operation according to the next operation of the clicking operation. Optionally, a click operation on the screen progress bar is detected, and if a next operation of the click operation is a release operation, the click operation is an invalid operation. Accordingly, if the next operation of the click operation is a move operation, the click operation is an effective operation, that is, the start of the drag action is instructed.
Taking a mouse click on the progress bar as an example: when the user presses the mouse, the terminal detects a mouse-down event, then judges according to the next operation. If the next operation is a mouse release, that is, its operation type is a release operation, the mouse-down event is an invalid operation, is converted into DragNone (an invalid command), and the terminal does not respond. If the next operation is a mouse move, that is, its operation type is a move operation, the mouse-down event is a valid operation, is converted into DragStart (start dragging), and the terminal receives the drag-start message corresponding to the first control instruction. The first time period is determined according to the time point at which the mouse-down event was detected, and the picture data in the first time period is cached; this process may refer to the flow shown in fig. 4.
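The mouse-down handling above amounts to a one-event lookahead. A minimal sketch, using the DragNone/DragStart names from the text; the string encoding of events is an assumption for illustration.

```python
def classify_mouse_down(next_event):
    """Convert a mouse-down on the progress bar into a drag command,
    depending on the event that follows it."""
    if next_event == "move":
        return "DragStart"  # valid: dragging begins, caching is triggered
    return "DragNone"       # release (a bare click): the terminal does not respond
```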
Optionally, if the previous mouse event is detected to be a moving operation on the screen progress bar and the next mouse event is also a moving operation on the screen progress bar, the second control instruction is detected. A moving operation on the screen progress bar indicates that the user wants to adjust the progress, so the position corresponding to the moving operation is taken as the second position and the time at which the moving operation is detected is taken as the second time point, yielding the second control parameter, which comprises the second time point and the second position.
For example, still taking fig. 3 as an example, after the user clicks the screen progress bar with the mouse at the position corresponding to the playing time "01:01:08", the first control instruction is detected. Mouse events continue to be detected, and when it is detected that the user performs a moving operation with the mouse, the second control instruction is acquired, from which the second control parameter can be obtained. If the user moves the mouse to the position on the screen progress bar corresponding to the time point "01:31:06", as shown in fig. 5, the second control instruction is obtained. As shown in fig. 5, the position of the mouse cursor is the position on the screen progress bar corresponding to the playing time "01:31:06"; that position is the second position, and the time at which the mouse moves to the second position is the second time point, so the second control parameter is obtained.
In addition, while the user performs a moving operation with the mouse, after a mouse event is detected, the method provided by the embodiment of the invention judges whether the operation is a valid moving operation according to the previous mouse event on the screen progress bar. For example, when a movement is performed with the mouse, a mouse event that is a moving operation on the screen progress bar is detected; judgment is then made according to the previous mouse event, and if the previous mouse event was also a moving operation on the screen progress bar, the current moving operation is a valid operation. Because the mouse-move event indicates that the user is moving the mouse to control the picture playing progress, and the second control instruction is used for controlling the picture playing progress, the mouse-move event is converted into the second control instruction, that is, into DragMoving, and the second control instruction is acquired when the operation is valid.
The second control instruction converted from the mouse-move event is then added to the buffer queue, and subsequent processing is judged based on the type of the last instruction in the buffer queue. For example, if the last instruction is a first control instruction, that is, the last operation was a drag-start operation, the corresponding instruction is issued to the logic layer for processing; when the second control instruction in the buffer queue is reached, it is issued to the device and executed. Optionally, if the last instruction is a second control instruction, indicating that a dragging operation is currently in progress, it is judged whether the current positioning time point exceeds the first time period; if so, the time period of the cached picture is re-issued and the cache instruction is issued to the logic layer, and when the second control instruction is reached it is issued to the device and executed. This process may be as shown in fig. 6.
Step 203: determine a control rate according to the first control parameter and the second control parameter, and determine the current positioning time point according to the control rate.
The first control parameter comprises a first time point and a first position, and the second control parameter comprises a second time point and a second position; optionally, determining the control rate according to the first control parameter and the second control parameter comprises: acquiring a control duration according to the first time point and the second time point; acquiring a control distance according to the first position and the second position; and taking the quotient of the control distance and the control time length as the determined control rate.
For example, if the first time point in the first control parameter is T1 and the second time point in the second control parameter is T2, the control duration is T = T2 - T1. Since the first and second time points are both system times, the control duration is the difference between the system times of the two operations. If the first position in the first control parameter is S1 and the second position in the second control parameter is S2, the control distance is S = S2 - S1. Since both positions are on the progress bar, the control distance is the difference between the progress-bar lengths corresponding to the two positions. The quotient S/T of the control distance and the control duration is the determined control rate V. A time interval is then determined from the control rate V, and the positioning time point is obtained according to the time interval. In this embodiment, the time interval is adjusted according to the magnitude of the control rate V, so that the acquired playing frame is dynamically adjusted according to the dynamically obtained positioning time point.
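The rate computation follows directly from the definitions above. A short sketch; the units (seconds for system time, pixels for progress-bar positions) are assumptions for illustration.

```python
def control_rate(t1, s1, t2, s2):
    """Control rate V = control distance / control duration, where the
    duration T = T2 - T1 is a system-time difference and the distance
    S = S2 - S1 is a signed progress-bar length difference (negative
    for a backward drag)."""
    duration = t2 - t1
    if duration <= 0:
        raise ValueError("the second instruction must follow the first")
    return (s2 - s1) / duration
```

A forward drag of 200 pixels over 2 seconds gives V = 100; dragging backward the same amount gives V = -100, so the sign of V carries the drag direction.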
When a time interval is determined by the control rate V, the time interval may be determined by the corresponding relationship between the control rate V and the time interval. For example, after the control rate V is acquired, the time interval corresponding to the control rate V is determined in the correspondence relationship between the control rate V and the time interval. In an alternative embodiment, the correspondence may be set empirically or by a user. For example, taking the correspondence relationship shown in table 1 below as an example, if the acquired control rate V is 2, the time interval is 200ms (milliseconds).
TABLE 1
Control rate V | Time interval
1 | 100ms
2 | 200ms
3 | 400ms
4 | 500ms
5 | 1s
Alternatively, in addition to using the correspondence relationship, the time interval may be determined from the control rate in other ways. For example, a reference rate and a reference time interval are set; after the control rate is obtained, the ratio of the control rate to the reference rate is multiplied by the reference time interval to obtain the time interval. Or, instead of setting a reference rate and a reference time interval, a reference ratio between rate and time interval is set directly, and that ratio is multiplied by the acquired control rate to obtain the time interval.
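The two manners above can be sketched as follows; the Table 1 lookup assumes discretised rates, and the reference values are illustrative defaults:

```python
# Table 1 as a lookup (control rate -> time interval in milliseconds).
RATE_TO_INTERVAL_MS = {1: 100, 2: 200, 3: 400, 4: 500, 5: 1000}

def interval_from_table(rate):
    """Look the interval up in the Table 1 correspondence (discrete rates)."""
    return RATE_TO_INTERVAL_MS[round(rate)]

def interval_from_reference(rate, ref_rate=1.0, ref_interval_ms=100.0):
    """Alternative: scale a reference interval by rate / reference rate."""
    return (rate / ref_rate) * ref_interval_ms
```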
Further, after the time interval is determined, the positioning time point can be obtained by taking the time of the previous playing picture as the starting point and adding the time interval to it. For example, if the time of the previous playing picture is 00:10:00 and the time interval is 1s, the obtained positioning time point is 00:10:01.
Of course, besides obtaining the positioning time point from the time interval as above, other manners may be adopted flexibly, which is not limited in the embodiment of the present invention. For example, the positioning time point may be determined from a fixed reference time interval, with the time of the previous playing picture as the starting point. Still taking 00:10:00 as the time of the previous playing picture, if the reference time interval is 2s, the positioning time point is 00:10:02.
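The HH:MM:SS arithmetic in the two examples above can be sketched as (the helper name is an assumption):

```python
def next_positioning_point(prev_hms, interval_s):
    """Add a time interval in seconds to an HH:MM:SS playing position."""
    h, m, s = (int(x) for x in prev_hms.split(":"))
    total = h * 3600 + m * 60 + s + interval_s
    return "%02d:%02d:%02d" % (total // 3600, (total % 3600) // 60, total % 60)
```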
It should be noted that the length of the time interval used to obtain positioning time points is dynamically adjusted according to the control rate V: when the control rate V is larger, the time interval for acquiring positioning time points is longer; when the control rate V is smaller, the time interval is shorter. The played pictures are then adjusted according to the rate at which positioning time points are obtained, so that the rate of change of the playing picture is consistent with the control rate V.
For example, if the control rates satisfy V1 > V2 > V3, the corresponding time intervals for obtaining positioning time points satisfy t11 > t12 > t13. When the control rate is V1, the time interval for obtaining the positioning time point is t11, that is, the interval between the playing picture corresponding to the current positioning time point and the playing picture corresponding to the last positioning time point is t11; when the control rate is V2, the time interval is t12, that is, the interval between the two corresponding playing pictures is t12; when the control rate is V3, the time interval is t13, that is, the interval between the two corresponding playing pictures is t13.
When the control rate is larger, the time interval between adjacent positioning time points is larger, so that the user can quickly find a time period of interest within a long section of video data; when the control rate is smaller, the time interval between adjacent positioning time points is smaller, so that positioning is more accurate and the user can find the exact time point of interest. For example, when the user drags the mouse quickly, a positioning time point is determined every 1s, so that each picture covers a long interval and the user can locate, within a long period, the short segment that may contain the point of interest; when the user drags slowly, a positioning time point is determined every 500ms, so that the interval between two pictures is small and positioning can be precise.
Since the picture data in the first time period is cached before, after the positioning time point is determined, if the current positioning time point is within the range of the first time period, the picture data corresponding to the current positioning time point can be directly obtained from the picture data in the first time period and played, so that the response speed is high.
Optionally, if the current positioning time point is beyond the first time period range, as shown in fig. 7, the method provided in the embodiment of the present invention further includes the following step 205.
Since the current positioning time point exceeds the first time period, the previously cached picture data of the first time period does not include the picture data corresponding to the current positioning time point, so the method provided by the embodiment of the present invention continues by caching the picture data of a second time period determined according to the second control parameter. The second time period is determined from the second control parameter on the same principle as the first time period.
For example, when the current time point of the progress-bar drag exceeds the first time period [T - t, T + t], a second time period is acquired. For instance, when the drag reaches the time point T + t, taking T + t as the new center, the second time period [(T + t) - t, (T + t) + t], that is [T, T + 2t], is determined. Because the cached window is adjusted in this way, no matter how large a time range the user drags the progress bar through with the mouse, the terminal only caches video data within a window of width 2t around the current position, so the memory cannot be exhausted and the user experience is improved.
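A sketch of the sliding cache window described above, assuming times in seconds and a strict inside test (function names are illustrative):

```python
def cache_window(center_s, half_width_s):
    """Time period [center - t, center + t] cached around a drag position."""
    return (center_s - half_width_s, center_s + half_width_s)

def ensure_window(window, current_s, half_width_s):
    """Keep the first time period while inside it; re-cache once it is left."""
    lo, hi = window
    if lo < current_s < hi:
        return window                              # still within [T - t, T + t]
    return cache_window(current_s, half_width_s)   # second period, e.g. [T, T + 2t]
```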
Optionally, in addition to caching the picture data on the terminal, the method provided by the embodiment of the present invention also supports caching the picture data on the server side, thereby saving the storage resource of the terminal. For example, a client on the terminal may send a first control parameter to a server, and the server caches picture data of a first time period determined according to the first control parameter; the client acquires a second control parameter according to the second control instruction, and determines a control rate according to the first control parameter and the second control parameter; determining a current positioning time point according to the control rate, and sending the current positioning time point to a server; and the server receives the current positioning time point, and returns the picture data corresponding to the current positioning time point, which is acquired from the picture data in the first time period, to the client if the current positioning time point is within the first time period range.
Optionally, if the current positioning time point exceeds the first time period range, the client returns a second control parameter to the server; and the server caches the picture data of a second time period determined according to the second control parameter, wherein the picture data of the second time period is used for controlling picture playing of the client.
Optionally, if the client acquires an ending time point corresponding to the third control instruction, the client sends the ending time point to the server; the server acquires the picture data of the ending time point and sends it to the client; and the client controls playing of the picture data corresponding to the ending time point.
Optionally, if the operation following the moving operation is neither a click operation nor a moving operation, and no operation is detected at all, the mouse is hovering in place. In this case, the buffering of video data is stopped and the current time point is taken as the positioning time point, that is, the current picture is kept still. For example, when the user finds a time point of interest while dragging the progress bar and holds the mouse still, the terminal stops caching video data, takes the current time point as the positioning time point, and keeps the picture at that time point unchanged.
The foregoing describes the picture playing control in detail by taking the mouse event as a moving operation on the picture progress bar, that is, taking detection of the second control instruction as an example. The method provided in the embodiment of the present invention also covers another control condition: if the mouse event is a release operation and the moving operation is not continued, the method further includes: detecting a third control instruction, where the third control instruction is used to end the control of the picture playing progress; and acquiring an ending time point corresponding to the third control instruction, and controlling playing of the picture data corresponding to the ending time point.
The manner of detecting the third control instruction includes, but is not limited to: if the previous mouse event is detected to be a moving operation on the picture progress bar and the next mouse event is detected to be a releasing operation on the picture progress bar, the third control instruction is detected. After the third control instruction is detected, the caching of video data stops, and the current time point is taken as the positioning time point, that is, the ending time point.
For example, when the user finds an interested time point in the process of dragging the progress bar, the mouse is released, the terminal stops caching the video data, and the video data is played sequentially from the current time point as a positioning time point.
When the terminal detects that the mouse event is a release operation, it acquires the mouse release event and judges it according to the previous mouse event, that is, the operation type of the last operation. If the previous mouse event was a click operation, the release operation is an invalid operation; if the previous mouse event was a move operation, the release operation is a valid operation and is converted into DragEnd (drag end), corresponding to the third control instruction. Since the third control instruction instructs the end of control of the picture playing progress, playback of the video data starts from the ending time point, and the process is as shown in fig. 8.
In summary, five states can be distinguished for a mouse event (DragStart, DragMoving, DragHangOn, DragEnd, DragNone), with the following meanings. DragStart: the drag starts, corresponding to the first control instruction. DragMoving: the drag is in progress, corresponding to the second control instruction. DragHangOn: the mouse hovers in place during the drag. DragEnd: the drag ends, corresponding to the third control instruction. DragNone: an invalid instruction. The transition among these states is determined by the pair of previous and subsequent mouse events, as shown in table 2 below:
TABLE 2
By processing the mouse press, move, and release operations in the manner of table 2, smooth dragging control can be achieved.
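The cells of table 2 are not reproduced in this text, so the transition sketch below follows only the prose description of the event pairs; the event labels are assumptions:

```python
def classify(prev_event, next_event):
    """Map a pair of consecutive mouse events to a drag state.

    Events: "press", "move", "release", or None (no further event detected).
    """
    if prev_event == "press" and next_event == "move":
        return "DragStart"   # first control instruction
    if prev_event == "move" and next_event == "move":
        return "DragMoving"  # second control instruction
    if prev_event == "move" and next_event is None:
        return "DragHangOn"  # mouse held still mid-drag; picture stays fixed
    if prev_event == "move" and next_event == "release":
        return "DragEnd"     # third control instruction
    return "DragNone"        # e.g. press followed directly by release (a click)
```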
In addition, whether the picture data of the first time period or of the second time period is cached, it is the key frames that are cached while the progress bar is dragged. Two types of code stream are considered. The first type of code stream contains I frames, P frames, and optionally B frames; its key frames are the I frames, and it may also be referred to as a non-Smart code stream. The second type of code stream contains I frames, refresh P frames (deep P frames), normal P frames, and optionally B frames; its key frames are the I frames and the refresh P frames, and it may also be referred to as a Smart code stream. Therefore, if the code stream of the video file is of the first type, the cached picture data are I frames (intra-coded frames); if it is of the second type, the cached picture data are I frames and refresh P frames (deep P frames).
A refresh P frame is a forward-prediction reference frame whose decoding reference is the closest I frame preceding it; its inter-frame prediction refers only to that I frame and not to any earlier P frame, so it can be quickly retrieved and decoded during random access or video playback, reducing the decoding wait time.
A normal P frame is the other type of forward-prediction reference frame; its reference frames are the frame immediately preceding it and the closest I frame before it.
That is, when the terminal caches the video file, if the code stream of the video file is the first type of code stream, caching the I-frame image data in the video file; and if the code stream of the video file is the second code stream, caching the image data of the I frame and the deep P frame in the video file. When the progress bar is dragged, the key frames are cached and played, so that the playing picture is ensured not to be blank, and the response speed is improved.
For example, for a second type of code stream, such as Smart264 or Smart265, when browsing the video of the type of code stream, the I frame and the refreshed P frame image data in the video file are cached.
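The frame selection described above can be sketched as a simple filter (the frame-type labels are illustrative strings, not codec structures):

```python
def frames_to_cache(frames, smart_stream):
    """Select the key frames to cache while the progress bar is dragged.

    frames: list of frame-type labels ("I", "refreshP", "P", "B").
    Non-Smart (first-type) streams cache only I frames; Smart (second-type)
    streams such as Smart264/Smart265 also cache refresh P (deep P) frames.
    """
    keep = {"I", "refreshP"} if smart_stream else {"I"}
    return [f for f in frames if f in keep]
```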
When a video picture of interest is located during browsing and the mouse is released, the terminal receives the end-drag message and, from the ending time point, returns the first type of code stream, that is, the code stream containing I frames, P frames, and B frames (bidirectional predictive interpolation coded frames). The video player then decodes normally instead of extracting only the key frames for playing, so the video no longer jumps and the video data plays normally from the ending time point.
Optionally, during video browsing and positioning, the video player interface displays the video data at the positioning time point. When the cached video data corresponding to the positioning time point is an intra-coded frame, that is, an I frame, the video player decodes and displays the I frame; when it is a refresh P frame, the video player decodes the closest preceding I frame as the reference frame, without displaying it, and then displays the refresh P frame picture corresponding to the positioning time point.
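The display path for a positioning time point can be sketched as follows; the list-of-labels model of the cache is an assumption for illustration:

```python
def frames_to_decode(cached, target):
    """Indices the player must decode to display the frame at `target`.

    cached: frame types at successive positioning points ("I" or "refreshP").
    An I frame is decoded and displayed by itself; a refresh P frame first
    needs its reference, the closest preceding I frame, decoded but hidden.
    """
    if cached[target] == "I":
        return [target]
    for i in range(target - 1, -1, -1):      # walk back to the reference I frame
        if cached[i] == "I":
            return [i, target]               # decode I (not displayed), then the P
    raise ValueError("no reference I frame before the refresh P frame")
```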
The technical solution provided by the embodiment of the present invention has at least the following beneficial effects:
after the first control parameter is obtained according to the first control instruction, the picture data of the first time period determined according to the first control parameter is cached; the second control parameter is then obtained based on the second control instruction, and the control rate is determined from the control parameters corresponding to the first and second control instructions, so that the picture playing progress is controlled according to the control rate, improving positioning efficiency and user experience. In addition, by caching the picture data, picture playing can respond quickly, so that playback is smoother and the user experience is further improved.
Based on the same concept as the above method, referring to fig. 9, an embodiment of the present invention provides a picture play control apparatus, including:
an obtaining module 901, configured to obtain a first control parameter according to a first control instruction, where the first control instruction is used to start control of a picture playing progress;
a caching module 902, configured to cache picture data of a first time period determined according to a first control parameter;
the obtaining module 901 is further configured to obtain a second control parameter according to a second control instruction, where the second control instruction is used to control a picture playing progress;
a determining module 903, configured to determine a control rate according to the first control parameter and the second control parameter, and determine a current positioning time point according to the control rate;
the control module 904 is configured to, if the current positioning time point is within the first time period range, obtain, from the picture data in the first time period, the picture data corresponding to the current positioning time point for playing.
Optionally, the caching module 902 is further configured to cache the picture data of the second time period determined according to the second control parameter if the current positioning time point exceeds the range of the first time period;
the control module 904 is further configured to perform picture playing control according to the picture data of the second time period.
Optionally, the obtaining module 901 is further configured to obtain an ending time point corresponding to a third control instruction, where the third control instruction is used to end the control on the picture playing progress;
the control module 904 is further configured to control playing of the picture data corresponding to the ending time point.
Optionally, the obtaining module 901 is further configured to obtain the first control instruction if it is detected that the previous mouse event is a click operation for the screen progress bar and it is detected that the next mouse event is a movement operation for the screen progress bar.
Optionally, the obtaining module 901 is further configured to obtain the second control instruction if it is detected that the previous mouse event is a moving operation for the screen progress bar and it is detected that the next mouse event is a moving operation for the screen progress bar.
Optionally, the obtaining module 901 is further configured to obtain a third control instruction if it is detected that the previous mouse event is a moving operation for the screen progress bar and it is detected that the next mouse event is a releasing operation for the screen progress bar.
Optionally, the first control parameter comprises a first time point and a first position, and the second control parameter comprises a second time point and a second position;
an obtaining module 901, configured to obtain a control duration according to the first time point and the second time point; acquiring a control distance according to the first position and the second position; and taking the quotient of the control distance and the control time length as the determined control rate.
Optionally, if the code stream of the picture data cached by the cache module 902 is a first type of code stream, the cached picture data is an intra-frame coded frame; if the code stream of the picture data cached by the caching module 902 is the second type of code stream, the cached picture data is an intra-frame coded frame and a refresh P frame.
The technical solution provided by the embodiment of the present invention has at least the following beneficial effects:
after the first control parameter is obtained according to the first control instruction, the picture data of the first time period determined according to the first control parameter is cached; the second control parameter is then obtained based on the second control instruction, and the control rate is determined from the control parameters corresponding to the first and second control instructions, so that the picture playing progress is controlled according to the control rate, improving positioning efficiency and user experience. In addition, by caching the picture data, picture playing can respond quickly, so that playback is smoother and the user experience is further improved.
It should be noted that, when the device provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the terminal is divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
The embodiment of the invention provides a picture playing control system, which comprises: a client and a server;
the client is used for acquiring a first control parameter according to a first control instruction and sending the first control parameter to the server, and the first control instruction is used for starting the control of the picture playing progress;
the server is used for caching the picture data of the first time period determined according to the first control parameter;
the client is also used for acquiring a second control parameter according to a second control instruction, the second control instruction is used for controlling the picture playing progress, and the control rate is determined according to the first control parameter and the second control parameter; determining a current positioning time point according to the control rate, and sending the current positioning time point to a server;
the server is also used for receiving the current positioning time point, and returning the picture data corresponding to the current positioning time point acquired from the picture data in the first time period to the client if the current positioning time point is within the first time period range;
and the client is also used for playing the picture data corresponding to the current positioning time point fed back by the server.
Optionally, the client is further configured to return a second control parameter to the server if the current positioning time point exceeds the first time period range;
and the server is also used for caching the picture data of a second time period determined according to the second control parameter, and the picture data of the second time period is used for controlling picture playing of the client.
Optionally, the client is further configured to obtain an end time point corresponding to a third control instruction, where the third control instruction is used to end the control of the picture playing progress; sending the end time point to a server;
the server is also used for acquiring the picture data of the ending time point and sending the picture data of the ending time point to the client;
and the client is also used for controlling the picture data corresponding to the playing end time point.
According to the system provided by the embodiment of the invention, after the client side obtains the first control parameter according to the first control instruction, the client side obtains the second control parameter based on the second control instruction, and the control rate is determined according to the control parameters corresponding to the first control instruction and the second control instruction, so that the picture playing progress is controlled according to the control rate, the positioning efficiency is improved, and the user experience is improved. In addition, the server caches the picture data and returns the picture data to be played to the client, so that the picture playing can be quickly responded, the picture playing is smoother, and the user experience is further improved.
Referring to fig. 10, a schematic structural diagram of a terminal 1000 for video positioning according to an embodiment of the present disclosure is shown. The terminal 1000 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 1000 may also be referred to by other names such as user terminal, portable terminal, laptop terminal, or desktop terminal.
In general, terminal 1000 can include: a processor 1001 and a memory 1002.
In some embodiments, terminal 1000 can also optionally include: a peripheral terminal interface 1003 and at least one peripheral terminal. The processor 1001, the memory 1002, and the peripheral terminal interface 1003 may be connected by a bus or signal line. Each peripheral terminal may be connected to the peripheral terminal interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral terminal includes: at least one of radio frequency circuitry 1004, touch screen display 1005, camera 1006, audio circuitry 1007, positioning components 1008, and power supply 1009.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1004 communicates with a communication network and other communication terminals by electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, the display screen 1005 also has the ability to capture touch signals on or over the surface of the display screen 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this point, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display screen 1005 can be one, providing a front panel of terminal 1000; in other embodiments, display 1005 can be at least two, respectively disposed on different surfaces of terminal 1000 or in a folded design; in still other embodiments, display 1005 can be a flexible display disposed on a curved surface or on a folded surface of terminal 1000. Even more, the display screen 1005 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display screen 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1006 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing or inputting the electric signals to the radio frequency circuit 1004 for realizing voice communication. For stereo sound collection or noise reduction purposes, multiple microphones can be provided, each at a different location of terminal 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic location of terminal 1000 for navigation or LBS (Location Based Service). The positioning component 1008 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
In some embodiments, terminal 1000 can also include one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: an acceleration sensor 1011, a gyro sensor 1012, a pressure sensor 1013, a fingerprint sensor 1014, an optical sensor 1015, and a proximity sensor 1016.
The acceleration sensor 1011 can detect acceleration on the three axes of a coordinate system established for the terminal 1000. For example, the acceleration sensor 1011 may be used to detect the components of gravitational acceleration on the three axes. The processor 1001 may control the touch display screen 1005 to display the user interface in a landscape or portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used to collect motion data for games or the user.
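The landscape/portrait decision described above can be sketched in a few lines of Python. The function name, axis convention, and comparison rule are illustrative assumptions, not details from the patent, which only states that orientation follows the gravitational acceleration signal:

```python
def choose_orientation(gx, gy):
    """Choose a UI orientation from the gravity components collected by
    the acceleration sensor.  gx and gy are the gravity projections
    (m/s^2) on the device's short and long screen axes."""
    # When gravity lies mostly along the device's short axis, the device
    # is being held sideways -> landscape; otherwise portrait.
    return "landscape" if abs(gx) > abs(gy) else "portrait"
```

A real implementation would also debounce near the diagonal so the UI does not flip while the device is held at roughly 45 degrees.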
The gyro sensor 1012 may detect the body orientation and rotation angle of the terminal 1000, and may cooperate with the acceleration sensor 1011 to capture the user's 3D actions on the terminal 1000. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (for example, changing the UI according to the user's tilting operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1013 may be disposed on a side frame of the terminal 1000 and/or on a lower layer of the touch display screen 1005. When disposed on a side frame, the pressure sensor 1013 can detect the user's grip signal on the terminal 1000, and the processor 1001 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1013. When disposed on a lower layer of the touch display screen 1005, the processor 1001 controls operability controls on the UI according to the user's pressure operation on the touch display screen 1005. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
The fingerprint sensor 1014 collects the user's fingerprint, and either the processor 1001 identifies the user from the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 itself identifies the user from the collected fingerprint. Upon identifying the user as trusted, the processor 1001 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1014 may be disposed on the front, back, or side of the terminal 1000. When a physical key or vendor logo is provided on the terminal 1000, the fingerprint sensor 1014 may be integrated with the physical key or vendor logo.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the touch display screen 1005 according to the ambient light intensity collected by the optical sensor 1015: when the ambient light intensity is high, the display brightness of the touch display screen 1005 is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the ambient light intensity collected by the optical sensor 1015.
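One possible brightness policy matching the description above is a clamped linear ramp. The thresholds, floor value, and linear shape are illustrative assumptions; the text only states that brightness rises and falls with ambient light:

```python
def display_brightness(lux, low=50.0, high=10000.0, floor=0.2):
    """Map ambient light intensity (lux) from the optical sensor to a
    display brightness in [floor, 1.0] using a clamped linear ramp."""
    if lux <= low:
        return floor            # dim environment: keep a minimum brightness
    if lux >= high:
        return 1.0              # bright environment: full brightness
    return floor + (1.0 - floor) * (lux - low) / (high - low)
```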
The proximity sensor 1016, also known as a distance sensor, is typically disposed on the front panel of the terminal 1000. The proximity sensor 1016 is used to measure the distance between the user and the front face of the terminal 1000. In one embodiment, when the proximity sensor 1016 detects that this distance gradually decreases, the processor 1001 controls the touch display screen 1005 to switch from a bright-screen state to a dark-screen state; when the proximity sensor 1016 detects that the distance gradually increases, the processor 1001 controls the touch display screen 1005 to switch from the dark-screen state back to the bright-screen state.
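The approach/withdraw behavior above can be sketched as a small decision function over two successive proximity readings. The threshold value and function name are illustrative assumptions; the patent describes only the qualitative behavior:

```python
def screen_state(prev_cm, curr_cm, threshold_cm=5.0):
    """Decide the screen state from two successive proximity-sensor
    readings (distance from the user to the front panel, in cm)."""
    if curr_cm < prev_cm and curr_cm < threshold_cm:
        return "dark"        # user approaching: darken the display
    if curr_cm > prev_cm and curr_cm >= threshold_cm:
        return "bright"      # user withdrawing: light the display again
    return "unchanged"
```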
Those skilled in the art will appreciate that the configuration shown in FIG. 10 is not intended to be limiting and that terminal 1000 can include more or fewer components than shown, or some components can be combined, or a different arrangement of components can be employed.
In an exemplary embodiment, a computer terminal is also provided that includes a processor and a memory having at least one instruction stored therein. The at least one instruction is configured to be loaded and executed by one or more processors to implement the above picture playing control method.
In an exemplary embodiment, there is also provided a computer-readable storage medium having at least one instruction stored therein, the at least one instruction, when executed by a processor of a computer terminal, implementing the above picture playing control method.
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (13)
1. A picture playing control method, the method comprising:
acquiring a first control parameter according to a first control instruction, and caching picture data of a first time period determined according to the first control parameter, wherein the first control instruction is used for starting the control of the picture playing progress;
acquiring a second control parameter according to a second control instruction, wherein the second control instruction is used for controlling the picture playing progress, and determining a control rate according to the first control parameter and the second control parameter;
determining a current positioning time point according to the control rate;
and if the current positioning time point is located in the first time period range, acquiring the picture data corresponding to the current positioning time point from the picture data in the first time period to play.
2. The method according to claim 1, wherein after the determining a current positioning time point according to the control rate, the method further comprises:
if the current positioning time point exceeds the range of the first time period, caching the picture data of a second time period determined according to the second control parameter, and controlling picture playing according to the picture data of the second time period.
3. The method of claim 1, further comprising:
acquiring an ending time point corresponding to a third control instruction, wherein the third control instruction is used for ending the control of the picture playing progress;
and controlling to play the picture data corresponding to the ending time point.
4. The method according to claim 1, wherein before the acquiring a first control parameter according to a first control instruction, the method further comprises:
if it is detected that the previous mouse event is a click operation on the picture progress bar and the next mouse event is a movement operation on the picture progress bar, acquiring the first control instruction;
wherein before the acquiring a second control parameter according to a second control instruction, the method further comprises:
if it is detected that the previous mouse event is a movement operation on the picture progress bar and the next mouse event is a movement operation on the picture progress bar, acquiring the second control instruction.
5. The method according to claim 3, wherein before the acquiring an ending time point corresponding to the third control instruction, the method further comprises:
if it is detected that the previous mouse event is a movement operation on the picture progress bar and the next mouse event is a release operation on the picture progress bar, acquiring the third control instruction.
6. The method according to any of claims 1-5, wherein the first control parameter comprises a first point in time and a first position, and the second control parameter comprises a second point in time and a second position;
the determining a control rate according to the first control parameter and the second control parameter includes:
acquiring a control duration according to the first time point and the second time point;
acquiring a control distance according to the first position and the second position;
and taking the quotient of the control distance and the control duration as the determined control rate.
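The rate computation of claim 6, together with claim 1's check that the positioning time point lies in the cached first time period, can be sketched as follows. The (time point, position) pairing of a control parameter is stated in claim 6; the helper names, and the assumption that positions on the progress bar are expressed as media time offsets (so the rate is media seconds per wall-clock second and the positioning time point follows by linear extrapolation), are illustrative, since the claims leave that mapping unspecified:

```python
def control_rate(first, second):
    """Claim 6: each control parameter is a (time point, position) pair
    sampled from the progress-bar drag; the control rate is the quotient
    of the control distance and the control duration."""
    t1, x1 = first
    t2, x2 = second
    return (x2 - x1) / (t2 - t1)

def locate(rate, anchor_time, elapsed, first_period):
    """Derive a current positioning time point from the control rate and
    test whether it falls inside the cached first time period, as in
    claim 1.  anchor_time + rate * elapsed is an assumed mapping."""
    t = anchor_time + rate * elapsed
    start, end = first_period
    return t, start <= t <= end
```

If the positioning time point falls outside the cached window, claim 2's path applies: a second time period is cached according to the second control parameter.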
7. The method according to any one of claims 1 to 5, wherein if the code stream of the cached picture data is a first-type code stream, the cached picture data consists of intra-coded frames; and if the code stream of the cached picture data is a second-type code stream, the cached picture data consists of intra-coded frames and refresh P frames.
8. A picture playing control apparatus, comprising:
the acquisition module is used for acquiring a first control parameter according to a first control instruction, and the first control instruction is used for starting the control of the picture playing progress;
the cache module is used for caching the picture data of a first time period determined according to the first control parameter;
the acquisition module is further used for acquiring a second control parameter according to a second control instruction, and the second control instruction is used for controlling the playing progress of the picture;
the determining module is used for determining a control rate according to the first control parameter and the second control parameter and determining a current positioning time point according to the control rate;
and the control module is used for acquiring the picture data corresponding to the current positioning time point from the picture data in the first time period to play if the current positioning time point is located in the first time period range.
9. A picture playing control system, comprising: a client and a server;
the client is used for acquiring a first control parameter according to a first control instruction and sending the first control parameter to the server, wherein the first control instruction is used for starting the control of the picture playing progress;
the server is used for caching the picture data of a first time period determined according to the first control parameter;
the client is further configured to acquire a second control parameter according to a second control instruction, where the second control instruction is used to control the picture playing progress; determine a control rate according to the first control parameter and the second control parameter; determine a current positioning time point according to the control rate; and send the current positioning time point to the server;
the server is further configured to receive the current positioning time point, and return, to the client, picture data corresponding to the current positioning time point, which is obtained from the picture data in the first time period, if the current positioning time point is within the first time period range;
and the client is also used for playing the picture data corresponding to the current positioning time point fed back by the server.
10. The system according to claim 9, wherein the client is further configured to return a second control parameter to the server if the current positioning time point is beyond the first time period;
the server is further configured to cache picture data of a second time period determined according to the second control parameter, where the picture data of the second time period is used for the client to perform picture playing control.
11. The system according to claim 9, wherein the client is further configured to obtain an end time point corresponding to a third control instruction, where the third control instruction is used to end the control of the picture playing progress; sending the end time point to the server;
the server is further configured to acquire the picture data of the ending time point and send the picture data of the ending time point to the client;
and the client is also used for controlling the playing of the picture data corresponding to the end time point.
12. A picture playing control device, comprising a processor and a memory, the memory having stored therein at least one instruction, the instruction being loaded and executed by the processor to implement the picture playing control method according to any one of claims 1 to 7.
13. A computer-readable storage medium having stored therein at least one instruction, the at least one instruction being loaded and executed by a processor to implement the picture playing control method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910174937.3A CN111669618B (en) | 2019-03-08 | 2019-03-08 | Picture playing control method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111669618A true CN111669618A (en) | 2020-09-15 |
CN111669618B CN111669618B (en) | 2022-11-15 |
Family
ID=72381974
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910174937.3A Active CN111669618B (en) | 2019-03-08 | 2019-03-08 | Picture playing control method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111669618B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101540881A (en) * | 2008-03-19 | 2009-09-23 | 华为技术有限公司 | Method, device and system for realizing positioning playing of streaming media |
CN103179465A (en) * | 2013-03-22 | 2013-06-26 | 广东欧珀移动通信有限公司 | Method for controlling video playing progress and mobile terminal |
CN103533456A (en) * | 2013-06-21 | 2014-01-22 | Tcl集团股份有限公司 | Correction method and system of video playing fast forwarding and fast rewinding |
US20170285924A1 (en) * | 2016-03-31 | 2017-10-05 | Le Holdings (Beijing) Co., Ltd. | Method for adjusting play progress and electronic device |
CN107920258A (en) * | 2016-10-11 | 2018-04-17 | 中国移动通信有限公司研究院 | A kind of data processing method and device |
CN109121008A (en) * | 2018-08-03 | 2019-01-01 | 腾讯科技(深圳)有限公司 | A kind of video previewing method, device, terminal and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN111669618B (en) | 2022-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111147878B (en) | Stream pushing method and device in live broadcast and computer storage medium | |
CN108391171B (en) | Video playing control method and device, and terminal | |
CN109874312B (en) | Method and device for playing audio data | |
CN110149557B (en) | Video playing method, device, terminal and storage medium | |
CN109697113B (en) | Method, device and equipment for requesting retry and readable storage medium | |
CN113411680B (en) | Multimedia resource playing method, device, terminal and storage medium | |
CN111327694B (en) | File uploading method and device, storage medium and electronic equipment | |
CN109982129B (en) | Short video playing control method and device and storage medium | |
CN113204672B (en) | Resource display method, device, computer equipment and medium | |
CN107896337B (en) | Information popularization method and device and storage medium | |
CN111741366A (en) | Audio playing method, device, terminal and storage medium | |
CN112764654B (en) | Component adsorption operation method and device, terminal and storage medium | |
CN111818358A (en) | Audio file playing method and device, terminal and storage medium | |
CN114245218A (en) | Audio and video playing method and device, computer equipment and storage medium | |
CN110868642B (en) | Video playing method, device and storage medium | |
CN111092991B (en) | Lyric display method and device and computer storage medium | |
CN111459363A (en) | Information display method, device, equipment and storage medium | |
CN112004134A (en) | Multimedia data display method, device, equipment and storage medium | |
CN107888975B (en) | Video playing method, device and storage medium | |
CN109714628B (en) | Method, device, equipment, storage medium and system for playing audio and video | |
CN108966026B (en) | Method and device for making video file | |
CN113613053B (en) | Video recommendation method and device, electronic equipment and storage medium | |
CN111669618B (en) | Picture playing control method, device, equipment and storage medium | |
CN110996115B (en) | Live video playing method, device, equipment, storage medium and program product | |
CN111464829B (en) | Method, device and equipment for switching media data and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||