CN110087149A - Video image sharing method, device, and mobile terminal

Info

Publication number
CN110087149A
CN110087149A CN201910464515.XA
Authority
CN
China
Prior art keywords
video
video image
terminal
playing
image
Prior art date
Legal status
Pending
Application number
CN201910464515.XA
Other languages
Chinese (zh)
Inventor
李天鹏
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority application: CN201910464515.XA
Publication: CN110087149A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4756 End-user interface for inputting end-user data for rating content, e.g. scoring a recommended movie
    • H04N 21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N 21/8586 Linking data to content, e.g. by linking an URL to a video object, by using a URL

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Signal Processing
  • Human Computer Interaction
  • General Engineering & Computer Science
  • Databases & Information Systems
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like

Abstract

The present invention provides a video image sharing method, a video image sharing device, and a mobile terminal, belonging to the technical field of mobile terminals. During video playback, if a first terminal detects a picture sharing operation, it acquires the currently played video image and adds skip information to that image, where the skip information includes the uniform resource locator of the video and the current playing progress; the first terminal then sends the resulting video image to a second terminal. The user of the second terminal therefore does not need to search for the video or adjust its playing progress manually: based on the skip information carried in the video image, the second terminal can play the video from the corresponding playing progress, which simplifies the operation steps and improves sharing efficiency.

Description

Video image sharing method, device, and mobile terminal
Technical Field
The invention belongs to the technical field of mobile terminals, and particularly relates to a video image sharing method and device and a mobile terminal.
Background
With the continuous development of mobile terminal technology, users often use their terminals to play videos, and often want to share a video with friends when they see an interesting plot.
In the prior art, a user generally describes to a friend which video contains the plot the user wants to share and at which time node of that video the plot appears. Accordingly, the friend needs to manually search for and play the corresponding video based on the user's description, and then manually adjust the playing progress of the video to the corresponding position, so the whole operation process is cumbersome and the efficiency is low.
Disclosure of Invention
The invention provides a video image sharing method, a video image sharing device, and a mobile terminal, and aims to solve the problem that when a friend of a user wants to view a specific plot of a video shared by the user, the operation process is cumbersome and the efficiency is low.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a video image sharing method, which is applied to a first terminal, and the method may include:
in the process of playing a video, if picture sharing operation is detected, acquiring a currently played video image;
adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress;
and sending the video image added with the skip information to a second terminal, wherein the video image is used for indicating the second terminal to play the video according to the skip information.
In a second aspect, an embodiment of the present invention provides a video image sharing method, which is applied to a second terminal, and the method may include:
receiving a video image added with skip information and sent by a first terminal; the skip information comprises a uniform resource locator and a playing progress of a video to which the video image belongs;
and if the first appointed operation on the video image is detected, starting to play the video from the playing progress based on the uniform resource locator of the video and the playing progress.
In a third aspect, an embodiment of the present invention provides a video image sharing device, which is applied to a first terminal, and the device may include:
the first acquisition module is used for acquiring a currently played video image if picture sharing operation is detected in the process of playing a video;
the first adding module is used for adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress;
and the sending module is used for sending the added video image to a second terminal, and the video image is used for indicating the second terminal to play the video according to the skip information.
In a fourth aspect, an embodiment of the present invention provides a video image sharing apparatus, which is applied to a second terminal, and the apparatus may include:
the receiving module is used for receiving the video image added with the skip information and sent by the first terminal; the skip information comprises a uniform resource locator and a playing progress of a video to which the video image belongs;
and the playing module is used for starting playing the video from the playing progress based on the uniform resource locator of the video and the playing progress if the first specified operation on the video image is detected.
In a fifth aspect, an embodiment of the present invention provides a video image sharing system, where the system includes a first terminal and a second terminal;
the first terminal is used for acquiring a currently played video image if a picture sharing operation is detected in the process of playing a video;
the first terminal is used for adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress;
the first terminal is used for sending the added video image to the second terminal;
the second terminal is used for receiving the video image added with the skip information and sent by the first terminal;
the second terminal is used for displaying the video image;
and the second terminal is used for starting to play the video from the playing progress based on the uniform resource locator of the video and the playing progress if the first appointed operation on the video image is detected.
In a sixth aspect, an embodiment of the present invention provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the video image sharing method according to the first aspect or the second aspect.
In a seventh aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the video image sharing method according to the first aspect or the second aspect are implemented.
In the embodiment of the invention, in the process of playing a video, if a picture sharing operation is detected, the first terminal acquires the currently played video image, then adds skip information to the video image, where the skip information comprises the uniform resource locator of the video and the current playing progress, and finally sends the resulting video image to the second terminal. A user usually controls the first terminal to execute the sharing operation when the video plays to an interesting plot, so the playing progress at the moment the picture sharing operation is detected, that is, the added current playing progress, is the playing progress corresponding to the plot the user wants to share, and the added uniform resource locator of the video indicates where the video data corresponding to the video can be obtained.
Drawings
Fig. 1 is a flowchart illustrating steps of a video image sharing method according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating steps of another video image sharing method according to an embodiment of the present invention;
fig. 3-1 is a flowchart illustrating steps of another video image sharing method according to an embodiment of the present invention;
fig. 3-2 is a schematic diagram of an interface provided by an embodiment of the present invention;
fig. 3-3 is a schematic diagram of another interface provided by an embodiment of the present invention;
fig. 3-4 is a schematic diagram of attribute information provided by an embodiment of the present invention;
fig. 3-5 is a schematic diagram of a dynamic image provided by an embodiment of the present invention;
fig. 4 is a block diagram of a video image sharing apparatus according to an embodiment of the present invention;
fig. 5 is a block diagram of a video image sharing apparatus according to an embodiment of the present invention;
fig. 6 is a block diagram of a video image sharing system according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart illustrating steps of a video image sharing method according to an embodiment of the present invention, where the method may be applied to a first terminal, and as shown in fig. 1, the method may include:
step 101, in the process of playing a video, if a picture sharing operation is detected, acquiring a currently played video image.
In the embodiment of the present invention, the video may be a video selected by the user from video playing software installed in the first terminal; of course, the video may also be a video shot by the user through the first terminal, which is not limited in the embodiment of the present invention. Further, the picture sharing instruction may be sent to the first terminal when the user sees a video plot that the user wants to share, and may be sent by triggering a picture sharing function of the first terminal. The picture sharing function may be triggered by a physical key combination; for example, the key combination may be set as the volume-up key and the power key, in which case the user may trigger the picture sharing function of the first terminal by pressing the volume-up key and the power key simultaneously during video playing. Of course, the picture sharing function may also be triggered by a virtual button; for example, the first terminal may provide a picture sharing button, and the user may click the picture sharing button to trigger the picture sharing function of the first terminal.
Further, the currently played video image may be at least one image of the video content played within a certain time period after the picture sharing operation is detected. The video image can be extracted directly from the video data corresponding to the video, or obtained by capturing the video picture. For example, the first terminal may first perform a screenshot operation through a preset screenshot function to obtain an image, and then determine whether the currently played video is in full-screen playing mode; if so, the image may be used directly as the video image, and if not, the image may be considered to include content other than the video picture, in which case the video image can be obtained by cropping the image to the video playing area.
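The full-screen check described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: a screenshot is represented as a list of pixel rows, and `video_rect` is an assumed on-screen video rectangle.

```python
def extract_shared_frame(screenshot, is_fullscreen, video_rect):
    """Return the video image to share.

    In full-screen mode the screenshot already equals the video frame;
    otherwise crop the screenshot to the on-screen video rectangle.
    `video_rect` is (x, y, width, height); all names are illustrative.
    """
    if is_fullscreen:
        return screenshot
    x, y, w, h = video_rect
    # Keep only the rows and columns inside the video playing area.
    return [row[x:x + w] for row in screenshot[y:y + h]]
```

A real Android implementation would instead operate on a `Bitmap` obtained from the system screenshot function; the cropping decision, however, follows the same branch.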
Step 102, adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress.
In the embodiment of the present invention, the uniform resource locator of the video may be a position indicating that the network can acquire video data corresponding to the video, so that the second terminal can quickly find the corresponding video based on the uniform resource locator of the video in the video image by adding the uniform resource locator of the video to the video image, thereby omitting an operation of manually searching the corresponding video by a user of the second terminal.
Further, the current playing progress may be the playing progress of the video at the current time; for example, if the video has played to the 1000th second when the picture sharing operation is detected, "the 1000th second" may be taken as the current playing progress. The picture sharing instruction is often sent when the user sees the plot that the user wants to share, so the current playing progress can be considered the playing progress corresponding to that plot. Thus, by adding the current playing progress to the video image, the second terminal can conveniently and quickly jump to the position of the plot the user wants to share based on the playing progress, which spares the user of the second terminal the operation of manually adjusting the playing progress. Furthermore, the video image can display part of the content of the video plot in which the user is interested, so using the video image as a carrier of the skip information lets the user of the second terminal intuitively preview that part of the content, improving the sharing effect.
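As a sketch, the skip information described above can be modeled as a small key-value record bundling the two items the patent names. JSON is an illustrative encoding chosen here for concreteness; the patent does not prescribe any particular format.

```python
import json

def build_skip_info(video_url, current_progress_seconds):
    """Bundle the video's uniform resource locator with the playing
    progress captured at the moment the picture sharing operation
    was detected. Field names are illustrative."""
    return json.dumps({
        "url": video_url,
        "progress": current_progress_seconds,  # e.g. 1000 for the 1000th second
    })
```

For example, `build_skip_info("https://example.com/video-a", 1000)` yields the record the second terminal would later read back.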
Step 103, sending the video image added with the skip information to a second terminal, where the video image is used for indicating the second terminal to play the video according to the skip information.
In the embodiment of the present invention, the second terminal may be a terminal having a friend relationship with the first terminal, for example, the friend relationship may be established through social software of a third party, and specifically, when the added video image is sent to the second terminal, identifiers of all friends having a friend relationship with the first terminal may be displayed first, then, the terminal corresponding to the identifier selected by the user is used as the second terminal, and finally, the added video image is sent to the second terminal, so that the second terminal can play the video according to the skip information in the video image.
In summary, in the video image sharing method provided by the embodiment of the present invention, in the process of playing a video, if a picture sharing operation is detected, the first terminal obtains the currently played video image, adds skip information to the video image, where the skip information includes the uniform resource locator of the video and the current playing progress, and finally sends the resulting video image to a second terminal. A user usually controls the first terminal to execute the sharing operation when the video plays to an interesting plot, that is, the picture sharing instruction is sent when the user sees the plot to be shared. Therefore, the user of the second terminal does not need to manually search for the video or adjust its playing progress, and the second terminal can play the video from the corresponding playing progress based on the skip information in the video image, thereby simplifying the operation steps and improving sharing efficiency.
Fig. 2 is a flowchart of steps of another video image sharing method according to an embodiment of the present invention, where the method may be applied to a second terminal, and as shown in fig. 2, the method may include:
step 201, receiving a video image added with skip information and sent by a first terminal; the skip information comprises a uniform resource locator and a playing progress of the video to which the video image belongs.
In the embodiment of the present invention, the uniform resource locator of the video to which the video image belongs may indicate a position at which video data corresponding to a video that the first terminal user wants to share is obtained, and the play progress included in the skip information may indicate a play progress corresponding to an episode that the first terminal user wants to share.
Step 202, displaying the video image.
In the embodiment of the invention, the second terminal can display the video image, so that a user of the second terminal can conveniently and intuitively feel partial content of the video plot and further operate the video image.
Step 203, if the first designated operation on the video image is detected, starting to play the video from the play progress based on the uniform resource locator of the video and the play progress.
In the embodiment of the present invention, the first specified operation may be preset and is used to trigger the second terminal to start playing the video from the playing progress added to the video image. For example, the first specified operation may be a single-click operation on the video image, or another operation on the video image, which is not limited in the embodiment of the present invention.
Accordingly, when the second terminal detects the first specified operation on the video image, it may consider that the user of the second terminal wants to watch the video shared by the first terminal, and may therefore start playing the video from the playing progress based on the uniform resource locator and the playing progress in the skip information. For example, assuming that the uniform resource locator included in the skip information is that of video A and the playing progress included in the skip information is the 1000th second, the second terminal may start playing video A from the 1000th second upon detecting the first specified operation on the video image.
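The receiving side's handling of the first specified operation can be sketched like this. The `play(url, seconds)` callable stands in for a real player API and is purely illustrative; the 1000-second figure mirrors the example above.

```python
import json

def on_first_specified_operation(skip_info_json, play):
    """Parse the skip information carried by the shared image and
    start playback from the embedded progress, by calling the
    hypothetical player entry point `play(url, seconds)`."""
    info = json.loads(skip_info_json)
    return play(info["url"], info["progress"])
```

On Android this would typically map to preparing a player with the URL and seeking to the progress before starting playback, but the exact player API is outside the patent's scope.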
In summary, in the video image sharing method provided by the embodiment of the present invention, the second terminal receives the video image added with the skip information sent by the first terminal, where the skip information includes the uniform resource locator and the playing progress of the video to which the video image belongs; it then displays the video image, and finally, upon detecting the first specified operation on the video image, starts playing the video from the playing progress based on the uniform resource locator and the playing progress. In this way, the user of the second terminal does not need to manually search for the video or adjust its playing progress, and the second terminal can start playing the video from the corresponding playing progress based on the skip information in the video image, thereby simplifying the operation steps and improving sharing efficiency.
Fig. 3-1 is a flowchart illustrating steps of another video image sharing method according to an embodiment of the present invention, as shown in fig. 3-1, the method may include:
step 301, in the process of playing a video, if a picture sharing operation is detected, the first terminal acquires a currently played video image.
For example, suppose that a user triggers a picture sharing function of a first terminal through a picture sharing button and sends a picture sharing instruction to the first terminal, and accordingly, fig. 3-2 is an interface schematic diagram provided in an embodiment of the present invention, as shown in fig. 3-2, the interface schematic diagram includes a video interface 01 and a picture sharing button 02 displayed in the video interface, and the user can trigger the picture sharing function of the first terminal by clicking the picture sharing button 02, so as to send the picture sharing instruction to the first terminal.
Certainly, in order to avoid the problem that the displayed picture sharing button blocks a video interface and further interferes with video viewing, in the embodiment of the present invention, the picture sharing button may be further disposed in the pull-down status bar of the first terminal, so that the user may control the first terminal to display the pull-down status bar when the user wants to share the picture, for example, the user may control the first terminal to display the pull-down status bar through a sliding down operation, and then, the user may send the picture sharing instruction by clicking the picture sharing button in the pull-down status bar. For example, fig. 3-3 are another interface schematic diagram provided by the embodiment of the present invention, as shown in fig. 3-3, the interface schematic diagram includes a video interface 01, a drop-down status bar 03, and a picture sharing button 02 displayed in the drop-down status bar 03.
Step 302, the first terminal adds jump information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress.
In an actual application scenario, the video data used when the first terminal plays the video may be obtained from the network in real time based on the uniform resource locator of the video, or may be read directly from storage inside the first terminal. Accordingly, to ensure that the uniform resource locator of the video can be obtained and that the operation of adding the skip information can proceed normally, in an embodiment of the present invention, the first terminal may perform the following steps A to C before step 302:
and step A, the first terminal determines an acquisition source of video data used when the video is played.
In this step, the first terminal may detect whether the video data is read from inside the first terminal, and if so, may determine that the acquisition source is the first terminal, and if not, may determine that the acquisition source is any device other than the first terminal.
Step B, if the acquisition source is the first terminal, the first terminal uploads the video data corresponding to the video to a designated server so as to obtain the uniform resource locator of the video.
Specifically, the first terminal may upload video data corresponding to an internally stored video to a designated server when the acquisition source is the first terminal, where the designated server may be preset, for example, the designated server may be a server corresponding to a designated cloud disk, and accordingly, the designated server may generate a corresponding uniform resource locator for the video data after receiving the video data, and return the uniform resource locator to the first terminal, thereby implementing acquisition of the uniform resource locator for the video. In the embodiment of the invention, the acquisition source of the video data used when the video is played is determined firstly, and under the condition that the acquisition source is the first terminal, the uniform resource locator is generated for the video by uploading the video data corresponding to the video, so that the subsequent adding operation can be carried out normally.
Step C, if the acquisition source is any device other than the first terminal, the first terminal determines the uniform resource locator of the video based on the acquisition address of the video data.
In this step, if the acquisition source is any device other than the first terminal, the video data of the video may be considered to have been acquired online from the network, and accordingly, the first terminal may directly use the acquisition address of the video data as the uniform resource locator of the video.
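Steps A to C amount to the following branch. `upload_to_server` is a hypothetical stand-in for the designated server's upload-and-return-URL round trip; all parameter names are illustrative, not from the patent.

```python
def resolve_video_url(source_is_first_terminal, local_path=None,
                      network_address=None, upload_to_server=None):
    """Step A decides the acquisition source; Step B uploads locally
    stored video data to a designated server to obtain a URL;
    Step C reuses the network acquisition address directly."""
    if source_is_first_terminal:
        return upload_to_server(local_path)   # Step B: server returns the URL
    return network_address                    # Step C: address already is the URL
```

For a streamed video the function simply echoes the streaming address; for a locally shot video it delegates to whatever cloud service the implementer designates.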
Further, when detecting the picture sharing operation, the first terminal may also obtain the playing progress of the video from background data of the video playing application, thereby obtaining the current playing progress. Further, when adding the skip information to the video image, the first terminal may add the skip information to the header of the picture data corresponding to the video image so that the skip information is hidden in the video image. The video image is in essence a picture; the corresponding picture data is usually stored in a binary format and generally includes a header. The header may include a designated image class, which may be the ExifInterface class. The ExifInterface class may include a plurality of parameters stored as key-value pairs, and the parameters stored in it are carried by the picture data in a hidden manner, that is, they are not displayed on the surface of the picture.
Because the skip information is often long, displaying it on the surface of the video image might interfere with viewing. Therefore, in the embodiment of the invention, the skip information can be added to the header of the picture data corresponding to the video image, which lets the video image carry the skip information while avoiding blocking the user's view.
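On Android, the ExifInterface class stores such key-value pairs in the image file's metadata. The byte-level sketch below merely imitates that idea with an invented header layout, so that the round trip (hide, then recover) can be seen end to end; none of the constants here come from the patent or from the actual EXIF format.

```python
import json
import struct

MAGIC = b"JMP0"  # invented marker; a real implementation would use EXIF tags

def embed_skip_info(picture_bytes, skip_info):
    """Prepend a hidden header carrying the skip information, so
    nothing extra appears on the image surface."""
    payload = json.dumps(skip_info).encode("utf-8")
    return MAGIC + struct.pack(">I", len(payload)) + payload + picture_bytes

def extract_skip_info(data):
    """Recover the skip information and the original picture bytes;
    return (None, data) if no hidden header is present."""
    if data[:4] != MAGIC:
        return None, data
    (length,) = struct.unpack(">I", data[4:8])
    info = json.loads(data[8:8 + length].decode("utf-8"))
    return info, data[8 + length:]
```

The essential property, shared with the EXIF approach, is that the pixel content is untouched: a viewer that ignores the header still shows the same image.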
Step 303, the first terminal sends the video image added with the skip information to the second terminal.
Specifically, the step 103 may be referred to in an implementation manner of this step, and this is not limited in this embodiment of the present invention.
Further, in order to make sharing more interesting and to help the user of the second terminal quickly learn information related to the shared video, in the embodiment of the present invention, the first terminal may further obtain attribute information of the video before step 303, where the attribute information may include at least one of the name, type, duration, and playing location of the video. Specifically, when obtaining the name, type, and duration of the video, the first terminal may search the network, or may display an information input window and receive the information input by the user through it. The playing location may be the place where the user watches the video using the first terminal, for example, Xi'an, Shenzhen, or Beijing. Accordingly, the first terminal can read collected Global Positioning System (GPS) information and determine the playing location of the video based on it.
Then, the first terminal may add the attribute information to the wrapper class of the picture data corresponding to the video image so that the attribute information is displayed on the surface of the video image. In this step, because the attribute information is short and can help the user quickly learn information related to the video, adding it to the wrapper class of the picture data means that when a terminal displays the video image based on the picture data, the attribute information is visually presented on the surface of the video image, and the user can then quickly and conveniently learn the information related to the video from it.
Further, in the embodiment of the present invention, the first terminal may also receive a comment text input by the user before step 303. Specifically, the first terminal may display a comment input window after detecting the picture sharing operation; the user may input, through the comment input window, a comment on the video plot the user wants to share; and accordingly, the first terminal may receive the comment text input through the window and then add it to the header or the wrapper class of the picture data corresponding to the video image. In this way, when the comment text is added to the header, it can be sent to the second terminal without being displayed on the video image, avoiding any impact on the user's viewing experience. When the comment text is added to the wrapper class, it can be displayed on the surface of the video image, so the user of the second terminal can intuitively see the first terminal user's view of the video, which improves interaction.
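The header-versus-wrapper choice for the comment text can be modeled as routing the comment to a hidden or a displayed slot. The two-slot dict below is purely illustrative of the routing decision, not of any real image format.

```python
def attach_comment(image_record, comment_text, show_on_surface):
    """Place the comment in the wrapper slot (rendered on the image
    surface) or in the header slot (carried invisibly), matching the
    patent's two options for where the comment text may go."""
    slot = "wrapper" if show_on_surface else "header"
    image_record.setdefault(slot, {})["comment"] = comment_text
    return image_record
```

A renderer would then display only the wrapper slot's contents while still transmitting both slots to the second terminal.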
It should be noted that, in another alternative embodiment of the present invention, other additional information, such as video resolution, audio/video coding information, decoding information, and the like, may also be added to the video image according to different requirements, and the embodiment of the present invention does not limit this.
And step 304, the second terminal receives the video image added with the skip information sent by the first terminal.
Specifically, for the implementation of this step, reference may be made to step 201 described above, and details are not described herein again in the embodiment of the present invention.
Further, attribute information of the video may also be added to the video image, where the attribute information may include at least one of a video name, a video type, and a video duration. For example, fig. 3-4 is a schematic diagram of attribute information provided in an embodiment of the present invention; fig. 3-4 takes attribute information including the video name, the video type, the video duration, and a playing location as an example, and as shown in fig. 3-4, the video name, the video type, the video duration, and the playing location are displayed in the video image. Accordingly, the second terminal may display, in an enlarged manner, the area of the video image in which the attribute information is displayed when a second specified operation on the video image is detected. The second specified operation may be a preset operation different from the first specified operation; for example, the second specified operation may be a single-finger long-press operation on the video image. Further, when the second specified operation is detected, the second terminal may confirm that the user wants the attribute information enlarged and accordingly perform the enlargement operation. By enlarging the area of the attribute information in the video image, the user can view the attribute information conveniently, which improves the convenience of learning the attribute information related to the video.
Further, the attribute information may include the type of the video. Accordingly, when a third specified operation on the video image is detected and the attribute information includes the type of the video, the second terminal may display a dynamic image matching the type of the video. The third specified operation may be a preset operation different from the first specified operation and the second specified operation; for example, the third specified operation may be a double-finger long-press operation on the video image. Specifically, after detecting the third specified operation, the second terminal may check whether the attribute information includes a video type and, if so, may search a preset correspondence between types and dynamic images for the dynamic image matching the video type. For example, assume the preset dynamic image corresponding to the type "terror" is a skull-shaped dynamic image, the dynamic image corresponding to the type "smile" is a smiling-face-shaped dynamic image, and the dynamic image corresponding to the type "cartoon" is a puppy-shaped dynamic image; if the video type is cartoon, the puppy-shaped dynamic image may be displayed. The type of the video is thus presented to the user in a more vivid manner, which can improve the interactivity of sharing to a certain extent. For example, fig. 3-5 is a schematic diagram of a dynamic image provided by an embodiment of the present invention; as shown in fig. 3-5, a puppy-shaped dynamic image is displayed on the video image.
And step 305, displaying the video image.
Specifically, the implementation manner of this step may refer to step 202, which is not described herein again in this embodiment of the present invention.
Step 306, if the second terminal detects the first specified operation on the video image, the second terminal starts playing the video from the playing progress based on the uniform resource locator of the video and the playing progress.
Specifically, for the implementation of this step, reference may be made to step 203 described above, and details are not described herein again in the embodiment of the present invention.
Further, a comment text may also be added to the video image. Accordingly, the second terminal may extract the comment text and display it on the video playing interface in the process of playing the video; specifically, the comment text may be displayed on the video interface in a dynamic manner. Because the user of the second terminal can, to a certain extent, learn the plot the user of the first terminal wants to share through the played video, displaying the comment text during playback allows the user of the second terminal to more fully experience the emotion in the comment text, further improving the interactivity of sharing.
In summary, in the video image sharing method provided in the embodiment of the present invention, in the process of playing a video, if a first terminal detects a picture sharing operation, it obtains the currently played video image, then adds skip information to the video image, where the skip information includes a uniform resource locator of the video and the current playing progress, and finally sends the video image to a second terminal. The second terminal receives the video image added with the skip information, displays the video image, and, when a first specified operation on the video image is detected, starts playing the video from the playing progress based on the uniform resource locator and the playing progress of the video. The user usually controls the first terminal to execute the sharing operation when the video is played to an interesting plot; that is, the picture sharing instruction can be considered to be issued when the user sees the plot to be shared, so the current playing progress can be considered the playing progress corresponding to that plot, and the added uniform resource locator of the video indicates where the video data of the video can be acquired. The user of the second terminal therefore does not need to manually search for the video or adjust the progress, and the second terminal can play the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
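The patent does not specify a wire format for the skip information; as a hedged illustration only, the uniform resource locator and playing progress could be serialized and recovered as follows (JSON wrapped in base64 is purely an assumption made here):

```python
import base64
import json

# Illustrative serialization of the skip information carried in the shared
# video image; the concrete format is an assumption, not the patent's own.

def pack_skip_info(url: str, progress_seconds: int) -> bytes:
    """First terminal: serialize the skip information for embedding."""
    payload = {"url": url, "progress": progress_seconds}
    return base64.b64encode(json.dumps(payload).encode())

def unpack_skip_info(blob: bytes) -> tuple[str, int]:
    """Second terminal: recover the uniform resource locator and progress."""
    payload = json.loads(base64.b64decode(blob))
    return payload["url"], payload["progress"]
```

Any unambiguous, byte-safe encoding would do equally well here; the only requirement implied by the method is that the second terminal can recover both fields intact from the picture data.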
Fig. 4 is a block diagram of a video image sharing apparatus according to an embodiment of the present invention, and as shown in fig. 4, the apparatus 40 may include:
the first obtaining module 401 is configured to, in a process of playing a video, obtain a currently played video image if a picture sharing operation is detected.
A first adding module 402, configured to add skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress.
A sending module 403, configured to send the video image to which the skip information is added to a second terminal, where the video image is used to instruct the second terminal to play the video according to the skip information.
In summary, the apparatus provided in the embodiment of the present invention can implement each process implemented by the first terminal in the foregoing method embodiment, and details are not described herein again to avoid repetition. In the apparatus provided by the embodiment of the invention, in the process of playing a video, if a picture sharing operation is detected, the first obtaining module obtains the currently played video image, the first adding module then adds skip information to the video image, where the skip information includes the uniform resource locator of the video and the current playing progress, and finally the sending module sends the video image to the second terminal. Therefore, the user of the second terminal does not need to manually search for the video or adjust the progress, and the second terminal can start playing the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
Optionally, the apparatus 40 further includes:
the first determining module is used for determining an acquisition source of video data used when the video is played.
The uploading module is used for uploading the video data corresponding to the video to a designated server to acquire a uniform resource locator of the video if the acquisition source is the first terminal; or,
a second determining module, configured to determine, if the acquisition source is any device other than the first terminal, a uniform resource locator of the video based on the acquisition address of the video data.
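The two determination branches above can be sketched as follows; `resolve_video_url` and `upload_to_server` are hypothetical names introduced here for illustration, not interfaces defined by the patent:

```python
from typing import Callable, Optional

# Sketch of the first/second determining modules: a locally sourced video must
# first be uploaded to the designated server to obtain a URL, while a video
# acquired from any other device is already locatable by its acquisition address.

def resolve_video_url(source: str,
                      first_terminal_id: str,
                      acquisition_address: Optional[str],
                      upload_to_server: Callable[[], str]) -> str:
    if source == first_terminal_id:
        # Acquisition source is the first terminal itself: upload the video
        # data and use the uniform resource locator returned by the server.
        return upload_to_server()
    if acquisition_address is None:
        raise ValueError("remote source requires an acquisition address")
    return acquisition_address
```

The upload callback keeps the sketch independent of any particular server API, which the patent leaves unspecified.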
Optionally, the first adding module is configured to:
and adding the skip information into a header of picture data corresponding to the video image so as to hide the skip information in the video image.
Optionally, the apparatus 40 further includes:
the second acquisition module is used for acquiring the attribute information of the video; the attribute information includes at least one of a name, a type, a duration, and a playing location of the video.
And the second adding module is used for adding the attribute information into the packet class of the picture data corresponding to the video image so as to display the attribute information on the surface of the video image.
Optionally, the apparatus 40 further includes:
and the receiving module is used for receiving comment texts input by users.
And the writing module is used for adding the comment text into a header or a packet class of the picture data corresponding to the video image.
In summary, the apparatus provided in the embodiment of the present invention can implement each process implemented by the first terminal in the method embodiment of fig. 3-1, and details are not described here again to avoid repetition. According to the apparatus provided by the embodiment of the invention, in the process of playing a video, if the first obtaining module detects a picture sharing operation, the currently played video image is obtained; the first adding module then adds skip information to the video image, where the skip information includes the uniform resource locator of the video and the current playing progress; the sending module sends the video image to the second terminal; and the second adding module adds attribute information to the video image, so that the user of the second terminal can quickly and conveniently learn information related to the video. In the embodiment of the invention, the user of the second terminal does not need to manually search for the video or adjust the progress, and the second terminal can start playing the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
Fig. 5 is a block diagram of a video image sharing apparatus according to an embodiment of the present invention, and as shown in fig. 5, the apparatus 50 may include:
a receiving module 501, configured to receive a video image added with skip information and sent by a first terminal; the skip information comprises a uniform resource locator and a playing progress of the video to which the video image belongs.
A first display module 502, configured to display the video image.
A playing module 503, configured to start playing the video from the playing progress based on the uniform resource locator of the video and the playing progress if the first specified operation on the video image is detected.
In summary, the apparatus provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiment of fig. 2, and is not described herein again to avoid repetition. In the apparatus provided in the embodiment of the present invention, the receiving module receives a video image added with skip information sent by a first terminal, where the skip information includes the uniform resource locator and the playing progress of the video to which the video image belongs; the first display module then displays the video image, and the playing module starts playing the video from the playing progress based on the uniform resource locator and the playing progress of the video when a first specified operation on the video image is detected. Therefore, the user of the second terminal does not need to manually search for the video or adjust the progress, and the second terminal can start playing the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
Optionally, comment text is added to the video image.
The apparatus 50 further comprises:
and the extraction module is used for extracting the comment text.
And the second display module is used for displaying the comment text on a playing interface of the video in the process of playing the video.
Optionally, the video image is further added with attribute information of the video; the apparatus 50 further comprises:
an amplifying module, configured to display, in an enlarged manner, the area of the video image in which the attribute information is displayed if the second specified operation on the video image is detected; and/or,
and the third display module is used for displaying the dynamic image matched with the type of the video if the third specified operation on the video image is detected and the attribute information comprises the type of the video.
In summary, the apparatus provided in the embodiment of the present invention can implement each process implemented by the second terminal in the method embodiment of fig. 3-1, and is not described herein again to avoid repetition. In the apparatus provided in the embodiment of the present invention, the receiving module receives a video image added with skip information sent by the first terminal, where the skip information includes the uniform resource locator and the playing progress of the video to which the video image belongs; the playing module then plays the video from the playing progress based on the uniform resource locator and the playing progress of the video when a first specified operation on the video image is detected, and the second display module displays a comment text in the process of playing the video, improving interactivity. In the embodiment of the invention, the user of the second terminal does not need to manually search for the video or adjust the progress, and the second terminal can start playing the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
Fig. 6 is a block diagram of a video image sharing system according to an embodiment of the present invention, and as shown in fig. 6, the system 60 may include: a first terminal 601 and a second terminal 602.
The first terminal 601 is configured to, in a process of playing a video, obtain a currently played video image if a picture sharing operation is detected.
The first terminal 601 is configured to add skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress.
The first terminal 601 is configured to send the video image to which the skip information is added to the second terminal 602.
The second terminal 602 is configured to receive the video image added with the skip information sent by the first terminal 601.
The second terminal 602 is configured to display the video image.
The second terminal 602 is configured to, if a first specified operation on the video image is detected, start playing the video from the playing progress based on the uniform resource locator of the video and the playing progress.
In summary, in the system provided in the embodiment of the present invention, in the process of playing a video, if the first terminal detects a picture sharing operation, it obtains the currently played video image, then adds skip information to the video image, where the skip information includes the uniform resource locator of the video and the current playing progress, and finally sends the video image to the second terminal; the second terminal receives the video image added with the skip information, displays the video image, and, when a first specified operation on the video image is detected, starts playing the video from the playing progress based on the uniform resource locator and the playing progress of the video. Therefore, the user of the second terminal does not need to manually search for the video or adjust the progress, and the second terminal can start playing the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
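The end-to-end flow of the system of fig. 6 can be sketched as follows; every function and class name here (`share`, `on_first_specified_operation`, `StubPlayer`) is a hypothetical illustration rather than the patent's own:

```python
# Illustrative end-to-end flow: first terminal bundles frame + skip info,
# second terminal opens the URL and resumes at the shared playing progress.

def share(url: str, progress: int, frame: bytes) -> dict:
    """First terminal: bundle the captured frame with the skip information."""
    return {"frame": frame, "skip": {"url": url, "progress": progress}}

class StubPlayer:
    """Minimal stand-in for the second terminal's video player."""
    def __init__(self) -> None:
        self.opened = None
        self.position = 0

    def open(self, url: str) -> None:
        self.opened = url

    def seek(self, seconds: int) -> None:
        self.position = seconds

def on_first_specified_operation(shared: dict, player) -> None:
    """Second terminal: open the video's URL and seek to the shared progress."""
    skip = shared["skip"]
    player.open(skip["url"])
    player.seek(skip["progress"])
```

The stub player only records the calls it receives, which is enough to show that no manual searching or progress adjustment is left to the user of the second terminal.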
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 700 includes but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 7 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted mobile terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to, in a process of playing a video, obtain a currently played video image if a picture sharing operation is detected; adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress; and sending the video image added with the skip information to a second terminal, wherein the video image is used for indicating the second terminal to play the video according to the skip information.
In the embodiment of the invention, in the process of playing a video, if a picture sharing operation is detected, the first terminal acquires the currently played video image, then adds skip information to the video image, where the skip information includes the uniform resource locator of the video and the current playing progress, and finally sends the video image to the second terminal. The user often controls the first terminal to execute the sharing operation when the video is played to an interesting plot; that is, the picture sharing instruction can be considered to be issued when the user sees the plot to be shared. The user of the second terminal therefore does not need to manually search for the video or adjust the progress, and the second terminal can play the video from the corresponding playing progress based on the skip information in the video image, which simplifies the operation steps and improves the sharing efficiency.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used for receiving and sending signals during message transmission and reception or during a call. Specifically, the radio frequency unit 701 receives downlink data from a base station and delivers it to the processor 710 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 701.
The mobile terminal 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the mobile terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 700 or may be used to transmit data between the mobile terminal 700 and external devices.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 709 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components. Preferably, the power supply 711 may be logically coupled to the processor 710 via a power management system, so that charging, discharging, and power consumption are managed through the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown, and thus will not be described in detail herein.
Optionally, an embodiment of the present invention further provides a mobile terminal, including a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710, where the computer program, when executed by the processor 710, implements each process of the video image sharing method embodiment, and can achieve the same technical effect, and is not described herein again to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the video image sharing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Unless otherwise limited, an element defined by the phrase "comprises a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a mobile terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (18)

1. A video image sharing method is applied to a first terminal, and is characterized by comprising the following steps:
in the process of playing a video, if picture sharing operation is detected, acquiring a currently played video image;
adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress;
and sending the video image added with the skip information to a second terminal, wherein the video image is used for indicating the second terminal to play the video according to the skip information.
2. The method according to claim 1, wherein after the detecting of the picture sharing operation and before the adding of the skip information to the video image, the method further comprises:
determining an acquisition source of video data used when the video is played;
if the acquisition source is the first terminal, uploading video data corresponding to the video to a designated server to acquire a uniform resource locator of the video; or,
and if the acquisition source is any equipment except the first terminal, determining the uniform resource locator of the video based on the acquisition address of the video data.
3. The method of claim 2, wherein adding skip information to the video image comprises:
and adding the skip information into a header of picture data corresponding to the video image so as to hide the skip information in the video image.
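Claims 1 to 3 describe hiding a uniform resource locator and a playing progress inside the picture data of the shared frame. As an illustrative sketch only, not the patented implementation (the claim attaches the data to the picture header, and the `SKIP` marker and JSON payload below are assumptions for the example), the same effect can be approximated by appending a length-prefixed trailer after the image bytes, which ordinary image viewers ignore:

```python
import json
import struct

MAGIC = b"SKIP"  # hypothetical marker, not taken from the patent


def add_skip_info(picture: bytes, url: str, progress_s: int) -> bytes:
    """Append hidden jump info (URL + playing progress) after the image bytes.

    Viewers render the picture unchanged because they ignore trailing data,
    so the skip information stays invisible in the shared image.
    """
    payload = json.dumps({"url": url, "progress": progress_s}).encode("utf-8")
    return picture + payload + struct.pack(">I", len(payload)) + MAGIC


def read_skip_info(data: bytes):
    """Recover the jump info, or None if the picture carries no trailer."""
    if data[-4:] != MAGIC:
        return None
    (length,) = struct.unpack(">I", data[-8:-4])
    return json.loads(data[-8 - length:-8].decode("utf-8"))


frame = b"\x89PNG\r\n\x1a\n..."  # stand-in for the captured video frame
shared = add_skip_info(frame, "https://example.com/v.mp4", 754)
print(read_skip_info(shared))  # {'url': 'https://example.com/v.mp4', 'progress': 754}
```

A real implementation would more likely use a metadata container the format already provides (EXIF for JPEG, a `tEXt` chunk for PNG), but the trailer keeps the sketch dependency-free.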
4. The method according to any one of claims 1 to 3, wherein before sending the video image with the skip information added to the second terminal, the method further comprises:
acquiring attribute information of the video; the attribute information comprises at least one of the name, the type, the duration and the playing place of the video;
and adding the attribute information into a packet class of picture data corresponding to the video image so as to display the attribute information on the surface of the video image.
5. The method according to any one of claims 1 to 3, wherein before sending the video image with the skip information added to the second terminal, the method further comprises:
receiving comment texts input by a user;
and adding the comment text into a header or a packet class of the picture data corresponding to the video image.
6. A video image sharing method is applied to a second terminal, and is characterized by comprising the following steps:
receiving a video image added with skip information and sent by a first terminal; the skip information comprises a uniform resource locator and a playing progress of a video to which the video image belongs;
displaying the video image;
and if a first specified operation on the video image is detected, starting to play the video from the playing progress based on the uniform resource locator of the video and the playing progress.
7. The method according to claim 6, wherein the video image is further added with comment text;
after the video is played from the play progress, the method further includes:
extracting the comment text;
and displaying the comment text on a playing interface of the video in the process of playing the video.
8. The method according to claim 6 or 7, wherein the video image is further added with attribute information of the video;
after receiving the video image added with the skip information sent by the first terminal, the method further comprises:
if a second specified operation on the video image is detected, displaying, in an enlarged manner, an area of the video image in which the attribute information is displayed; and/or
and if the third specified operation on the video image is detected and the attribute information comprises the type of the video, displaying a dynamic image matched with the type of the video.
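On the receiving side, claims 6 to 8 boil down to turning the embedded uniform resource locator and playing progress into a resume request. A minimal sketch under stated assumptions (encoding the progress as a `t` query parameter is a convention some web players use, not something the patent specifies; a native player would instead open the URL and seek):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit


def resume_url(video_url: str, progress_s: int) -> str:
    """Build a playback URL that starts the video at the saved progress.

    Merges a 't' (seconds) parameter into the existing query string so any
    parameters already present in the shared locator are preserved.
    """
    parts = urlsplit(video_url)
    query = dict(parse_qsl(parts.query))
    query["t"] = str(progress_s)
    return urlunsplit(parts._replace(query=urlencode(query)))


print(resume_url("https://example.com/watch?v=abc", 754))
# https://example.com/watch?v=abc&t=754
```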
9. A video image sharing device, applied to a first terminal, characterized in that the device comprises:
the first acquisition module is used for acquiring a currently played video image if picture sharing operation is detected in the process of playing a video;
the first adding module is used for adding jump information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress;
and the sending module is used for sending the video image added with the skip information to a second terminal, and the video image is used for indicating the second terminal to play the video according to the skip information.
10. The apparatus of claim 9, further comprising:
the first determining module is used for determining an acquisition source of video data used when the video is played;
the uploading module is used for uploading the video data corresponding to the video to a designated server to acquire a uniform resource locator of the video if the acquisition source is the first terminal; or,
a second determining module, configured to determine, if the acquisition source is any device other than the first terminal, a uniform resource locator of the video based on the acquisition address of the video data.
11. The apparatus of claim 10, wherein the first adding module is configured to:
and adding the skip information into a header of picture data corresponding to the video image so as to hide the skip information in the video image.
12. The apparatus of any of claims 9 to 11, further comprising:
the second acquisition module is used for acquiring the attribute information of the video; the attribute information comprises at least one of the name, the type, the duration and the playing place of the video;
and the second adding module is used for adding the attribute information into the packet class of the picture data corresponding to the video image so as to display the attribute information on the surface of the video image.
13. The apparatus of any of claims 9 to 11, further comprising:
the receiving module is used for receiving comment texts input by users;
and the writing module is used for adding the comment text into a header or a packet class of the picture data corresponding to the video image.
14. A video image sharing device, applied to a second terminal, characterized in that the device comprises:
the receiving module is used for receiving the video image added with the skip information and sent by the first terminal; the skip information comprises a uniform resource locator and a playing progress of a video to which the video image belongs;
the first display module is used for displaying the video image;
and the playing module is used for starting playing the video from the playing progress based on the uniform resource locator of the video and the playing progress if the first specified operation on the video image is detected.
15. The apparatus according to claim 14, wherein the video image is further added with comment text;
the device further comprises:
the extraction module is used for extracting the comment text;
and the second display module is used for displaying the comment text on a playing interface of the video in the process of playing the video.
16. The apparatus according to claim 14 or 15, wherein the video image is further added with attribute information of the video;
the device further comprises:
the amplifying module is used for displaying, in an enlarged manner, an area of the video image in which the attribute information is displayed, if a second specified operation on the video image is detected; and/or
and the third display module is used for displaying the dynamic image matched with the type of the video if the third specified operation on the video image is detected and the attribute information comprises the type of the video.
17. A video image sharing system is characterized by comprising a first terminal and a second terminal;
the first terminal is used for acquiring a currently played video image if a picture sharing operation is detected in the process of playing a video;
the first terminal is used for adding skip information to the video image; the skip information comprises a uniform resource locator of the video and the current playing progress;
the first terminal is used for sending the video image added with the skip information to the second terminal;
the second terminal is used for receiving the video image added with the skip information and sent by the first terminal;
the second terminal is used for displaying the video image;
and the second terminal is used for starting to play the video from the playing progress based on the uniform resource locator of the video and the playing progress if a first specified operation on the video image is detected.
18. A mobile terminal comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program when executed by the processor implementing the steps of the video image sharing method according to any one of claims 1 to 5 or 6 to 8.
CN201910464515.XA 2019-05-30 2019-05-30 A kind of video image sharing method, device and mobile terminal Pending CN110087149A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910464515.XA CN110087149A (en) 2019-05-30 2019-05-30 A kind of video image sharing method, device and mobile terminal


Publications (1)

Publication Number Publication Date
CN110087149A true CN110087149A (en) 2019-08-02

Family

ID=67422702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910464515.XA Pending CN110087149A (en) 2019-05-30 2019-05-30 A kind of video image sharing method, device and mobile terminal

Country Status (1)

Country Link
CN (1) CN110087149A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103716703A (en) * 2012-10-09 2014-04-09 腾讯科技(深圳)有限公司 Video playing method and apparatus
CN103942327A (en) * 2014-04-29 2014-07-23 联想(北京)有限公司 Information sharing method and device
JP2015026904A (en) * 2013-07-24 2015-02-05 株式会社リコー Image projection system, operation device and image projection device specification method
CN106470147A (en) * 2015-08-18 2017-03-01 腾讯科技(深圳)有限公司 Video sharing method and apparatus, video broadcasting method and device


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110536028A (en) * 2019-08-15 2019-12-03 咪咕文化科技有限公司 Video color ring realization method, color ring platform and terminal
CN110536028B (en) * 2019-08-15 2021-10-26 咪咕文化科技有限公司 Video color ring realization method, color ring platform, terminal and storage medium
CN110784771A (en) * 2019-10-30 2020-02-11 维沃移动通信有限公司 Video sharing method and electronic equipment
CN110784771B (en) * 2019-10-30 2022-02-08 维沃移动通信有限公司 Video sharing method and electronic equipment
CN110933509A (en) * 2019-12-09 2020-03-27 北京字节跳动网络技术有限公司 Information publishing method and device, electronic equipment and storage medium
CN111369848A (en) * 2020-02-14 2020-07-03 广州视源电子科技股份有限公司 Courseware content interaction based method and device, storage medium and electronic equipment
WO2021179931A1 (en) * 2020-03-13 2021-09-16 华为技术有限公司 Url screen projection method and apparatus
CN114727141A (en) * 2022-03-21 2022-07-08 康键信息技术(深圳)有限公司 Video synchronous playing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110087117B (en) Video playing method and terminal
CN108737904B (en) Video data processing method and mobile terminal
CN107846352B (en) Information display method and mobile terminal
CN109525874B (en) Screen capturing method and terminal equipment
CN110784771B (en) Video sharing method and electronic equipment
CN111314784B (en) Video playing method and electronic equipment
CN110087149A (en) A kind of video image sharing method, device and mobile terminal
CN110109593B (en) Screen capturing method and terminal equipment
CN107908765B (en) Game resource processing method, mobile terminal and server
CN111309218A (en) Information display method and device and electronic equipment
CN109857297B (en) Information processing method and terminal equipment
CN108616771B (en) Video playing method and mobile terminal
CN109922294B (en) Video processing method and mobile terminal
CN107734170B (en) Notification message processing method, mobile terminal and wearable device
CN109189303B (en) Text editing method and mobile terminal
CN109412932B (en) Screen capturing method and terminal
CN110719527A (en) Video processing method, electronic equipment and mobile terminal
CN108616772B (en) Bullet screen display method, terminal and server
CN109618218B (en) Video processing method and mobile terminal
CN108366221A (en) A kind of video call method and terminal
CN109495638B (en) Information display method and terminal
CN106101764A (en) A kind of methods, devices and systems showing video data
CN108600079B (en) Chat record display method and mobile terminal
CN111383175A (en) Picture acquisition method and electronic equipment
CN107809674A (en) A kind of customer responsiveness acquisition, processing method, terminal and server based on video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190802