CN112751744A - Method and device for controlling video playing, computing equipment and storage medium


Info

Publication number
CN112751744A
Authority
CN
China
Prior art keywords
video
interface
playing
triggering
chat
Prior art date
Legal status
Granted
Application number
CN201911056014.4A
Other languages
Chinese (zh)
Other versions
CN112751744B (en)
Inventor
彭傲
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911056014.4A priority Critical patent/CN112751744B/en
Priority to CN202210868991.XA priority patent/CN115051965B/en
Publication of CN112751744A publication Critical patent/CN112751744A/en
Application granted granted Critical
Publication of CN112751744B publication Critical patent/CN112751744B/en
Legal status: Active (granted)

Classifications

    • H04L 51/046: Interoperability with other network applications or services (under H04L 51/04, Real-time or near real-time messaging, e.g. instant messaging [IM], and H04L 51/00, User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail; Section H, ELECTRICITY; H04, ELECTRIC COMMUNICATION TECHNIQUE; H04L, TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION)
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; content or additional data rendering (under H04N 21/43, Processing of content or additional data, and H04N 21/40, Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; H04N 21/00, Selective content distribution, e.g. interactive television or video on demand [VOD]; H04N, PICTORIAL COMMUNICATION, e.g. TELEVISION)
    • H04N 21/4788: Supplemental services communicating with other users, e.g. chatting (under H04N 21/478, Supplemental services, e.g. displaying phone caller identification, shopping application, and H04N 21/47, End-user applications)

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The application discloses a method, an apparatus, a computing device and a storage medium for controlling video playing, belonging to the field of computer technologies. The method includes: receiving a first trigger operation on video entry indication information in a chat conversation interface; displaying, in response to the first trigger operation, the chat conversation interface together with a video playing interface corresponding to a target video; or displaying a video selection interface in response to the first trigger operation, and displaying the chat conversation interface together with the video playing interface corresponding to the target video in response to a selection operation on the target video in the video selection interface. By operating the video entry indication information embedded in the chat conversation interface, video playback can be launched directly from the chat conversation, providing an interaction mode in which the chat conversation window and the video play concurrently and also improving video playing efficiency.

Description

Method and device for controlling video playing, computing equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling video playing, a computing device, and a storage medium.
Background
With the development of communication and electronic technology, users' lives have become increasingly convenient. For example, instant messaging is now part of everyday life: users can socialize over the network and chat instantly, which is very convenient. In addition, current video services bring users a better entertainment experience; based on the video services provided by application providers, users can watch videos online on various terminal devices, such as television series, sports events, variety shows, and the like.
When a user chats with others, a chat conversation window presents the chat content; when the user watches a video, a video playing window also needs to be presented. How to make the instant messaging function and the video playing function coexist on the same screen is therefore a problem that needs to be considered.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a computing device and a storage medium for controlling video playing, and provide an interaction mode in which an instant messaging function and a video playing function coexist on the same screen.
In one aspect, a method for controlling video playing is provided, the method comprising:
receiving a first trigger operation on video entry indication information in a chat conversation interface;
displaying the chat conversation interface and a video playing interface corresponding to a target video in response to the first trigger operation; or
displaying a video selection interface in response to the first trigger operation, and displaying the chat conversation interface and the video playing interface corresponding to the target video in response to a selection operation on the target video in the video selection interface.
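By way of illustration only, the following Kotlin sketch models the two branches just described (direct playback versus playback after a selection interface); all type and function names, such as VideoPlaybackController and onFirstTriggerOperation, are hypothetical and are not taken from the patent.

```kotlin
// Hypothetical sketch of the described control flow; names are illustrative only.
sealed class TriggerResult {
    // Branch 1: a target video is determined directly and played alongside the chat.
    data class PlayDirectly(val targetVideoId: String) : TriggerResult()
    // Branch 2: a video selection interface is shown first; the user picks the target video.
    object ShowSelectionInterface : TriggerResult()
}

class VideoPlaybackController(
    private val recommendTarget: () -> String?,        // preset recommendation policy (may yield nothing)
    private val showChatWithPlayer: (String) -> Unit,  // same-screen display of chat + player
    private val showSelectionInterface: () -> Unit     // online/local video selection UI
) {
    /** Called when the first trigger operation on the video entry indication is received. */
    fun onFirstTriggerOperation(): TriggerResult {
        val target = recommendTarget()
        return if (target != null) {
            showChatWithPlayer(target)                  // step 402: chat + video on one screen
            TriggerResult.PlayDirectly(target)
        } else {
            showSelectionInterface()                    // step 403: let the user choose
            TriggerResult.ShowSelectionInterface
        }
    }

    /** Called when the user selects a target video in the selection interface. */
    fun onTargetVideoSelected(videoId: String) {
        showChatWithPlayer(videoId)
    }
}
```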
In one aspect, an apparatus for controlling video playback is provided, the apparatus comprising:
a receiving module, configured to receive a first trigger operation on video entry indication information in a chat conversation interface;
a display module, configured to display the chat conversation interface and a video playing interface corresponding to a target video in response to the first trigger operation; or to display a video selection interface in response to the first trigger operation, and display the chat conversation interface and the video playing interface corresponding to the target video in response to a selection operation on the target video in the video selection interface.
In one possible implementation, the display module is configured to:
determining the target video according to a preset recommendation policy in response to the first trigger operation, and displaying the chat conversation interface and the video playing interface corresponding to the target video; or
determining the video selection interface according to an operation type of the first trigger operation and displaying the video selection interface, where the video selection interface includes a local video selection interface or an online video selection interface.
In a possible implementation manner, the receiving module is further configured to receive a second trigger operation for the video playing interface;
and the display module is further used for displaying a video playing operation control associated with the video playing interface according to the triggering of the second triggering operation.
In a possible implementation manner, the video playing operation control includes a minimization control, and the receiving module is further configured to receive a third trigger operation for the minimization control;
the display module is further configured to hide the video playing window according to the trigger of the third trigger operation, and display minimized playing indication information, where the minimized playing indication information includes a video identifier of the target video currently being played.
In a possible implementation manner, the receiving module is further configured to receive a fourth trigger operation for the minimized playing indication information after the display module displays the minimized playing indication information;
the display module is further configured to, in response to the fourth trigger operation, terminate playing of the target video and hide the minimized playing indication information; or to hide the minimized playing indication information and resume displaying the video playing window in response to the fourth trigger operation.
In one possible implementation, the display module is further configured to:
obtaining at least one piece of video playing indication information, each piece corresponding to a video currently being watched by a chat object of the chat conversation interface, where the video playing indication information includes a video identifier, or includes the video identifier and playing progress information;
and displaying the at least one piece of video playing indication information in the chat conversation interface.
In one possible implementation, the display module is configured to:
if the chat conversation interface is a private chat conversation interface, displaying the video playing indication information in a first preset area of the chat conversation interface;
if the chat conversation interface is a group chat conversation interface, dynamically displaying the at least one piece of video playing indication information in a second preset area of the chat conversation interface, or displaying each chat object's associated video playing indication information in a specified area associated with that chat object's user identifier.
In a possible implementation manner, the receiving module is further configured to receive a fifth trigger operation on target video playing indication information among the at least one piece of video playing indication information;
and the display module is further configured to display, in response to the fifth trigger operation, the video content of the video indicated by the target video playing indication information through the video playing interface.
In a possible implementation manner, the receiving module is further configured to receive a sixth trigger operation for the video playing interface or the minimized playing indication information;
the device further comprises a sending module, configured to send video watching invitation information to a chat object corresponding to the chat conversation interface according to the trigger of the sixth trigger operation, where the video watching invitation information is determined according to the play address of the target video.
In a possible implementation manner, the receiving module is further configured to receive a seventh triggering operation for triggering the playing of the voice chat message in the chat conversation interface;
the device further comprises a volume control module, which is used for reducing the playing volume of the target video when the receiving module receives the seventh trigger operation.
In a possible implementation manner, the receiving module is further configured to receive an eighth trigger operation for the video playing interface after the display module displays the chat conversation interface and the video playing interface corresponding to the target video;
the display module is further configured to, in response to the eighth trigger operation, hide the chat conversation interface and switch the interface display, and, when the switched-to application interface is a predetermined application interface, display the switched-to application interface together with the video playing interface.
In a possible implementation manner, the receiving module is further configured to receive a ninth triggering operation for the video switching indication information;
the display module is further configured to switch the video played by the video playing interface from the target video to the video to be switched and played according to the trigger of the ninth trigger operation.
In one aspect, a computing device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the method for controlling video playing described in the various possible implementations above.
In one aspect, a storage medium is provided, which stores computer-executable instructions for causing a computer to perform the steps included in the method for controlling video playing described in the above-mentioned various possible implementations.
In one aspect, a computer program product containing instructions is provided, which when run on a computer causes the computer to perform the steps included in the method for controlling video playback described in the above various possible implementations.
In the embodiments of the present application, a shortcut entry for playing videos is embedded in the chat conversation interface. This shortcut entry serves as a centralized push entry for videos: it is associated with videos that can be played and is indicated by the video entry indication information. After the user performs the first trigger operation on the video entry indication information, for example by clicking on it, the system automatically determines the video that currently needs to be played (referred to as the target video), or determines it according to a selection operation of the user, then plays the target video while displaying the current chat conversation interface and the video playing interface that plays the target video on the same screen. Same-screen display of the chat conversation interface and the video playing interface is thus realized, providing a product interaction scheme in which the chat conversation window and the video playing window coexist on the same screen.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic diagram of a chat list interface;
FIG. 2 is a schematic diagram of a chat conversation interface;
FIG. 3 is a schematic diagram of an application scenario in which the present application is applied;
fig. 4 is a flowchart of a method for controlling video playback in an embodiment of the present application;
FIG. 5 is another schematic diagram of a chat list interface in an embodiment of the application;
fig. 6 is a schematic diagram illustrating that the video play entry indication information is displayed in the chat window in the embodiment of the present application;
FIG. 7 is a schematic diagram of selecting an online video in an embodiment of the present application;
FIG. 8 is a diagram illustrating selection of a local video in an embodiment of the present application;
fig. 9 is a schematic diagram of calling up video playing operation controls in an embodiment of the present application;
fig. 10 is a schematic diagram illustrating minimization control of a video playback window in an embodiment of the present application;
fig. 11 is a schematic diagram of performing full-screen playing on a video playing window in the embodiment of the present application;
fig. 12 is a schematic diagram of closing a video playing window in the embodiment of the present application;
fig. 13 is a schematic diagram of displaying video playing indication information of a chat object in an embodiment of the present application;
fig. 14 is a schematic diagram illustrating one-key video synchronization playing in the embodiment of the present application;
fig. 15 is another schematic diagram of performing one-key video synchronous playback in the embodiment of the present application;
FIG. 16 is a schematic diagram of inviting users to watch videos in an embodiment of the present application;
FIG. 17 is a schematic diagram illustrating that a video continues to be played in a predetermined application interface in an embodiment of the present application;
FIG. 18 is another schematic diagram of continuing to play a video at a predetermined application interface in an embodiment of the present application;
fig. 19a is a block diagram of an apparatus for controlling video playback according to an embodiment of the present application;
fig. 19b is another block diagram of the apparatus for controlling video playback in the embodiment of the present application;
fig. 20 is a schematic structural diagram of a computing device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the embodiments of the present application will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by one of ordinary skill in the art from the embodiments given herein without making any creative effort, shall fall within the scope of the claimed protection. In the present application, the embodiments and features of the embodiments may be arbitrarily combined with each other without conflict. Also, while a logical order is shown in the flow diagrams, in some cases, the steps shown or described may be performed in an order different than here.
The terms "first" and "second" in the description and claims of the present application and in the drawings above are used to distinguish different objects rather than to describe a particular order. Furthermore, the term "comprises" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may include other steps or elements not listed or inherent to such process, method, article, or apparatus. "Plurality" in the present application means at least two, for example two, three or more, and the embodiments of the present application are not limited in this respect.
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" in this document generally indicates that the preceding and following related objects are in an "or" relationship unless otherwise specified.
Some technical terms referred to herein are explained below to facilitate understanding by those skilled in the art.
1. Instant Messaging (IM) refers to a service capable of sending and receiving internet messages in real time. It allows two or more people to exchange text messages, documents, voice and video in real time over a network, and has developed into a comprehensive information platform integrating communication, information, entertainment, search, e-commerce, office collaboration, enterprise customer service and the like.
2. Chat list. In the instant messaging service, the chat list carries window identifications of chat conversation windows with instant messaging objects (also referred to as chat objects). A window identification identifies the corresponding chat conversation window, for example the group identification (e.g., group chat name) of a group chat or the user identification (e.g., user nickname) of the chat object in a private chat, and may further include the last chat message in that chat conversation window. That is, a chat conversation window can be identified by its window identification.
The chat list may be presented as a separate interface or through a chat list window. An interface that presents the chat list on its own can be regarded as a maximized chat list window, so the chat list may also be called a chat list window or a chat list interface.
Referring to the schematic diagram of the chat list shown in fig. 1, "MQ" is the application name of the instant messaging application, and the "32" after it indicates 32 unread messages. The dashed boxes and dashed circle in the upper right corner of the page mark some application controls: "+" is a control for adding a friend, the control circled by the dashed circle is a contacts control, and the avatar framed by the rightmost dashed box is the avatar of the user currently using the instant messaging application. The currently displayed "recent chat" page lists the window identifications corresponding to four chat conversation windows, namely "leaf guess", "baby health communication group", "ant fast run" and "flower world", where "leaf guess" and "ant fast run" correspond to private chat windows and "baby health communication group" and "flower world" correspond to group chat windows.
3. Chat conversation interface, which may also be called a chat conversation window or chat window, is a window for direct instant message interaction with a chat object. Chat conversation windows include group chat windows and private chat windows. A group chat window is the chat window corresponding to an instant messaging group, which may contain multiple chat objects; information sent by one chat object in the group chat window can be seen by all other chat objects. A private chat window only involves message interaction between two users. For example, provided that user A and user B are social friends, user A may chat online with user B through a private chat window, or through the group chat window corresponding to a group chat that both user A and user B have joined.
Referring to fig. 2, after the user clicks "leaf guess" in the chat list interface shown in the left diagram of fig. 2, the page jumps from the chat list interface to the chat conversation interface with "leaf guess" shown in the right diagram of fig. 2, where the chat content between the user himself (i.e., Zhang San) and the chat object nicknamed "leaf guess" can be seen. While chatting with "leaf guess", the user may communicate through text as shown in the right diagram of fig. 2, through voice messages, or through voice or video calls, and so on.
The chat conversation window with "leaf guess" shown in the right diagram of fig. 2 is a private chat window. If the user instead clicks the "baby health communication group" in the chat list interface, the page jumps to the group chat window corresponding to the "baby health communication group".
As mentioned above, how to realize the coexistence of a chat conversation and video playing is a problem to be considered. In the related art, the coexistence of the chat conversation window and the video playing window can be achieved in a split-screen mode, that is, the chat conversation window and the video playing window are displayed simultaneously on one page. In this split-screen scheme, however, the two applications are independent of and isolated from each other: the user has to operate the two applications separately and then perform an additional split-screen operation before the two windows coexist on the same display interface, and must switch back and forth between the two applications, which makes the operation troublesome.
In view of the above, the inventor of the present application considered how to reduce user operations while still achieving same-screen coexistence of the chat conversation window and the video playing window, and to this end considered establishing an association between the chat conversation window and the video playing window so that the two functions of online chatting and video playing are linked through this association. On this basis, the inventor proposes a scheme for controlling video playing: a shortcut entry for playing videos is embedded in the chat conversation interface and indicated by video entry indication information. The shortcut entry serves as a centralized push entry for videos and is associated with videos that can be played. After the user clicks the shortcut entry, the system automatically determines the video that currently needs to be played, or determines it according to a selection operation of the user, plays the video, and displays the current chat conversation interface and the video playing window on the same screen, thereby realizing same-screen display of the chat conversation window and the video playing window and providing a product interaction scheme in which the two coexist on the same screen.
With the technical scheme provided by the embodiments of the present application, a user can play videos directly through the video playing shortcut entry fixed in the chat conversation interface. Same-screen coexistence of the video and the chat conversation window can be achieved with a single operation in the chat conversation interface, without the back-and-forth operation and switching between two independent applications required in the related art. This improves the efficiency of controlling video playing while chatting with other users, is convenient for the user, and further improves the user experience. Such a quick and friendly human-computer interaction mode realizes same-screen coexistence of the user's instant messaging function and video playing function and helps improve user stickiness.
In order to better understand the technical solution provided by the embodiments of the present application, some brief descriptions of applicable application scenarios are given below. It should be noted that the application scenarios described below are only used to illustrate the embodiments of the present application and are not limiting. In specific implementation, the technical scheme provided by the embodiments of the present application can be applied flexibly according to actual needs.
Referring to fig. 3, fig. 3 shows an application scenario applicable to the embodiments of the present application. The application scenario includes a terminal device 301 and an integrated server 302. A client corresponding to an integrated application is installed in the terminal device 301. The integrated application in the embodiments of the present application refers to an application program that can provide an instant messaging service and a video playing service at the same time; that is, the embodiments improve at the application (APP) level and provide an application integrating the instant messaging function and the video playing function, where the video playing service may be an online video service or a local video service. In other words, the integrated application integrates the functions of a conventional instant messaging application and a video application in the related art, so as to provide the functions of both at the same time. The integrated server 302 is a background server providing service support for the integrated application, and the terminal device 301 and the integrated server 302 can communicate with each other. A user may log in to the client of the integrated application installed on the terminal device 301, open a chat window, operate the video playing shortcut entry displayed in the chat window, and request the integrated server 302 to provide a video playing service, after which the video playing window can be displayed simultaneously in the display interface of the chat window, realizing same-screen coexistence of the video playing window and the chat window.
It should be noted that in the embodiments of the present application the term "integrated application" merely denotes an application that integrates both the instant messaging function and the video function; in practice, the "integrated application" may have other application names. For example, the "MQ" shown in fig. 2 may be understood as the application name of a specific "integrated application". The embodiments of the present application are not limited in this respect.
The terminal device 301 in fig. 3 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a notebook computer, an intelligent wearable device (e.g., an intelligent watch and an intelligent helmet), a Personal computer, an intelligent television, an intelligent sound box, an in-vehicle intelligent device, or other devices capable of supporting both an instant messaging function and a video playing function. And, the integrated server 302 in FIG. 3 may be a personal computer, a midrange computer, a cluster of computers, and so forth.
That is to say, the user can use the terminal device 301 as shown in fig. 3 to implement co-screen coexistence of the chat conversation window and the video playing window by using the method for controlling video playing in the embodiment of the present application, and a specific method for controlling video playing will be described in detail later.
To further illustrate the technical solutions provided by the embodiments of the present application, a detailed description is given below with reference to the accompanying drawings and specific embodiments. Although the embodiments of the present application provide the method operation steps shown in the following embodiments or figures, the method may include more or fewer operation steps based on conventional or non-inventive labor. For steps with no necessary causal relationship in logic, the execution order is not limited to that provided by the embodiments of the present application. When the method is executed in an actual processing procedure or by an apparatus, it may be executed sequentially or in parallel according to the method shown in the embodiments or figures.
The embodiment of the present application provides a method for controlling video playing, which may be executed by a device having both an instant messaging function and a video playing function, for example, the method may be executed by the terminal device 301 in fig. 3. The method for controlling video playing provided by the embodiment of the application is shown in fig. 4, and the flowchart shown in fig. 4 is described as follows.
Step 401: a first trigger operation for video entry indication information in a chat conversation interface is received.
The video entry indication information in the embodiments of the present application may be understood as the aforementioned shortcut entry for playing videos; it is an indication identifier used to indicate video playing. The video entry indication information is displayed in the chat conversation interface, i.e., a centralized push entry for playing videos is embedded in the chat conversation interface, linking a quick video playing function into the chat conversation interface so that the user can play videos quickly and directly from the chat conversation interface, which facilitates the user's operation.
Referring to fig. 5, fig. 5 is a schematic diagram showing multiple chat conversation interface identifiers displayed in a chat list interface. Taking the chat conversation interface identifier corresponding to "leaf guess" outlined by the dashed box in fig. 5 as an example, clicking this identifier opens the chat conversation interface corresponding to "leaf guess"; that is, the chat conversation interface identifier links to and indicates a chat conversation window with a chat object.
Referring to fig. 6, the chat conversation interface corresponding to "leaf guess" can be entered from the chat list interface. Specifically, the user may perform, for example, a click operation or another specified operation on the chat conversation interface identifier (for example, the characters "leaf guess" or the avatar corresponding to "leaf guess"). The terminal device monitors this operation and requests, from the background server, the interface display resource corresponding to the chat conversation interface of "leaf guess". The interface display resource includes an instant communication content resource and an interface resource of the video entry indication information, and the interface resource includes the specific display content and display position of the video entry indication information. After obtaining the interface display resource, the terminal device displays the corresponding instant messaging chat conversation interface using the instant communication content resource included in the interface display resource, and displays the video entry indication information at the specified position (corresponding to the aforementioned display position) of the chat conversation interface.
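As a rough illustration of the interface display resource described above, the following Kotlin sketch assumes a simple structure containing the chat content and the display content and position of the video entry indication; the field and type names are assumptions, not part of the patent.

```kotlin
// Hypothetical model of the "interface display resource"; field names are assumptions.
data class EntryIndicationResource(
    val displayContent: String,  // e.g. a "player" icon identifier or label text
    val x: Int,                  // specified display position inside the chat interface
    val y: Int
)

data class InterfaceDisplayResource(
    val chatMessages: List<String>,                    // instant communication content resource
    val videoEntryIndication: EntryIndicationResource  // interface resource of the entry indication
)

fun renderChatSession(resource: InterfaceDisplayResource) {
    resource.chatMessages.forEach { println("message: $it") }   // draw the chat content
    with(resource.videoEntryIndication) {
        println("draw entry '$displayContent' at ($x, $y)")     // draw the video entry indication
    }
}
```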
In a specific implementation process, the video entry indication information may be presented, for example, as an icon as shown in fig. 6, such as an icon in a "player" style, or as text, for example the words "video shortcut entry" or the English word "video", and so on.
The display content of the chat conversation interface can be divided into two parts: the interface description content of the chat conversation interface, and the chat content, which changes dynamically. The interface description content can be understood as display content that is fixed in the chat conversation interface. For example, as shown in the right diagram of fig. 6, in the current chat conversation interface with "leaf guess", the user identification "leaf guess", the input box at the bottom of the chat conversation interface, and the emoticon and "+" icons to the right of the input box do not change; these contents are fixed in the chat conversation interface, and the interface description content is presented whenever the chat conversation interface is entered. The chat content between friends, by contrast, changes dynamically as the chat proceeds: the user may send messages to the chat object, receive messages sent by the chat object, or delete chat messages in the chat conversation interface, and as chat messages accumulate, earlier chat content displayed in the chat conversation interface may be replaced by newly received messages, and so on.
In the embodiments of the present application, in one possible implementation, the video entry indication information may belong to the aforementioned interface description content, i.e., an interface resource fixed in the chat conversation interface rather than chat content, and may be presented, for example, in the form of a floating pendant, an icon, or text. In another possible implementation, the video entry indication information may also be chat content, such as a video sharing link shared by a chat object in the chat content, a specified keyword in the chat content, or even certain specific chat emoticons, and so on.
The initial position at which the video entry indication information is displayed in the chat conversation interface may be, for example, an area of the chat conversation window that obscures the chat content as little as possible, such as the right side of the user identification (i.e., to the right of "leaf guess") or the left side at the top of the chat conversation window as shown in fig. 6, or it may be located elsewhere. After the video entry indication information is initially displayed, the user can readjust its display position according to personal habits or viewing requirements. In other words, the video entry indication information displayed in the chat conversation window has a movable attribute, and the user can move and adjust its display position by tapping and dragging on the screen.
In the embodiments of the present application, the video entry indication information displayed in the chat conversation interface has, in addition to the movable attribute described above, an attribute associated with video playing, so that when the user performs a specific operation on the video entry indication information, the associated video playing can be triggered. This specific operation is referred to as the first trigger operation: when the terminal device monitors the first trigger operation performed by the user on the video entry indication information, it confirms that the video playing associated with the video entry indication information needs to be triggered. The first trigger operation may act directly on the video entry indication information itself, for example directly clicking the "player" icon as in fig. 6, or, when voice control is used, the user directly speaking the phrase "video shortcut entry", and so on. In another embodiment, the first trigger operation may also be a special operation performed in the chat conversation interface (in which case the special operation may or may not act directly on the video entry indication information), such as tapping the screen twice in succession in a specified area, a gesture operation drawing a special figure (e.g., an L-shaped line or a circle) on the screen, or an operation on a physical key, and so on. That is, the operation type and operation mode of the first trigger operation in the embodiments of the present application can be varied; in actual use, the user may set them, or they may be preset by the system.
Step 402: display the chat conversation interface and the video playing interface corresponding to the target video in response to the first trigger operation.
Step 403: display a video selection interface in response to the first trigger operation, and display the chat conversation interface and the video playing interface corresponding to the target video in response to a selection operation on the target video in the video selection interface.
The first triggering operation in the embodiment of the present application is an operation for triggering video playing through the video entry indication information, so that after monitoring the first triggering operation, the terminal device may determine a video that needs to be played, for example, the video that needs to be played is referred to as a target video.
In the embodiments of the present application, there are broadly two ways to determine the target video according to the first trigger operation. In the first way, the user manually selects the target video to be played in response to the first trigger operation; this is referred to as the user-defined selection mode. In the second way, the system directly recommends a video by default according to a preset recommendation policy; this is referred to as the system automatic recommendation mode. For ease of understanding, these two modes are described separately below.
First, the user-defined selection mode.
The user-defined selection mode may include the following two modes.
Mode 1: determine a video selection interface according to the operation type of the first trigger operation, display the determined video selection interface, and determine the target video according to a selection operation on the video selection interface, where the video selection interface may include an online video selection interface or a local video selection interface.
For example, when the first trigger operation is a single click on the video entry indication information, the determined video selection interface is the online video selection interface, and when the first trigger operation is two consecutive clicks on the video entry indication information, the determined video selection interface is the local video selection interface. For another example, when the first trigger operation is a click on the video entry indication information as shown in diagram a of fig. 7, the determined video selection interface is the online video selection interface, and when the first trigger operation is a circular touch gesture surrounding the video entry indication information as shown in the left diagram of fig. 8, the determined video selection interface is the local video selection interface. That is, in the embodiments of the present application, different video selection interfaces can be triggered directly according to the operation type of the first trigger operation, which makes it convenient for the user to select, from the triggered video selection interface, the target video that he or she actually wants to play.
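For illustration, a minimal Kotlin sketch of such an operation-type-to-interface mapping might look as follows; the enumerated operation types and all names are assumptions based on the examples above, not an actual implementation.

```kotlin
// Hypothetical mapping from the operation type of the first trigger operation to a selection interface.
enum class FirstTriggerType { SINGLE_TAP, DOUBLE_TAP, ENCIRCLING_GESTURE, LONG_PRESS }

enum class SelectionInterface { ONLINE_VIDEO, LOCAL_VIDEO, NONE_AUTO_RECOMMEND }

fun selectionInterfaceFor(type: FirstTriggerType): SelectionInterface = when (type) {
    FirstTriggerType.SINGLE_TAP -> SelectionInterface.ONLINE_VIDEO         // e.g. fig. 7, diagram a
    FirstTriggerType.DOUBLE_TAP,
    FirstTriggerType.ENCIRCLING_GESTURE -> SelectionInterface.LOCAL_VIDEO  // e.g. fig. 8, left diagram
    FirstTriggerType.LONG_PRESS -> SelectionInterface.NONE_AUTO_RECOMMEND  // system recommendation mode
}
```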
As can be understood in conjunction with fig. 7, after the online video selection interface is triggered according to the first trigger operation, it may be displayed full screen (as shown in diagram b of fig. 7) or displayed on the same screen as the chat conversation window in a certain ratio. The online video selection interface shown in diagram b of fig. 7 may support the user in searching for videos, may make personalized recommendations of content the user may be interested in based on the user's past viewing records, or may show popular content recommended to the user by the online video platform, and so on. The user can select the target video according to actual viewing requirements; for example, as shown in diagram c of fig. 7, the user clicks the video "tianhaakio-seamless" in the online video selection interface, taking that video as the target video, and after the user's selection of "tianhaakio-seamless" is monitored, the target video can be played.
As can be understood in conjunction with fig. 8, after the local video selection interface is triggered according to the first trigger operation, it may be displayed full screen (as shown in the right diagram of fig. 8) or displayed on the same screen as the chat conversation window in a certain ratio. The local video selection interface shown in the right diagram of fig. 8 allows the user to select a video stored locally on the terminal device as the target video. Locally stored videos are organized in multiple folders, for example "camera shooting", "art beautifying application", "MQ download", "video application", and so on. "Camera shooting" contains original videos shot with the camera of the terminal device; "art beautifying application" contains videos captured or processed by image or video beautifying applications; "MQ download" contains videos downloaded through the MQ application described above, such as videos sent by an MQ chat object or published in the friend circle; and "video application" contains videos downloaded by video applications. The user can then select the target video to be played from one of these folders.
In mode 1, the corresponding video selection interface is triggered directly by the operation type of the first trigger operation, so that the user can reach the desired video selection interface through a single specific operation, which improves the efficiency of selecting the target video.
Mode 2: in response to the first trigger operation, a first-level selection interface including an online video selection option and a local video selection option is displayed in association. The user selects "online video selection" or "local video selection" according to personal viewing requirements, and after one of the options is selected, enters a second-level selection interface. For example, clicking the "online video selection" option triggers a page jump to the second-level selection interface shown in diagram b of fig. 7, and clicking the "local video selection" option triggers a page jump to the second-level selection interface shown in the right diagram of fig. 8.
In the user-defined selection mode, therefore, the user manually selects the target video after the first trigger operation, so the selected target video can meet the user's actual viewing requirements as closely as possible and the selection is highly effective. At the same time, various operation modes can be provided for the user, which adds to the enjoyment of selecting videos and further enhances the user experience.
Second, the system automatic recommendation mode.
A video selection mode is determined according to the operation type of the first trigger operation, and when the determined mode is the system automatic recommendation mode, the target video is determined automatically according to a preset recommendation policy. The video selection modes include the system automatic recommendation mode and the user-defined selection mode described above.
The operation types of the first trigger operation have been described above and are not repeated here. For example, when the user performs a long-press operation on the video entry indication information, it indicates that the system needs to recommend a target video to the user automatically. The recommended target video may be, for example, the video (local or online) whose viewing the user last interrupted, a local video the user has just downloaded, a video the user has just published, the video that was playing before the current chat conversation window was entered, or the video currently being watched by a chat object of the chat conversation window, and so on. Through this automatic default recommendation by the system, the target video the user wants to watch can be played quickly according to the preset recommendation policy, which enhances the intelligence of the system and can also considerably improve the playing efficiency of the target video, thereby improving the user's viewing experience.
In the system automatic recommendation mode, after receiving the user's first trigger operation, the terminal device directly determines the target video from the background, and the video playing interface and the chat conversation interface can then be displayed directly in response to the first trigger operation.
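A minimal Kotlin sketch of one possible preset recommendation policy follows, trying the candidate sources mentioned above in an assumed priority order; the order, types, and names are illustrative assumptions rather than the patent's actual policy.

```kotlin
// Hypothetical preset recommendation policy: try candidate sources in priority order.
data class ViewingContext(
    val lastInterruptedVideo: String?,    // video the user stopped watching last time
    val videoPlayingBeforeChat: String?,  // video playing before the chat window was opened
    val chatObjectCurrentVideo: String?,  // video currently watched by a chat object
    val justDownloadedVideo: String?      // most recently downloaded local video
)

fun recommendTargetVideo(ctx: ViewingContext): String? =
    ctx.lastInterruptedVideo
        ?: ctx.videoPlayingBeforeChat
        ?: ctx.chatObjectCurrentVideo
        ?: ctx.justDownloadedVideo
```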
After the target video is determined, it can be played, for example by invoking a playing component of the system. Since a video is generally presented through a video playing interface when played, the video playing interface also corresponds to the target video.
As shown in fig. 7, after the user selects the target video "seamless" to be played in the online video selection interface shown in diagram c of fig. 7, the interface may automatically jump from diagram c of fig. 7 to diagram d of fig. 7, that is, return directly from the online video selection interface to the chat conversation interface and pop up the video playing interface corresponding to the playing component (e.g., the player) to play the video directly. After the target video starts playing, the video entry indication information originally displayed is hidden. In a specific implementation, the video playing interface and the chat conversation interface may be displayed on the same screen in a certain ratio: the chat conversation interface, originally displayed full screen, is compressed, and the video playing interface is displayed in the area outside the chat conversation interface. The chat conversation interface and the video playing interface are thus two independent display interfaces whose display does not affect each other, shown side by side on the same screen. For example, as shown in diagram d of fig. 7, in the portrait orientation of the terminal device, in order not to affect the user's instant messaging chat, the video playing interface may be placed in the area above the chat conversation interface, with a display area smaller than that of the chat conversation interface. Of course, other display ratios may be used for the two interfaces, or the relative display positions of the two interfaces may be adjusted, for example readjusting the video playing interface from the area above the chat conversation interface shown in diagram d of fig. 7 to the area below it.
In another embodiment, for example, the video playing interface may be directly displayed in the chat conversation interface, that is, the video playing interface may be displayed as a part of the content of the chat conversation interface itself, and in this case, the video playing interface may be considered to be embedded in the chat conversation interface for displaying.
In another embodiment, the video playing interface may be overlaid directly on the chat conversation interface in a layered display. For example, the video playing interface is adjusted to 1/3 the size of the chat conversation interface and then displayed as a layer over the top area of the chat conversation interface, so that only some of the interface description content displayed at the top of the chat conversation interface is occluded, for example only the user nickname displayed in the top area, while occlusion of the chat content displayed below is reduced as much as possible.
In addition, in the process of watching, the user can also dynamically adjust the display proportion of the chat conversation interface and the video playing interface, or can also dynamically adjust the relative display positions of the chat conversation interface and the video playing interface so as to meet the differentiated watching requirements of the user.
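For illustration, the following Kotlin sketch computes one possible portrait-mode same-screen layout in which the player occupies a smaller, adjustable share of the screen above (or below) the chat interface; the 0.3 ratio and all names are assumptions, not values from the patent.

```kotlin
// Hypothetical same-screen layout: in portrait mode the player takes a smaller, adjustable share
// of the screen, placed above or below the chat conversation interface.
data class Rect(val left: Int, val top: Int, val width: Int, val height: Int)

fun portraitLayout(
    screenW: Int,
    screenH: Int,
    playerRatio: Double = 0.3,   // player display area smaller than the chat area
    playerOnTop: Boolean = true  // relative position is adjustable
): Pair<Rect, Rect> {
    val playerH = (screenH * playerRatio).toInt()
    val chatH = screenH - playerH
    val player = Rect(0, if (playerOnTop) 0 else chatH, screenW, playerH)
    val chat = Rect(0, if (playerOnTop) playerH else 0, screenW, chatH)
    return player to chat        // (video playing interface, chat conversation interface)
}
```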
After the chat conversation interface and the video playing interface are displayed on the same screen, the user may perform a second trigger operation on the video playing interface, for example a tap on the video playing interface, or a left-to-right slide gesture in the video playing interface as shown in the left diagram of fig. 9, and so on. After monitoring the second trigger operation on the video playing interface, the terminal device may request a control display resource from the background server in response to the second trigger operation and, after obtaining it, display the video playing operation controls corresponding to the control display resource. The control display resource includes the control icons of one or more video playing operation controls to be displayed and the display position corresponding to each control icon. For example, as shown in the right diagram of fig. 9, a total of 6 video playing operation controls are displayed at different positions, of which 4 are located inside the video playing window and 2 are located outside the video playing window.
That is to say, after the target video is played through the video playing interface, a plurality of operation entries of the player can be quickly called through shortcut operation, and then the video being played can be correspondingly controlled or some additional operations can be performed through the operation entries. The video playing operation control in the embodiment of the present application may include, for example, a minimization control, a full screen control, a close control, a change video control, a video screenshot control, a capture small video control, a pause/play control, a share control, and the like.
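A minimal Kotlin sketch of the second-trigger handling described above follows, assuming a simple control display resource that lists control icons and their positions; the names and fields are hypothetical.

```kotlin
// Hypothetical handling of the second trigger operation: fetch the control display resource
// and draw each video playing operation control at its specified position.
data class ControlIcon(val name: String, val x: Int, val y: Int)  // e.g. "minimize", "full_screen"

data class ControlDisplayResource(val icons: List<ControlIcon>)

fun onSecondTriggerOperation(fetchControls: () -> ControlDisplayResource) {
    val resource = fetchControls()            // request from the background server
    resource.icons.forEach { icon ->
        println("show control '${icon.name}' at (${icon.x}, ${icon.y})")
    }
}
```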
Taking the minimize control as an example, please refer to fig. 10. After the user performs a third trigger operation on the called-out minimize control, for example, clicks the minimize control, the system may monitor the third trigger operation, hide the video playing window according to the third trigger operation, and simultaneously display minimized playing indication information, for example, at the position where the video entry indication information was originally displayed. That is, the video playing interface is minimized while the video continues to play synchronously, and the chat conversation interface is restored to full-screen display. The minimized playing indication information is indication information indicating that the target video is being played in a minimized window; it may take the form of a floating pendant, as shown in the right diagram in fig. 10, or other forms. The minimized playing indication information may include the video identifier of the target video currently being played, or may include both the video identifier and the playing progress of the target video currently being played, where the video identifier is, for example, the video name. The minimized playing indication information may further include a control for controlling the minimized playing indication information, for example, the "x"-shaped closing indication information shown at the upper right of the minimized playing indication information in the right diagram of fig. 10; after a specified operation is performed on this closing indication information, the display of the minimized playing indication information may be closed and the playing of the target video is exited at the same time, that is, the target video is closed.
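The minimize flow (together with the restore and close handling of the minimized playing indication information described further below) can be summarized by the following hypothetical sketch; the callback names are assumptions, and the actual behaviour of the embodiment is only what the surrounding text describes:

// Hypothetical controller for the minimize / restore / close transitions.
data class MinimizedPendant(
    val videoId: String,       // video identifier, e.g. the video name
    val progressSeconds: Int?  // optional playing progress to show on the pendant
)

class VideoWindowController(
    private val hideWindow: () -> Unit,
    private val showWindow: () -> Unit,
    private val showPendant: (MinimizedPendant) -> Unit,
    private val hidePendant: () -> Unit,
    private val stopPlayback: () -> Unit
) {
    // Third trigger operation: hide the video playing window, keep playing,
    // and show the floating pendant where the video entry indication was.
    fun onMinimize(videoId: String, progressSeconds: Int?) {
        hideWindow()
        showPendant(MinimizedPendant(videoId, progressSeconds))
    }

    // Fourth trigger operation, variant 1: close the pendant and exit playback.
    fun onClosePendant() {
        hidePendant()
        stopPlayback()
    }

    // Fourth trigger operation, variant 2: hide the pendant and restore the window.
    fun onRestoreFromPendant() {
        hidePendant()
        showWindow()
    }
}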
Taking the full-screen control as an example, please refer to the left diagram in fig. 11. After the user performs a specified operation on the called-out full-screen control, for example, clicks the full-screen control, the system may monitor the specified operation and then display the video playing window in full screen according to the specified operation, as shown in the right diagram in fig. 11, thereby implementing full-screen playing of the target video. In this case the video playing window is maximized and completely covers the chat conversation window, and the video can be played in a horizontal screen (landscape) manner to facilitate full-screen viewing by the user.
Taking the video replacing control as an example, for example the video replacing control on the right side of the rectangular dashed frame in the right diagram of fig. 9, after the user performs a ninth triggering operation on the video replacing control, the terminal device may determine the video to be switched to according to the ninth triggering operation, for example, the system automatically determines the video to be switched to, or the user manually selects it. The terminal device then switches the video played in the video playing window from the previously played target video to the newly determined video, so that real-time switching of the video can be realized while the video playing interface and the chat conversation interface coexist on the same screen. The video switching indication information may be used to explicitly indicate to the user the operation entry for video switching.
It should be noted that, in addition to being called out through the second trigger operation introduced in the embodiment of the present application, the video playing operation controls may also be displayed at the same time as the video playing interface is displayed. The call-out manner allows the controls to be displayed only when the user needs them; because the displayed video playing operation controls occlude the played video content to a certain extent, not displaying them when they are not needed avoids interference with the viewing of the target video as much as possible, thereby improving the viewing experience of the user.
For ease of description, the video playing interface shown in the left diagram of fig. 10 is referred to as the small screen playing mode, the video playing interface shown in the right diagram of fig. 10 as the mini-pendant mode, and the video playing interface shown in the right diagram of fig. 11 as the full screen playing mode. Different playing modes can meet different watching requirements of users, whether they are focused on watching videos, focused on online chatting, or need to take both into account at the same time.
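For illustration only, the three presentation modes named above could be modelled as a simple enumeration (the names are hypothetical), which keeps mode-dependent decisions, such as whether the chat conversation interface remains visible, explicit in one place:

enum class PlayMode { SMALL_SCREEN, MINI_PENDANT, FULL_SCREEN }

// In small-screen and mini-pendant modes the user can keep chatting; in
// full-screen mode the video playing window covers the chat conversation window.
fun chatVisibleIn(mode: PlayMode): Boolean = when (mode) {
    PlayMode.SMALL_SCREEN, PlayMode.MINI_PENDANT -> true
    PlayMode.FULL_SCREEN -> false
}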
In a possible implementation manner, when the user performs a fourth triggering operation on the called-out closing control for the video playing window described above (as shown in the upper left diagram in fig. 12), for example, a touch operation clicking the closing control, the playing of the target video may be terminated, i.e., the target video is closed, and the display returns to the chat conversation interface that originally contained the video entry indication information (as shown in the right diagram in fig. 12), i.e., the display of the video entry indication information is resumed.
In another possible implementation, when the user performs the fourth triggering operation on the aforementioned minimized playing indication information (as shown in the lower left diagram of fig. 12), for example, on the "x"-shaped closing indication information at the upper right of the minimized playing indication information, the playing of the target video may also be terminated, i.e., the target video is closed, and the display returns to the chat conversation window that originally contained the video entry indication information (as shown in the right diagram of fig. 12), i.e., the display of the video entry indication information is resumed.
Fig. 12 thus illustrates two ways of closing the video playing, so that the user can select the corresponding way to quit video playing while chatting and video watching coexist, which is convenient to operate.
In the embodiment of the present application, please refer to fig. 13. When the chat conversation interface has just been opened or the target video has been closed, as shown in the upper left diagram in fig. 13, when the chat conversation interface and the video playing interface coexist on the same screen with the video playing operation controls not called out, as shown in the upper right diagram in fig. 13, when they coexist on the same screen with the video playing operation controls called out, as shown in the lower left diagram in fig. 13, or when the video playing interface is in the mini-pendant mode, as shown in the lower right diagram in fig. 13, at least one piece of video playing indication information corresponding to the video currently being watched by at least one chat object of the chat conversation interface may be obtained, and the at least one piece of video playing indication information is then displayed in the chat conversation interface. The video playing indication information is used to indicate the video being watched by the corresponding chat object; as shown in the upper right diagram and the lower left diagram in fig. 13, the video playing indication information may include only the video identifier (i.e., the video name), or, as shown in the upper left diagram and the lower right diagram in fig. 13, it may include both the video identifier and the playing progress information of the video. Through the video playing indication information, the user can clearly know what video the chat object is currently watching, and thus understand the chat object better and communicate more smoothly. In the specific implementation process, the user can customize the visibility of the video playing indication information to friends, for example, set the indication that the user is watching a video to be presented only to some friends (such as close friends or relatives) and not to unfamiliar friends or strangers, which facilitates privacy protection to a certain extent.
In the embodiment of the present application, if the chat conversation interface is a private chat conversation interface, one piece of video playing indication information corresponding to the single chat object may be displayed in a first predetermined area in the private chat conversation interface, for example, below the personal information (i.e., the user nickname) of the chat object at the top, as shown in the upper left diagram in fig. 13, or in another designated area. If the chat conversation interface is a group chat conversation interface, at least one piece of video playing indication information may be dynamically displayed in a second predetermined area in the group chat conversation interface, for example, a plurality of pieces of video playing indication information corresponding to a plurality of chat objects may be scroll-displayed under the group chat name at the top; alternatively, the associated video playing indication information may be displayed in an area associated with the user identifier of each chat object, for example, the video playing indication information of each group chat object may be displayed in an empty area near that group chat object's avatar, so that the user can clearly know which video each group chat object is currently watching, and the association between object and indication is intuitive.
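The video playing indication information and the user-defined visibility setting mentioned above admit a very small illustrative data model; the field names below are assumptions and are not part of the embodiment:

// Hypothetical model of one piece of video playing indication information.
data class VideoPlayingIndication(
    val watcherId: String,            // the chat object who is watching
    val videoId: String,              // video identifier, e.g. the video name
    val progressSeconds: Int? = null  // playing progress, if included
)

// Hypothetical per-user visibility setting: the indication is delivered only to
// friends in the user-defined visibility list (e.g. close friends or relatives).
class IndicationVisibility(private val visibleTo: Set<String>) {
    fun filterRecipients(candidates: List<String>): List<String> =
        candidates.filter { it in visibleTo }
}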
When the at least one piece of video playing indication information is displayed in the chat conversation interface, the user can use the video playing indication information to quickly switch the video that the user is watching. For example, the user may perform a fifth trigger operation, such as a long press, on target video playing indication information among the at least one piece of video playing indication information; the terminal device may then monitor the fifth trigger operation and switch the video played in the video playing window from the currently played target video to the video indicated by the target video playing indication information, thereby realizing one-key synchronous playing of the video being watched by the friend.
For example, as shown in fig. 14, when the user clicks the target video playing indication information "viewing ABAB set 3", the video played in the video playing window is switched from the "seamless" video being played to "ABAB set 3" that the chat object "leaf guess" is viewing, so as to realize fast and synchronous video viewing between the two users. When ABAB set 3 is played, it can be played from the beginning of set 3, from the point where the user's last viewing of ABAB set 3 was interrupted, or from a predetermined playing progress, which is not limited in the embodiment of the present application. As shown in fig. 15, if the target video playing indication information includes not only the video name of ABAB set 3 but also the playing progress information "the 8th minute", then after the user clicks the target video playing indication information, the local video playing window can be triggered to start playing from the 8th minute of ABAB set 3, so as to realize synchronization with the video playing of the opposite chat user, allowing the two chat users to discuss and communicate in time according to the same viewing progress.
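A minimal sketch of this one-key synchronized switch, under the assumption that the player exposes open and seek operations (the function names are hypothetical), would be:

// Hypothetical handler for the fifth trigger operation on a piece of target
// video playing indication information.
class SyncSwitchHandler(
    private val openVideo: (videoId: String) -> Unit,
    private val seekToSeconds: (Int) -> Unit
) {
    // videoId and progressSeconds are taken from the triggered indication.
    fun onIndicationTriggered(videoId: String, progressSeconds: Int?) {
        openVideo(videoId)
        // If the indication carries progress (e.g. the 8th minute = 480 s),
        // seek so that both chat users watch from the same position.
        progressSeconds?.let { seekToSeconds(it) }
    }
}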
During the playing of the target video in the embodiment of the application, the user can also send a video watching invitation to the chat object at the opposite end. For example, the user may perform a sixth triggering operation on the video playing interface or on the minimized playing indication information described above; video viewing invitation information containing the playing address of the video being played (for example, the target video) is then generated according to the sixth triggering operation and sent to all chat objects corresponding to the chat conversation interface. Through this quick sending of the viewing invitation, the user can invite the opposite-end chat object to watch, thereby realizing online sharing of the video. The video viewing invitation information may be presented in the form of "text + URL address", only a "URL address", a two-dimensional code, and the like.
For example, the user may perform a click operation on the sharing control in the video playing interface or on the minimized playing indication information as described above, so that the video viewing invitation information corresponding to the target video is directly entered into the input box of the current chat conversation window; the user may then click the "send" button to send the video viewing invitation information to the other party, or the system may send it to the other party directly.
For example, as shown in the left diagram in fig. 16, the user may perform a sliding operation that starts from the video playing interface and ends at the input box in the chat conversation interface, as if dragging the video playing interface directly into the input box; this sliding operation conveys the intent of inviting the opposite-end user, so the video watching invitation information may be generated and sent, for example, to the private chat object or to all the group chat objects, as shown in the right diagram in fig. 16, completing the sending of the video watching invitation information.
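The generation and sending of the video viewing invitation information can be sketched as follows; the invitation text and all names here are hypothetical and only illustrate the "text + URL address" form mentioned above:

data class ViewingInvitation(val text: String, val playUrl: String)

// Build one invitation from the playing address of the video being played and
// deliver it to every chat object of the current conversation (one object for a
// private chat, all members for a group chat).
fun sendViewingInvitation(
    videoName: String,
    playUrl: String,
    chatObjectIds: List<String>,
    send: (recipientId: String, invitation: ViewingInvitation) -> Unit
) {
    val invitation = ViewingInvitation(
        text = "Come and watch \"$videoName\" with me!",  // illustrative wording
        playUrl = playUrl
    )
    chatObjectIds.forEach { send(it, invitation) }
}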
In a possible implementation manner, after the video playing interface corresponding to the target video is displayed together with the chat conversation interface, the user may close the chat conversation interface through an eighth trigger operation acting on the video playing interface. After receiving the eighth trigger operation, the terminal device may, according to the trigger of the eighth trigger operation, quit the current chat conversation interface, that is, hide the chat conversation interface and switch the interface display. Because the chat conversation interface is quitted, the system may pause playing the target video so as not to break the user's continuous viewing of the target video. After the chat conversation interface is closed, if a predetermined application interface is displayed after the switch, the target video may continue playing in the predetermined application interface from the playing progress at which it was paused; if the interface switched to is not the predetermined application interface, the playing of the target video may be exited directly.
Here, as shown in fig. 17, the predetermined application interface is, for example, the chat list interface, so that after returning from the private chat window with the chat object "leaf guess" to the chat list interface, the target video can continue to be played on the same screen as the chat list from the playing progress at the moment when the private chat window was closed. As shown in fig. 18, the predetermined application interface may also be another chat conversation window (including, of course, a chat conversation window returned to previously); in fig. 18, the private chat window with "leaf guess" is switched to the group chat window "flower world", and the playing progress at which playing starts in the group chat window is the same as the playing progress in the exited private chat window. In this way the playing of the target video is only paused during the switching of the two chat conversation windows, ensuring that the user watches the video normally and continuously, improving the continuity of video viewing and enhancing the viewing experience of the user.
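The continuity behaviour described in the last two paragraphs can be illustrated by the following hypothetical sketch: the progress is recorded and playback paused when the chat conversation interface is closed, and playback either resumes at that progress or exits depending on whether the interface switched to is a predetermined application interface:

class ContinuityKeeper(
    private val pause: () -> Unit,
    private val resumeAt: (seconds: Int) -> Unit,
    private val exitPlayback: () -> Unit
) {
    private var savedProgressSeconds: Int = 0

    // Eighth trigger operation: the chat conversation interface is closed.
    fun onChatInterfaceClosed(currentProgressSeconds: Int) {
        savedProgressSeconds = currentProgressSeconds
        pause()
    }

    // Called once the interface switch has completed.
    fun onInterfaceSwitched(isPredeterminedInterface: Boolean) {
        if (isPredeterminedInterface) resumeAt(savedProgressSeconds) else exitPlayback()
    }
}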
In any of the above processes of playing a video, for example while the target video or a switched video is playing, the user may play voice chat information in the chat conversation interface, for example by triggering a piece of voice chat information to play through a seventh trigger operation (such as clicking that piece of voice chat information). When the device detects the seventh trigger operation, the playing volume of the video may be reduced so that the user can hear the voice chat information more clearly, and after the voice chat information has finished playing, the playing volume of the video may be restored. That is, when the video and the voice chat information are played at the same time, the volume of the video playing can be weakened, so that interference with the user's reception of the voice chat information is reduced as much as possible.
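This volume "ducking" behaviour amounts to lowering the video volume while a voice chat message plays and restoring it afterwards; a minimal sketch, with the volume levels chosen purely for illustration, is:

// Hypothetical helper that weakens the video volume during voice chat playback.
class VideoVolumeDucker(
    private val setVideoVolume: (Float) -> Unit,  // expected range 0.0f .. 1.0f
    private val normalVolume: Float = 1.0f,
    private val duckedVolume: Float = 0.2f        // illustrative reduced level
) {
    fun onVoiceMessageStarted() = setVideoVolume(duckedVolume)   // seventh trigger detected
    fun onVoiceMessageFinished() = setVideoVolume(normalVolume)  // restore after playback
}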
Among the many application functions of a terminal, instant messaging is one of the most widely used, and video playing is one of the most entertaining; most users spend considerable time in instant messaging functions such as chat. The embodiment of the application therefore integrates the video entry indication information into the chat conversation interface, that is, an entry for quick video playing is embedded in the widely used chat scene, so that the user can quickly invoke the video playing function during a chat, and a more friendly and humanized man-machine interaction mode can be realized through the same-screen coexistence of the chat conversation interface and the video playing interface.
According to the technical scheme provided by the embodiment of the application, in order to realize the co-screen coexistence of the chat conversation and the video playing, a quick entry to a chat conversation interface, such as the chat conversation interface with a preset user or the chat window with the latest message, can also be embedded into the video playing interface, so that the user, while watching a video, can call up the chat conversation interface directly through a quick operation on the embedded entry, and the chat conversation interface and the video playing interface can then be displayed on the same screen, thereby using one screen for multiple purposes.
The embodiment of the application allows a user to chat online with others in real time while still having a good experience of playing and watching videos, that is, it provides a product interaction mode in which a single chat conversation window and video playing coexist, meeting the user's need to chat while watching.
In the embodiment of the application, the video entry indication information in the chat conversation window can be displayed in the form of a pendant, so that the user can conveniently select a desired video without affecting the chat with others. When selecting a video, for example selecting an online video, multi-dimensional selection such as searching, personalized recommendation and hot recommendation can be supported. After the user opens the video, the co-screen coexistence of the video playing window and the chat window makes it convenient to continue chatting with others while watching. In addition, when the user needs to focus on the chat content, the video window can be minimized very conveniently, i.e., the video playing window is presented in the pendant mode, so that the chat window has more space and the normal playing of the video is not affected. Moreover, when the video no longer needs to be watched, playing can be closed quickly so that the user can fully focus on the chat content, making the usage flexible.
Based on the same inventive concept, the embodiment of the present application provides an apparatus for controlling video playing, where the apparatus for controlling video playing may be a hardware structure, a software module, or a hardware structure plus a software module. The apparatus for controlling video playing may be, for example, the terminal device 301 in fig. 3, or a functional apparatus provided in the terminal device 301. Referring to fig. 19a, the apparatus for controlling video playback in the embodiment of the present application includes a receiving module 1901 and a display module 1902, where:
a receiving module 1901, configured to receive a first trigger operation for video entry indication information in a chat conversation interface;
the display module 1902 is configured to display, according to the trigger of the first trigger operation, the chat conversation interface and a video playing interface corresponding to the target video; or display a video selection interface according to the trigger of the first trigger operation, and display the chat conversation interface and the video playing interface corresponding to the target video according to the trigger of a selection operation for the target video in the video selection interface.
In one possible implementation, the display module 1902 is configured to:
according to the triggering of the first triggering operation, determining a target video according to a preset recommendation strategy, and displaying a chat conversation interface and a video playing interface corresponding to the target video; alternatively,
and determining a video selection interface according to the operation type of the first trigger operation, and displaying the video selection interface, wherein the video selection interface comprises a local video selection interface or an online video selection interface.
In a possible implementation manner, the receiving module 1901 is further configured to receive a second trigger operation for the video playing interface;
the display module 1902 is further configured to display a video playing operation control associated with the video playing interface according to the trigger of the second trigger operation.
In a possible implementation manner, the video playing operation control includes a minimization control, and the receiving module 1901 is further configured to receive a third triggering operation for the minimization control;
the display module 1902 is further configured to hide the video playing window according to a trigger of the third trigger operation, and display the minimized playing indication information, where the minimized playing indication information includes a video identifier of a target video currently being played.
In a possible implementation manner, the receiving module 1901 is further configured to receive a fourth trigger operation for the minimized playing indication information after the display module displays the minimized playing indication information;
the display module 1902, further configured to terminate playing the target video according to the trigger of the fourth trigger operation, and hide the display of the minimized playing indication information; or hiding the minimized playing indication information and resuming to display the video playing window according to the triggering of the fourth triggering operation.
In one possible implementation, the display module 1902 is further configured to:
obtaining at least one video playing indication information corresponding to a video currently watched by at least one chat object corresponding to the chat conversation interface; the video playing indication information comprises a video identifier, or the video playing indication information comprises the video identifier and playing progress information;
and displaying at least one video playing indication message in the chat conversation interface.
In one possible implementation, the display module 1902 is configured to:
if the chat conversation interface is a private chat conversation interface, displaying a video playing indication message in a first preset area in the chat conversation interface;
and if the chat conversation interface is a group chat conversation interface, dynamically displaying at least one piece of video playing indication information in a second preset area in the chat conversation interface, or displaying corresponding associated video playing indication information in an associated specified area of the user identifier of each chat object.
In a possible implementation manner, the receiving module 1901 is further configured to receive a fifth trigger operation for the target video playing indication information in the at least one video playing indication information;
the display module 1902 is further configured to display, according to the trigger of the fifth trigger operation, video content corresponding to the video indicated by the target video playing indication information through the video playing interface.
In a possible implementation manner, the receiving module 1901 is further configured to receive a sixth triggering operation for the video playing interface or the minimized playing indication information;
referring to fig. 19b, the apparatus for controlling video playing in the embodiment of the present application further includes a sending module 1903, configured to send, according to the trigger of the sixth trigger operation, the video viewing invitation information to the chat object corresponding to the chat conversation interface, where the video viewing invitation information is determined according to the playing address of the target video.
In a possible implementation manner, the receiving module 1901 is further configured to receive a seventh triggering operation for triggering the playing of the voice chat message in the chat conversation interface;
referring to fig. 19b, the apparatus for controlling video playback in the embodiment of the present application further includes a volume control module 1904, configured to reduce the playback volume of the target video when the receiving module receives the seventh trigger operation.
In a possible implementation manner, the receiving module 1901 is further configured to receive, after the display module displays the chat conversation interface and the video playing interface corresponding to the target video, an eighth trigger operation for the video playing interface;
the display module 1902 is further configured to hide the chat conversation interface and switch the interface display according to the trigger of the eighth trigger operation, and display the switched application interface and the video playing interface when the switched application interface is a predetermined application interface.
In a possible implementation manner, the receiving module 1901 is further configured to receive a ninth triggering operation for the video switching indication information;
the display module 1902 is further configured to switch, according to the trigger of the ninth trigger operation, the video played on the video playing interface from the target video to the video to be switched to and played.
For all relevant contents of the steps involved in the embodiments of the method for controlling video playing described above, reference may be made to the functional description of the corresponding functional module of the apparatus for controlling video playing in the embodiment of the present application, and details are not described herein again.
The division of the modules in the embodiments of the present application is schematic and is only a division by logical function; in actual implementation there may be other division manners. In addition, the functional modules in the embodiments of the present application may be integrated into one processor, may exist alone physically, or two or more modules may be integrated into one module. The integrated module can be realized in the form of hardware or in the form of a software functional module.
Based on the same inventive concept, the embodiment of the present application further provides a computing device, for example, the terminal device 301 in fig. 3 described above. Referring to fig. 20, the computing device in this embodiment of the application includes at least one processor 2001 and a memory 2002 connected to the at least one processor, and a specific connection medium between the processor 2001 and the memory 2002 is not limited in this embodiment of the application, for example, the processor 2001 and the memory 2002 may be connected by a bus, and the bus may be divided into an address bus, a data bus, a control bus, and the like.
In the embodiment of the present application, the memory 2002 stores instructions executable by the at least one processor 2001, and by executing the instructions stored in the memory 2002, the at least one processor 2001 may execute the steps included in the foregoing method for controlling video playing.
The Processor 2001 may be a general-purpose Processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component, which may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present Application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
The memory 2002, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The Memory may include at least one type of storage medium, and may include, for example, a flash Memory, a hard disk, a multimedia card, a card-type Memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read Only Memory (PROM), a Read Only Memory (ROM), a charged Erasable Programmable Read Only Memory (EEPROM), a magnetic Memory, a magnetic disk, an optical disk, and so on. The memory is any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to such. The memory 2002 in the embodiments of the present application may also be circuitry or any other device capable of performing a storage function for storing program instructions and/or data.
The processor 2001 is a control center of the computing device, and may connect various parts of the entire computing device by using various interfaces and lines, and perform various functions of the computing device and process data by operating or executing instructions stored in the memory 2002 and calling data stored in the memory 2002, thereby performing overall monitoring of the computing device. Optionally, the processor 2001 may include one or more processing units, and the processor 2001 may integrate an application processor and a modem processor, wherein the application processor mainly handles an operating system, a user interface, an application program, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor described above may not be integrated into the processor 2001. In some embodiments, the processor 2001 and the memory 2002 may be implemented on the same chip, or in some embodiments, they may be implemented separately on separate chips.
Further, the computing device in the embodiment of the present application may further include an input unit 2003, a display unit 2004, a radio frequency unit 2005, an audio circuit 2006, a speaker 2007, a microphone 2008, a Wireless Fidelity (WiFi) module 2009, a bluetooth module 2010, a power source 2011, an external interface 2012, a headphone jack 2013, and other components. Those skilled in the art will appreciate that FIG. 20 is merely exemplary of a computing device and is not intended to limit the computing device, which may include more or fewer components than those shown, or may combine certain components, or different components.
The input unit 2003 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the video playback apparatus. For example, the input unit 2003 may include a touch screen 2014 and other input devices 2015. The touch screen 2014 may collect touch operations of a user (such as operations performed by the user on or near the touch screen 2014 with any suitable object such as a finger, a joint or a stylus), that is, the touch screen 2014 may be used to detect touch pressure as well as touch input position and touch input area, and to drive the corresponding connection device according to a preset program. The touch screen 2014 may detect a touch operation performed on it by the user, convert the touch operation into a touch signal and send it to the processor 2001, or transmit the touch information of the touch operation to the processor 2001, and it may receive and execute commands sent by the processor 2001. The touch information may include at least one of pressure magnitude information and pressure duration information. The touch screen 2014 may provide an input interface and an output interface between the video playback device and the user. In addition, the touch screen 2014 may be implemented in various types, such as resistive, capacitive, infrared and surface acoustic wave types. The input unit 2003 may include other input devices 2015 in addition to the touch screen 2014. For example, the other input devices 2015 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 2004 may be used to display information input by or provided to the user and the various menus of the video playback device. Further, the touch screen 2014 may cover the display unit 2004; when the touch screen 2014 detects a touch operation on or near it, it transmits the pressure information of the touch operation to the processor 2001 for processing. In the embodiment of the present application, the touch screen 2014 and the display unit 2004 may be integrated into one component to implement the input, output and display functions of the video playing device. For convenience of description, the embodiment of the present application uses the touch screen 2014 to represent the functional set of the touch screen 2014 and the display unit 2004; in some embodiments, however, the touch screen 2014 and the display unit 2004 may also be two separate components.
When the display unit 2004 and the touch panel are superimposed on each other in the form of layers to form the touch screen 2014, the display unit 2004 may serve as an input device and an output device, and when serving as an output device, may be used to display images, for example, to enable playing of various videos. The Display unit 2004 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor Liquid Crystal Display (TFT-LCD), an Organic Light Emitting Diode (OLED) Display, an Active Matrix Organic Light Emitting Diode (AMOLED) Display, an In-Plane Switching (IPS) Display, a flexible Display, a 3D Display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and according to certain desired embodiments, the computing device may include two or more display units (or other display devices), e.g., the computing device may include an external display unit (not shown in fig. 20) and an internal display unit (not shown in fig. 20).
The rf unit 2005 may be used for transceiving information or receiving and transmitting signals during a call. Typically, the radio frequency circuitry includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the radio frequency unit 2005 can also communicate with a network device and other devices through wireless communication.
The audio circuit 2006, the speaker 2007 and the microphone 2008 can provide an audio interface between the user and the video playback device. The audio circuit 2006 may transmit the electrical signal converted from received audio data to the speaker 2007, which converts the electrical signal into a sound signal for output. On the other hand, the microphone 2008 converts a collected sound signal into an electrical signal, which is received by the audio circuit 2006 and converted into audio data; the audio data is then output to the processor 2001 for processing and sent to another electronic device through the radio frequency unit 2005, or output to the memory 2002 for further processing. The audio circuit may also include a headphone jack 2013 for providing a connection interface between the audio circuit and a headphone.
WiFi belongs to short-range wireless transmission technology; through the WiFi module 2009, the computing device can help the user send and receive e-mails, browse webpages, access streaming media and the like, providing wireless broadband Internet access for the user. Although fig. 20 shows the WiFi module 2009, it is understood that it is not an essential part of the computing device and may be omitted as needed within a scope that does not change the essence of the invention.
Bluetooth is a short-range wireless communication technology. By using Bluetooth technology, communication between mobile devices with video playing capability, such as palmtop computers, notebook computers and mobile phones, can be effectively simplified, and communication between these devices and the Internet can also be simplified; through the Bluetooth module 2010, data transmission between the computing device and the Internet becomes more rapid and efficient, broadening the road for wireless communication. Bluetooth technology is an open solution that enables wireless transmission of voice and data. Although fig. 20 shows the Bluetooth module 2010, it is understood that it is not an essential part of the computing device and may be omitted as needed within a scope that does not change the essence of the invention.
The computing device may also include a power source 2011 (such as a battery) for receiving external power or powering the various components within the computing device. Preferably, the power source 2011 may be logically connected to the processor 2001 through a power management system, so as to implement functions such as charging management, discharging management and power consumption management through the power management system.
The computing device may also include an external interface 2012, where the external interface 2012 may include a standard Micro USB interface or may include a multi-pin connector that may be used to connect the computing device to communicate with other devices or to connect a charger to charge the computing device.
Although not shown, the computing device in the embodiment of the present application may further include a camera, a flash, and other possible functional modules, which are not described herein again.
Based on the same inventive concept, the present application also provides a storage medium, which may be a computer-readable storage medium, and the storage medium stores computer instructions, which, when executed on a computer, cause the computer to perform the steps of the method for controlling video playing as described above.
Based on the same inventive concept, the embodiment of the present application further provides a chip system, where the chip system includes a processor and may further include a memory, and is used to implement the steps of the method for controlling video playing as described above. The chip system may be formed by a chip, and may also include a chip and other discrete devices.
In some possible implementations, various aspects of the method for controlling video playback provided by the embodiments of the present application can also be implemented in the form of a program product including program code for causing a computer to perform the steps in the method for controlling video playback according to various exemplary implementations of the present application described above when the program product runs on the computer.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (15)

1. A method of controlling video playback, the method comprising:
receiving a first trigger operation aiming at video entrance indication information in a chat conversation interface;
displaying the chat conversation interface and a video playing interface corresponding to the target video according to the triggering of the first triggering operation; alternatively,
and displaying a video selection interface according to the triggering of the first triggering operation, and displaying the chat conversation interface and a video playing interface corresponding to the target video according to the triggering of the selection operation of the target video aiming at the video selection interface.
2. The method of claim 1, wherein displaying the chat conversation interface and the video playing interface corresponding to the target video according to the trigger of the first trigger operation comprises:
determining the target video according to a preset recommendation strategy according to the triggering of the first triggering operation, and displaying the chat conversation interface and a video playing interface corresponding to the target video;
according to the triggering of the first triggering operation, displaying a video selection interface, comprising:
determining the video selection interface according to the operation type of the first trigger operation, and displaying the video selection interface, wherein the video selection interface comprises a local video selection interface or an online video selection interface.
3. The method of claim 1, wherein the method further comprises:
receiving a second trigger operation aiming at the video playing interface;
and displaying a video playing operation control associated with the video playing interface according to the triggering of the second triggering operation.
4. The method of claim 3, wherein the video playback operation control comprises a minimize control, the method further comprising:
receiving a third trigger operation for the minimized control;
and hiding the video playing window according to the triggering of the third triggering operation, and displaying minimized playing indication information, wherein the minimized playing indication information comprises a video identifier of the target video which is currently played.
5. The method of claim 4, wherein after displaying the minimized play indication information, the method further comprises:
receiving a fourth trigger operation aiming at the minimized playing indication information;
according to the triggering of the fourth triggering operation, stopping playing the target video and hiding the minimized playing indication information; alternatively,
and hiding the minimized playing indication information and restoring to display the video playing interface according to the triggering of the fourth triggering operation.
6. The method of claim 1, wherein the method further comprises:
obtaining at least one video playing indication information corresponding to a video currently watched by at least one chat object corresponding to the chat conversation interface; the video playing indication information comprises a video identifier, or the video playing indication information comprises the video identifier and playing progress information;
and displaying the at least one video playing indication information in the chat conversation interface.
7. The method of claim 6, wherein displaying the at least one video playback indicator in the chat conversation interface comprises:
if the chat conversation interface is a private chat conversation interface, displaying a video playing indication message in a first preset area in the chat conversation interface;
and if the chat conversation interface is a group chat conversation interface, dynamically displaying the at least one piece of video playing indication information in a second preset area in the chat conversation interface, or displaying the corresponding associated video playing indication information in an associated specified area of the user identifier of each chat object.
8. The method of claim 6 or 7, wherein the method further comprises:
receiving a fifth trigger operation aiming at target video playing indication information in the at least one piece of video playing indication information;
and displaying the video content corresponding to the video indicated by the target video playing indication information through the video playing interface according to the triggering of the fifth triggering operation.
9. The method of claim 1 or 4, wherein the method further comprises:
receiving a sixth trigger operation aiming at the video playing interface or the minimized playing indication information;
and sending video watching invitation information to a chat object corresponding to the chat conversation interface according to the triggering of the sixth triggering operation, wherein the video watching invitation information is determined according to the playing address of the target video.
10. The method of claim 1, wherein the method further comprises:
and when a seventh trigger operation of triggering the voice chat information in the chat conversation interface to be played is received, reducing the playing volume of the target video.
11. The method of claim 1, wherein after displaying the chat conversation interface and the video playback interface corresponding to the target video, the method further comprises:
receiving an eighth trigger operation aiming at the video playing interface;
hiding the chat conversation interface and switching interface display according to the triggering of the eighth triggering operation, and displaying the switched application interface and the video playing interface when the switched application interface is a preset application interface.
12. The method of claim 1, wherein the method further comprises:
receiving a ninth trigger operation aiming at the video switching indication information;
and switching the video played by the video playing interface from the target video to the video to be switched and played according to the triggering of the ninth triggering operation.
13. An apparatus for controlling video playback, the apparatus comprising:
the system comprises a receiving module, a sending module and a receiving module, wherein the receiving module is used for receiving a first trigger operation aiming at video entrance indication information in a chat conversation interface;
the display module is used for displaying the chat conversation interface and a video playing interface corresponding to the target video according to the triggering of the first triggering operation; or displaying a video selection interface according to the triggering of the first triggering operation, and displaying the chat conversation interface and the video playing interface corresponding to the target video according to the triggering of a selection operation for the target video in the video selection interface.
14. A computing device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps comprised by the method according to any one of claims 1 to 12 when executing the computer program.
15. A storage medium storing computer-executable instructions for causing a computer to perform the steps comprising the method of any one of claims 1-12.
CN201911056014.4A 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium Active CN112751744B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911056014.4A CN112751744B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium
CN202210868991.XA CN115051965B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911056014.4A CN112751744B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210868991.XA Division CN115051965B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112751744A true CN112751744A (en) 2021-05-04
CN112751744B CN112751744B (en) 2022-06-21

Family

ID=75645127

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201911056014.4A Active CN112751744B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium
CN202210868991.XA Active CN115051965B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210868991.XA Active CN115051965B (en) 2019-10-31 2019-10-31 Method and device for controlling video playing, computing equipment and storage medium

Country Status (1)

Country Link
CN (2) CN112751744B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489937A (en) * 2021-07-02 2021-10-08 北京字跳网络技术有限公司 Video sharing method, device, equipment and medium
CN113542902A (en) * 2021-07-13 2021-10-22 北京字跳网络技术有限公司 Video processing method and device, electronic equipment and storage medium
CN114296617A (en) * 2022-03-11 2022-04-08 北京麟卓信息科技有限公司 Method for automatically switching playing video of android application
CN115278378A (en) * 2022-07-27 2022-11-01 维沃移动通信有限公司 Information display method, information display device, electronic apparatus, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103391451A (en) * 2013-07-09 2013-11-13 百度在线网络技术(北京)有限公司 Method, system and device for communication during video watching
US20150163453A1 (en) * 2009-08-28 2015-06-11 Apple Inc. Method and apparatus for initiating and managing chat sessions
CN105577519A (en) * 2015-12-21 2016-05-11 曾友国 Method for mutual-help shopping
CN105898627A (en) * 2016-05-31 2016-08-24 北京奇艺世纪科技有限公司 Video playing method and device
CN107948714A (en) * 2017-11-01 2018-04-20 北京小蓦机器人技术有限公司 Video broadcasting method, equipment and storage medium under more people's video-see scenes
CN109842819A (en) * 2017-11-28 2019-06-04 腾讯数码(天津)有限公司 A kind of video playing interactive approach, device, system, user terminal and medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101226560B1 (en) * 2011-03-29 2013-01-25 (주)티아이스퀘어 System and method for providing multidedia content sharing service during communication service
US8943141B2 (en) * 2012-06-18 2015-01-27 Lutebox Ltd. Social networking system and methods of implementation
US9191618B2 (en) * 2012-10-26 2015-11-17 Speedcast, Inc. Method and system for producing and viewing video-based group conversations
CN106412712A (en) * 2016-09-26 2017-02-15 北京小米移动软件有限公司 Video playing method and apparatus
CN106533924A (en) * 2016-12-19 2017-03-22 广州华多网络科技有限公司 Instant messaging method and device
CN106657657B (en) * 2016-12-30 2019-08-30 广州华多网络科技有限公司 Method, the system of mobile terminal and mobile terminal video browsing
CN109089146A (en) * 2018-08-30 2018-12-25 维沃移动通信有限公司 A kind of method and terminal device controlling video playing

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150163453A1 (en) * 2009-08-28 2015-06-11 Apple Inc. Method and apparatus for initiating and managing chat sessions
CN103391451A (en) * 2013-07-09 2013-11-13 百度在线网络技术(北京)有限公司 Method, system and device for communication during video watching
CN105577519A (en) * 2015-12-21 2016-05-11 曾友国 Method for mutual-help shopping
CN105898627A (en) * 2016-05-31 2016-08-24 北京奇艺世纪科技有限公司 Video playing method and device
CN107948714A (en) * 2017-11-01 2018-04-20 北京小蓦机器人技术有限公司 Video broadcasting method, equipment and storage medium under more people's video-see scenes
CN109842819A (en) * 2017-11-28 2019-06-04 腾讯数码(天津)有限公司 A kind of video playing interactive approach, device, system, user terminal and medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489937A (en) * 2021-07-02 2021-10-08 北京字跳网络技术有限公司 Video sharing method, device, equipment and medium
CN113489937B (en) * 2021-07-02 2023-06-20 北京字跳网络技术有限公司 Video sharing method, device, equipment and medium
CN113542902A (en) * 2021-07-13 2021-10-22 北京字跳网络技术有限公司 Video processing method and device, electronic equipment and storage medium
CN113542902B (en) * 2021-07-13 2023-02-24 北京字跳网络技术有限公司 Video processing method and device, electronic equipment and storage medium
CN114296617A (en) * 2022-03-11 2022-04-08 北京麟卓信息科技有限公司 Method for automatically switching playing video of android application
CN114296617B (en) * 2022-03-11 2022-05-06 北京麟卓信息科技有限公司 Method for automatically switching playing video of android application
CN115278378A (en) * 2022-07-27 2022-11-01 维沃移动通信有限公司 Information display method, information display device, electronic apparatus, and storage medium

Also Published As

Publication number Publication date
CN115051965B (en) 2023-03-31
CN115051965A (en) 2022-09-13
CN112751744B (en) 2022-06-21

Similar Documents

Publication Publication Date Title
CN112751744B (en) Method and device for controlling video playing, computing equipment and storage medium
US10942614B2 (en) Terminal device and method for displaying an associated window thereof
WO2023274144A1 (en) Message processing method and apparatus, electronic device, and storage medium
US10630629B2 (en) Screen display method, apparatus, terminal, and storage medium
US20140280603A1 (en) User attention and activity in chat systems
CN109844706B (en) Message processing method and device
US10673790B2 (en) Method and terminal for displaying instant messaging message
CN110536008B (en) Screen projection method and mobile terminal
CN112540821B (en) Information sending method and electronic equipment
CN109521918B (en) Information sharing method and device, electronic equipment and storage medium
KR101719992B1 (en) Mobile terminal and Method for applying metadata thereof
CN111343073B (en) Video processing method and device and terminal equipment
US11652763B2 (en) Information display method and apparatus, and electronic device
CN110830813B (en) Video switching method and device, electronic equipment and storage medium
WO2022171062A1 (en) Method and apparatus for sharing picture, and electronic device and storage medium
CN107729098B (en) User interface display method and device
CN113676395B (en) Information processing method, related device and readable storage medium
EP2838225A1 (en) Message based conversation function execution method and electronic device supporting the same
CN110636318A (en) Message display method, message display device, client device, server and storage medium
CN108989191B (en) Method for withdrawing picture file, control method and device thereof, and mobile terminal
WO2022267433A1 (en) Video resource processing method and apparatus
CN115378893A (en) Message processing method and device, electronic equipment and readable storage medium
CN115718581A (en) Information display method and device, electronic equipment and storage medium
CN112801752B (en) Page display method, device, equipment and medium based on application mall
CN112752160B (en) Method and device for controlling video playing, computing equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40044209

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant