CN113179445A - Video sharing method based on interactive article and interactive article - Google Patents

Info

Publication number
CN113179445A
Authority
CN
China
Prior art keywords
interactive
video
terminal
information
target video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110406610.1A
Other languages
Chinese (zh)
Other versions
CN113179445B (en)
Inventor
邬文捷
陈镜州
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110406610.1A priority Critical patent/CN113179445B/en
Publication of CN113179445A publication Critical patent/CN113179445A/en
Application granted granted Critical
Publication of CN113179445B publication Critical patent/CN113179445B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application relates to a video sharing method based on an interactive article, and to the interactive article itself. The method comprises the following steps: when a first interactive operation occurs between a first terminal and the interactive article, triggering a video acquisition operation to obtain corresponding target video information; when a second interactive operation occurs between the first terminal and the interactive article, triggering transmission of the target video information to the interactive article. The target video information transmitted to the interactive article is used for instructing a second terminal to play the target video corresponding to the target video information when a third interactive operation occurs between the second terminal and the interactive article. By this method, video sharing efficiency can be improved.

Description

Video sharing method based on interactive article and interactive article
Technical Field
The application relates to the technical field of computers, in particular to a video sharing method based on interactive articles and the interactive articles.
Background
With the rapid development of human-computer interaction technology, more and more game applications appear on touch terminals, and users can team up, battle, and so on through these game applications. During a game, players often want to share highlight moments with other players. For example, when a player releases an ultimate skill to secure a key victory, the player may want to share the process of releasing the skill and winning with other players.
The current sharing method is to record a video during the game to obtain a recorded video, upload the recorded video to an online multimedia playing platform, and share the video through that platform. However, sharing video through an online multimedia playing platform is cumbersome: generally, the player needs to create a user account, convert the video to be shared into a specific format required by the platform, and wait for the platform to audit the video before it can be shared, so video sharing efficiency is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a video sharing method, device, computer device, storage medium and computer program based on interactive articles, which can improve video sharing efficiency.
A video sharing method based on an interactive article, the method comprising:
when a first interactive operation occurs between a first terminal and an interactive object, triggering a video acquisition operation to obtain corresponding target video information;
when a second interactive operation occurs between the first terminal and the interactive object, triggering the transmission of the target video information to the interactive object; and the target video information is transmitted to the interactive object and is used for indicating the second terminal to play the target video corresponding to the target video information when a third interactive operation is carried out between the second terminal and the interactive object.
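Read as a protocol, the claim steps above form a capture → deposit → retrieve flow across the two terminals and the article. The following is a minimal sketch of that flow; the class and method names are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch (not from the patent): the three interactive
# operations modeled as methods acting on a shared interactive article.

class InteractiveArticle:
    """Stands in for the physical article's storage."""
    def __init__(self):
        self.stored_video = None

class FirstTerminal:
    def __init__(self):
        self.captured = None

    def first_interaction(self, article):
        # First interactive operation: trigger video acquisition.
        self.captured = "target-video-info"
        return self.captured

    def second_interaction(self, article):
        # Second interactive operation: transmit target video info to the article.
        article.stored_video = self.captured

class SecondTerminal:
    def third_interaction(self, article):
        # Third interactive operation: pull the target video info for playback.
        return article.stored_video

article = InteractiveArticle()
t1, t2 = FirstTerminal(), SecondTerminal()
t1.first_interaction(article)
t1.second_interaction(article)
assert t2.third_interaction(article) == "target-video-info"
```

The point of the sketch is that the article acts as a passive relay: the first terminal never communicates with the second terminal directly.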
A video sharing device based on interactive articles comprises:
The video acquisition module is configured to trigger a video acquisition operation to obtain corresponding target video information when a first interactive operation occurs between the first terminal and the interactive object.
The video transmission module is used for triggering the target video information to be transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object; and the target video information is transmitted to the interactive object and is used for indicating the second terminal to play the target video corresponding to the target video information when a third interactive operation is carried out between the second terminal and the interactive object.
In one embodiment, the video acquisition module is further configured to trigger the cloud server to execute a video acquisition operation through the first terminal when a first interaction operation occurs between the first terminal and an interactive article, so as to obtain corresponding target video information; when second interactive operation occurs between the first terminal and the interactive object, the cloud server is triggered by the first terminal to transmit the target video information to the interactive object.
In one embodiment, the video capture module is further configured to display an interactive article interaction interface when a first interaction operation occurs between the first terminal and the interactive article; and responding to the video acquisition operation triggered by the interactive object interactive interface, and acquiring the corresponding target video information by video acquisition.
In one embodiment, the video sharing device based on the interactive article further comprises an interactive interface display module, configured to display an application home page of an interactive application logged in with a first application account; the application home page displays a bound article bound with the first application account; when a first terminal running an interactive application logged in by a first application account number and an interactive article have first interactive operation, determining whether the interactive article belongs to the bound article; and when the interactive object belongs to the bound object, displaying an interactive object interactive interface through the interactive application.
In one embodiment, the video capture module further comprises a template selection module, configured to display a video capture setting interface in response to a capture start operation triggered at the interactive item interaction interface, and display a set of video templates through the video capture setting interface; in response to a selection operation for the video template set, determining at least one target video template selected by the selection operation; and responding to the video acquisition operation triggered by the video acquisition setting interface, and acquiring the video based on the target video template to obtain target video information.
In one embodiment, the target video information comprises a target video, and the target video template comprises a video display effect and a video interception mode. The template selection module is further configured to, in response to the video acquisition operation triggered on the video acquisition setting interface, perform video acquisition to obtain a collected video; perform video interception processing on the collected video according to the video interception mode to obtain a video clip; and edit the video frames in the video clip through the video display effect to obtain the target video to be shared.
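The capture → intercept → edit pipeline in this embodiment can be sketched in a few lines by treating a video as a list of frames, modeling the interception mode as a start/end window and the display effect as a per-frame transform. All names here are hypothetical illustrations, not the patent's implementation.

```python
# Hypothetical sketch of the capture/intercept/edit pipeline described
# above; frames are modeled as simple strings for clarity.

def intercept(frames, start, end):
    """Video interception mode: keep only the frames in [start, end)."""
    return frames[start:end]

def apply_effect(frames, effect):
    """Video display effect: edit each frame in the clip."""
    return [effect(f) for f in frames]

captured = [f"frame{i}" for i in range(10)]       # collected video
clip = intercept(captured, 3, 6)                  # video clip
target = apply_effect(clip, lambda f: f.upper())  # target video to share

assert target == ["FRAME3", "FRAME4", "FRAME5"]
```

In practice the "effect" would be an overlay or filter and the window would come from the selected video template, but the data flow is the same.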
In one embodiment, the video capture module is further configured to respond to a start operation for a target game application triggered on the video capture setting interface, start the target game application, and display a corresponding game picture through the target game application; and carrying out video recording on the displayed game picture to obtain a collected video.
In one embodiment, the video sharing device based on the interactive article further includes a meeting receiving module, configured to display meeting invitation information transmitted by the interactive article when a fifth interaction operation occurs between the first terminal and the interactive article; the meeting invitation information is generated by the second terminal responding to the meeting invitation operation and is transmitted to the interactive article when a fourth interactive operation occurs between the second terminal and the interactive article; the meeting invitation information is used for requesting to meet with the uploading party of the target video; and responding to a response operation which occurs aiming at the meeting invitation information, and triggering to transmit response information which is specified by the response operation to the second terminal.
In one embodiment, the meeting receiving module is further configured to display a special reminding mark in an application home page of an interactive application when a fifth interactive operation occurs between the first terminal and the interactive item; and responding to the information reading operation aiming at the special reminding mark, and displaying the meeting invitation information corresponding to the special reminding mark.
In one embodiment, the application home page shows a first user head portrait corresponding to a first application account logged in to a first terminal; the meeting receiving module is also used for displaying the special reminding mark at a position corresponding to the first user head portrait; when a preset touch operation aiming at the head portrait of the first user occurs, displaying meeting invitation information corresponding to the special reminding mark; and the meeting invitation information comprises a second user head portrait corresponding to a second application account logged in the second terminal.
In one embodiment, the meeting invitation information comprises a second application account number logged in to the second terminal; the meeting receiving module is also used for responding to the response operation triggered by the meeting invitation information and displaying a response information input area; acquiring meeting time input through the response information input area, and generating corresponding response information according to the meeting time; and sending the response information to the second terminal according to the second application account.
In one embodiment, the meeting receiving module is further configured to present a response information input area in response to a response operation triggered by the meeting invitation information; responding to a response input operation triggered in the response information input area, and displaying response information input through the response input operation; when a sixth interactive operation occurs between the first terminal and the interactive object, triggering to transmit the response information to the interactive object; and the response information is transmitted to the interactive article and is used for indicating the second terminal to display the response information when a seventh interactive operation is carried out between the second terminal and the interactive article.
In one embodiment, the first terminal is provided with a first electronic tag, the interactive article is provided with a second electronic tag, and the second terminal is provided with a third electronic tag; the video sharing device based on the interactive article is further used for triggering video acquisition operation to obtain corresponding target video information when the first electronic tag of the first terminal touches the second electronic tag of the interactive article for the first time; when the first electronic tag of the first terminal touches the second electronic tag of the interactive article again, transmitting the target video information to the interactive article; and the target video information transmitted to the interactive article is used for indicating the second terminal to play the target video corresponding to the target video information when the third electronic tag of the second terminal touches the second electronic tag of the interactive article.
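The electronic-tag embodiment above keys behavior off which terminal's tag touches the article's tag and whether it is that tag's first touch. A hypothetical sketch of that dispatch logic follows; the tag identifiers and payloads are illustrative assumptions.

```python
# Hypothetical sketch of the electronic-tag embodiment: the article counts
# touches per tag and dispatches the corresponding action.

class TagArticle:
    def __init__(self):
        self.touch_counts = {}
        self.video = None

    def on_touch(self, tag_id, payload=None):
        n = self.touch_counts.get(tag_id, 0) + 1
        self.touch_counts[tag_id] = n
        if tag_id == "first-terminal-tag":
            if n == 1:
                return "start-capture"   # first touch: trigger video acquisition
            self.video = payload         # later touch: receive the target video
            return "stored"
        # any other terminal's tag: hand back the stored video for playback
        return self.video

article = TagArticle()
assert article.on_touch("first-terminal-tag") == "start-capture"
article.on_touch("first-terminal-tag", payload="highlight.mp4")
assert article.on_touch("second-terminal-tag") == "highlight.mp4"
```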
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
when a first interactive operation occurs between a first terminal and an interactive object, triggering a video acquisition operation to obtain corresponding target video information;
when a second interactive operation occurs between the first terminal and the interactive object, triggering the transmission of the target video information to the interactive object; and the target video information is transmitted to the interactive object and is used for indicating the second terminal to play the target video corresponding to the target video information when a third interactive operation is carried out between the second terminal and the interactive object.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
when a first interactive operation occurs between a first terminal and an interactive object, triggering a video acquisition operation to obtain corresponding target video information;
when a second interactive operation occurs between the first terminal and the interactive object, triggering the transmission of the target video information to the interactive object; and the target video information is transmitted to the interactive object and is used for indicating the second terminal to play the target video corresponding to the target video information when a third interactive operation is carried out between the second terminal and the interactive object.
A computer program product or computer program, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium, the computer instructions being read by a processor of a computer device from the computer readable storage medium, the processor executing the computer instructions to cause the computer device to perform the steps of:
when a first interactive operation occurs between a first terminal and an interactive object, triggering a video acquisition operation to obtain corresponding target video information;
when a second interactive operation occurs between the first terminal and the interactive object, triggering the transmission of the target video information to the interactive object; and the target video information is transmitted to the interactive object and is used for indicating the second terminal to play the target video corresponding to the target video information when a third interactive operation is carried out between the second terminal and the interactive object.
According to the above video sharing method and device based on the interactive article, the computer device, the storage medium and the computer program, in response to the first interactive operation occurring between the first terminal and the interactive article, the video acquisition operation can be triggered, and the video clip to be acquired is automatically acquired according to the video acquisition operation, so that the target video information is obtained. Transmission of the target video information to the interactive article can be triggered in response to the second interactive operation between the first terminal and the interactive article, so that the second terminal can play the target video corresponding to the target video information through the third interactive operation between the second terminal and the interactive article, thereby sharing the target video. Because the first terminal only needs to upload the automatically acquired target video information to the interactive article through a simple interactive operation, the interactive article becomes a quick uploading channel for the target video, which improves video uploading efficiency and thus video sharing efficiency. Because the second terminal likewise only needs to pull the target video information from the interactive article through a simple interactive operation, the interactive article also becomes a convenient viewing channel, further improving video sharing efficiency.
A video playback method based on an interactive article, the method comprising:
when a third interactive operation occurs between the second terminal and the interactive object, displaying a video playing interface;
responding to a video playing operation triggered in the video playing interface, and playing a corresponding target video based on target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
A video playback device based on interactive articles, comprising:
the playing interface display module is used for displaying a video playing interface when a third interactive operation is carried out between the second terminal and the interactive object;
the video playing module is used for responding to video playing operation triggered in the video playing interface and playing a corresponding target video based on target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
In one embodiment, the playing interface display module is further configured to pull target video information and comment information from the interactive object and display a video playing interface corresponding to the target video information when a third interactive operation occurs between the second terminal and the interactive object.
In one embodiment, the video playing module is further configured to play, in response to a video playing operation triggered in the video playing interface, a target video corresponding to the target video information through the video playing interface, and display comment information of the target video.
In one embodiment, the interactive article-based video playing device further comprises a comment publishing module, configured to display the input to-be-published comment in response to a comment input operation triggered in the message input area; and responding to comment publishing operation triggered by the message input area, and uploading the to-be-published comments to the interactive object when eighth interactive operation occurs between the second terminal and the interactive object so as to publish the to-be-published comments.
In one embodiment, the video playing device based on the interactive article further includes a meeting invitation module, configured to generate meeting invitation information in response to a meeting invitation operation triggered through the video playing interface; when the second terminal and the interactive article perform fourth interactive operation, transmitting the meeting invitation information to the interactive article; and the meeting invitation information transmitted to the interactive article is used for inviting a first user corresponding to the first terminal to meet a second user corresponding to the second terminal.
In one embodiment, the target video information includes a target video, the target video is a video obtained by recording a game picture of a target game application, the target video includes a first game account corresponding to a first user, and the interactive article-based video playing device is further configured to respond to a request for team formation operation occurring in the video playing interface and initiate a team formation request to the first terminal based on the first game account; the team formation request is used for indicating the first terminal to perform team formation interaction with a second game account logged in the second terminal when entering a target game application based on the first game account.
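The team-formation embodiment can likewise be sketched as constructing a request from the first game account shown in the target video and the second account logged in at the viewing terminal. The field names and account identifiers below are assumptions for illustration, not from the patent.

```python
# Hypothetical sketch of the team-formation request described above.

def make_team_request(first_game_account, second_game_account):
    """Build a team-up request from the viewer (inviter) to the player
    whose account appears in the target video (invitee)."""
    return {
        "invitee": first_game_account,   # account seen in the target video
        "inviter": second_game_account,  # account logged in at the second terminal
        "action": "team-up",
    }

request = make_team_request("player-a", "player-b")
assert request["invitee"] == "player-a"
```

The request would then be delivered to the first terminal, which handles it when entering the target game application with the first game account.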
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
when a third interactive operation occurs between the second terminal and the interactive object, displaying a video playing interface;
responding to a video playing operation triggered in the video playing interface, and playing a corresponding target video based on target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
when a third interactive operation occurs between the second terminal and the interactive object, displaying a video playing interface;
responding to a video playing operation triggered in the video playing interface, and playing a corresponding target video based on target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
A computer program product or computer program, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium, the computer instructions being read by a processor of a computer device from the computer readable storage medium, the processor executing the computer instructions to cause the computer device to perform the steps of:
when a third interactive operation occurs between the second terminal and the interactive object, displaying a video playing interface;
responding to a video playing operation triggered in the video playing interface, and playing a corresponding target video based on target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
According to the above playing method and device, computer device, storage medium and computer program, by responding to the third interactive operation between the second terminal and the interactive object, the video playing interface can be displayed through the third interactive operation, so that when a video playing operation is triggered, the target video corresponding to the target video information can be played. Because the second terminal only needs to pull the target video information from the interactive article through a simple interactive operation, the interactive article becomes a convenient viewing channel, and video sharing efficiency is improved.
Drawings
FIG. 1 is a diagram of an application environment of a video sharing method based on interactive articles in one embodiment;
FIG. 2 is a schematic flow chart illustrating a video sharing method based on interactive articles according to an embodiment;
FIG. 3 is a schematic interface diagram of an interactive item interaction interface in one embodiment;
FIG. 4 is a schematic diagram of a page of an application home page in one embodiment;
FIG. 5 is a schematic interface diagram of a video capture settings interface in one embodiment;
FIG. 6 is a diagram of a video preview interface in one embodiment;
FIG. 7 is an interface diagram of a target gaming application in one embodiment;
FIG. 8 is a diagram of meeting invitation information in one embodiment;
FIG. 9 is a diagram of a special reminder mark in one embodiment;
FIG. 10 is a diagram illustrating the presentation of response messages in one embodiment;
FIG. 11 is an interface diagram of a response message preview interface in one embodiment;
FIG. 12 is a flowchart illustrating a video sharing method based on interactive articles according to an embodiment;
FIG. 13 is a diagram of a video playback interface in one embodiment;
FIG. 14 is a schematic diagram illustrating a meeting invitation information preview interface in one embodiment;
FIG. 15 is a flowchart illustrating a video sharing method based on interactive articles according to an exemplary embodiment;
FIG. 16 is a flowchart illustrating a method for video playback based on interactive articles according to an exemplary embodiment;
FIG. 17 is a diagram illustrating an embodiment of a scene with face-to-face communication based on a target video;
FIG. 18 is a diagram illustrating a scenario for playing a target video in one embodiment;
FIG. 19 is a diagram illustrating a scenario for playing a target video in one embodiment;
FIG. 20 is a block diagram of an embodiment of an interactive article based video sharing apparatus;
FIG. 21 is a block diagram of an interactive article based video player device according to an embodiment;
FIG. 22 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 1 is a diagram of an application environment of a video sharing method based on interactive articles according to an embodiment. Referring to fig. 1, the video sharing method based on interactive articles is applied to a video sharing system 100 based on interactive articles. The interactive item based video sharing system 100 includes a first terminal 102, a second terminal 104, an interactive item 106, and a server 108. When a first interaction operation occurs between the first terminal 102 and the interactive item 106, the first terminal 102 may perform video acquisition to obtain target video information, or trigger the server 108 to perform video acquisition to obtain the target video information. When a second interactive operation occurs between the first terminal 102 and the interactive item 106, the first terminal 102 may upload the target video information to the interactive item 106, so that when a third interactive operation occurs between the second terminal 104 and the interactive item 106, the target video corresponding to the target video information is displayed through the second terminal 104. Alternatively, when the second interactive operation occurs between the first terminal 102 and the interactive item 106, the first terminal 102 triggers the server 108 to transmit the target video information to the interactive item 106, so that when the third interactive operation occurs between the second terminal 104 and the interactive item 106, the target video corresponding to the target video information is displayed through the second terminal 104. The first terminal 102 and the second terminal 104 may be, but are not limited to, various personal computers, notebook computers, smart phones, tablet computers, and portable wearable devices, and the interactive item 106 may be a handheld article capable of interacting with the terminals.
The server 108 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN, and big data and artificial intelligence platforms. The terminal may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like.
It should be understood that the use of "first," "second," and similar terms in the present disclosure is not intended to indicate any order, quantity, or importance, but rather to distinguish one element from another. The singular forms "a," "an," or "the" and similar referents do not denote a limitation of quantity, but rather the presence of at least one, unless the context clearly dictates otherwise. The first to eighth interactive operations in the present disclosure may each be at least one of a touch operation, a scan operation, a start operation, and a press operation.
In an embodiment, as shown in fig. 2, a video sharing method based on interactive items is provided. The method is described using the example of being applied to a computer device in fig. 1 (the computer device may specifically be the first terminal 102 or the server 108 in fig. 1), and includes the following steps:
step S202, when a first interactive operation occurs between the first terminal and the interactive object, a video acquisition operation is triggered to obtain corresponding target video information.
The interactive item refers to an item capable of interaction; for example, the interactive item may be an item that exchanges information with a terminal through Near Field Communication (NFC), a network, an optical fiber, and the like. In one embodiment, the interactive item may be a handheld item with an induction coil installed inside; when the induction coil comes close to the coil in the terminal, an induced signal is generated, and the interactive item can exchange information with the terminal through the induced signal. An interactive operation refers to an operation of interacting with an interactive item. In one embodiment, the interactive operation includes at least one of a touch operation, a scan operation, a turn-on operation, and a press operation. The target video information may specifically be at least one of video data of the target video and a target storage address of the target video.
Specifically, before the target video to be shared is captured, the first user can hold the first terminal and perform the first interactive operation with the interactive item, so that the first terminal can trigger the video acquisition operation based on the first interactive operation to obtain the corresponding target video information.
In one embodiment, when a first interaction operation occurs between the first terminal and the interactive object, the first terminal can record a video picture played by the first terminal to obtain a target video. For example, when the first terminal touches the interactive object, the first terminal can record an application picture played by the target application to obtain a target video.
In one embodiment, the server may specifically be a cloud server providing a cloud game, and when a first user runs the cloud game through a first terminal and a first interaction operation occurs between the first terminal and an interactive object, the first terminal may generate a video acquisition instruction and send the video acquisition instruction to the cloud server, so that the cloud server performs video recording on a game process of the cloud game based on the video acquisition instruction to obtain a target video and a target storage address of the target video in the cloud server.
In one embodiment, the first terminal has a first electronic tag for induction communication, and the interactive item has a second electronic tag for induction communication. The server may specifically be a cloud server providing a cloud game. When the first user runs the cloud game through the first terminal, the distance between the first terminal and the interactive item is smaller than or equal to the standard distance required for induction communication, and the first terminal and the interactive item perform induction communication through the first electronic tag and the second electronic tag, the first terminal can generate a video acquisition instruction and send it to the cloud server, so that the cloud server records the game process of the cloud game based on the video acquisition instruction. When it is determined that induction communication cannot be performed because the distance between the first terminal and the interactive item is larger than the standard distance, the cloud server finishes video acquisition to obtain the target video and the target storage address of the target video in the cloud server. For example, while the first user plays the cloud game through the first terminal, when the first terminal is close to the interactive item, the cloud server can automatically capture the game process of the cloud game; when the first terminal moves away from the interactive item, the cloud server finishes capturing the game process to obtain the target video.
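The proximity-triggered capture in this embodiment can be sketched as a small state machine: recording runs while the terminal-to-item distance stays within the standard induction-communication distance and the clip is finalized once the distance exceeds it. This is a minimal illustrative sketch; the class name, method names, and the 4 cm threshold are assumptions, not details from this application.

```python
class ProximityRecorder:
    """Capture frames while the terminal is within induction range of the
    interactive item; finish the clip once the terminal moves away."""

    def __init__(self, standard_distance_cm: float = 4.0):
        self.standard_distance_cm = standard_distance_cm  # assumed threshold
        self.recording = False
        self.frames = []

    def on_distance_sample(self, distance_cm, frame=None):
        """Feed one distance reading (plus the current game frame, if any).

        Returns the finished clip (a list of frames) when recording ends,
        otherwise None.
        """
        if distance_cm <= self.standard_distance_cm:
            self.recording = True          # terminal is close: keep capturing
            if frame is not None:
                self.frames.append(frame)
            return None
        if self.recording:                 # terminal moved away: end capture
            self.recording = False
            clip, self.frames = self.frames, []
            return clip
        return None


recorder = ProximityRecorder()
recorder.on_distance_sample(2.0, "frame-1")   # close: recording starts
recorder.on_distance_sample(3.0, "frame-2")   # still close: keep recording
clip = recorder.on_distance_sample(10.0)      # moved away: clip is finished
```

In a real system the distance samples would come from the success or failure of induction communication between the two electronic tags rather than an explicit measurement.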
In one embodiment, when a first interaction operation occurs between a first terminal and an interactive object, a video capture operation is triggered to obtain corresponding target video information, including: when a first interactive operation occurs between the first terminal and the interactive object, displaying an interactive object interactive interface; and responding to the video acquisition operation triggered on the interactive object interactive interface, and acquiring the corresponding target video information by video acquisition.
Specifically, before a target video to be shared is acquired, a first user can hold a first terminal by hand to perform first interactive operation with an interactive object, so that the first terminal displays an interactive object interactive interface based on the first interactive operation. The interactive object interactive interface refers to an interface which is correspondingly set in the process of acquiring the target video. The first user can trigger video acquisition operation through the displayed interactive article interactive interface, so that the first terminal performs video acquisition based on the video acquisition operation of the first user to obtain a target video. For example, the first user may trigger the first terminal to perform video capture by clicking a "yes" control in fig. 3, so as to obtain the target video information.
In one embodiment, the interactive item and the first terminal are each provided with an induction coil for near field communication. Before the target video to be shared is captured, the first user can bring the induction coil of the interactive item into contact with the induction coil in the first terminal, so that the first terminal displays the interactive item interaction interface in response to the first user's touch operation.
In one embodiment, referring to fig. 3, when the first interactive operation occurs, the first terminal may correspondingly display the interactive item interaction interface shown in fig. 3. The interaction interface includes a video acquisition start selection control 302, so that the first terminal can determine whether to start the video acquisition mode according to the first user's triggering operation on the control. For example, when the first user clicks the "Yes" control for starting the video acquisition mode, the first terminal starts the video acquisition mode and captures the corresponding video to obtain the target video; when the first user clicks the "No" control for closing the video acquisition mode, the first terminal closes the video acquisition mode and suspends acquisition of the target video. FIG. 3 is a diagram illustrating an interactive item interaction interface, according to an embodiment.
In one embodiment, the first interactive operation includes, but is not limited to, at least one of a touch operation, a scan operation, a turn-on operation, and a press operation. For example, when the first terminal is provided with a first electronic tag for NFC communication and the interactive item is provided with a second electronic tag for NFC communication, the first terminal may display the interactive item interaction interface after touching the second electronic tag of the interactive item. For another example, when a two-dimensional code is displayed on the interactive item, the first terminal may display the interaction interface by scanning the two-dimensional code. For another example, when the interactive item has a switch for triggering the first terminal to display the interaction interface, the first user can trigger the switch to cause the interactive item to send an interface opening instruction to the first terminal, so that the first terminal opens the interaction interface according to the instruction.
In one embodiment, the video capture operation may specifically be a start operation for the target application. The interactive article interactive interface can display an application icon of a preset target application, when it is determined that a first user starts the target application by clicking the application icon, the first terminal can display an application picture of the started target application and record videos of the displayed application picture to obtain target video information. For example, when the target application is a confrontation game application, the first user can start the confrontation game application through the interactive article interaction interface and play a confrontation game in the confrontation game application, so that the first terminal can record videos of the confrontation game process to obtain target video information.
In one embodiment, the video capture operation includes a video capture mode start operation and a launch operation for the target application. When the first user starts the video capture mode through the interactive article interaction interface shown in fig. 3, the first terminal monitors whether the target application is started in real time, and records an application picture displayed by the target application when the first user is determined to start the target application, so as to obtain a target video. The first user can start the target application by clicking an application icon in an application icon display interface of the first terminal. The application icon display interface is used for displaying the application icons.
In one embodiment, the video capture operation may be a video download operation. The interaction interface can display a download link of the target video to be downloaded, and when the user clicks the download link, the first terminal can download the target video based on it.
In one embodiment, when a first user enters a multimedia playing application through an interactive object interaction interface, a first terminal can determine a video to be played selected by the first user, and play and record the video to be played, so that a target video is obtained.
Step S204, when a second interactive operation occurs between the first terminal and the interactive object, the target video information is triggered to be transmitted to the interactive object; and the target video information is transmitted to the interactive object and used for indicating the second terminal to play the target video corresponding to the target video information when third interactive operation occurs between the second terminal and the interactive object.
Specifically, after the target video information to be shared is obtained, when the first terminal performs a second interactive operation with the interactive object, a communication link is established with the interactive object, and the target video is transmitted to the interactive object through the established communication link. When the interactive object determines that the second terminal has the third interaction with the interactive object, the interactive object can transmit the target video information to the second terminal so as to trigger the second terminal to play the target video corresponding to the target video information, and therefore the target video is shared.
In one embodiment, the target video information includes a target video, and when the target video is obtained, the first user may touch a first electronic tag carried by the first terminal and a second electronic tag carried by the interactive object, so as to establish a near field communication link between the first terminal and the interactive object through the electronic tags, so that the first terminal may transmit the target video to the interactive object through the near field communication link. When a second user with a second terminal desires to view a target video uploaded to the interactive object by the first user, the second user can touch a third electronic tag carried by the second terminal with a second electronic tag carried by the interactive object, so that the interactive object can transmit the target video to the second terminal through the near field communication link, and the second terminal can play the target video.
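The upload-then-pull flow in this embodiment amounts to the interactive item acting as a short-range store between the two terminals. The sketch below models only that data flow; the class and method names are invented for illustration and the near field transport itself is abstracted away.

```python
class InteractiveItem:
    """Hypothetical model of the interactive item as a short-range relay:
    the second interactive operation uploads the target video information
    to the item, and the third interactive operation pulls it out again."""

    def __init__(self):
        self._video_info = None

    def on_second_interaction(self, video_info):
        # First terminal touches the item: store the uploaded information.
        self._video_info = video_info

    def on_third_interaction(self):
        # Second terminal touches the item: hand over the stored information.
        return self._video_info


item = InteractiveItem()
# Target video information may carry video data, a storage address, or both.
item.on_second_interaction({"video_data": b"...", "storage_address": None})
pulled = item.on_third_interaction()
```

The same shape covers both variants in the text: when `video_data` is carried, the second terminal plays it directly; when only `storage_address` is carried, the second terminal plays the target video online from that address.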
In one embodiment, the target video information includes a target storage address of the target video in the cloud server. When the cloud server obtains the target video based on the video acquisition operation, the cloud server can send the target storage address of the target video to the first terminal through the network, and therefore when the first terminal carries out second interactive operation with the interactive object, the target storage address is sent to the interactive object through the first terminal. And the target storage address sent to the interactive article is used for indicating the second terminal to play the target video on line through the target storage address when third interactive operation occurs between the second terminal and the interactive article.
In one embodiment, when the cloud server obtains a target video based on a video acquisition operation, and when a second interaction operation occurs between the first terminal and an interactive article, the first terminal can generate a video sharing instruction and send the video sharing instruction to the cloud server, so that the cloud server finishes video acquisition, and can send the acquired target video or a target storage address of the target video to the interactive article through a network.
In one embodiment, the interactive article can display a sharing two-dimensional code for video sharing, and when the first terminal transmits the target video to the interactive article, the second terminal can acquire the target video by scanning the sharing two-dimensional code and play the acquired target video.
In one embodiment, when a third interactive operation occurs between the second terminal and the interactive item, the second terminal may display first prompt information for prompting whether to acquire the target video, for example, "Download the target video?". When determining that the second user triggers the acquisition operation for acquiring the target video, for example, when determining that the second user clicks the "acquire target video" control, the second terminal pulls the target video from the interactive item.
According to the video sharing method based on the interactive articles, the video acquisition operation can be triggered based on the first interactive operation by responding to the first interactive operation between the first terminal and the interactive articles, and the video clip to be acquired is automatically acquired according to the video acquisition operation, so that the target video information is obtained. The target video information can be triggered to be transmitted to the interactive object by responding to the second interactive operation between the first terminal and the interactive object, so that the second terminal can play the target video corresponding to the target video information through the third interactive operation between the second terminal and the interactive object, and the target video can be shared. Because the first terminal only needs to upload the automatically acquired target video information to the interactive object through simple interactive operation, the interactive object becomes a quick uploading channel of the target video, the uploading efficiency of uploading the video is improved, and the sharing efficiency of sharing the video is improved. Because the second terminal also only needs to pull the target video from the interactive object through simple interactive operation, the interactive object becomes a convenient viewing channel, and the video sharing efficiency is further improved.
In an embodiment, the video sharing method based on the interactive article can be specifically executed through a cloud server. The video sharing method based on the interactive articles comprises the following steps: when a first interactive operation occurs between a first terminal and an interactive article, triggering a cloud server to execute a video acquisition operation through the first terminal to obtain corresponding target video information; when a second interactive operation occurs between the first terminal and the interactive object, the cloud server is triggered by the first terminal to transmit the target video information to the interactive object; and the target video information is transmitted to the interactive object and used for indicating the second terminal to play the target video corresponding to the target video information when third interactive operation occurs between the second terminal and the interactive object.
Specifically, when the first terminal runs a cloud game and first interaction operation occurs between the first terminal and an interactive object, for example, when the first terminal touches the interactive object, the first terminal can generate a video acquisition instruction and send the video acquisition instruction to a cloud server through a network. When the cloud server receives the video acquisition instruction, the video frames generated when the cloud game runs can be acquired based on the video acquisition instruction. When the second interactive operation occurs between the first terminal and the interactive object, for example, the first terminal collides with the interactive object again, or the distance between the first terminal and the interactive object exceeds a preset distance, the first terminal can generate a video sharing instruction and send the video sharing instruction to the cloud server. The cloud server responds to the video sharing instruction to stop video recording, obtains a target video based on the acquired video frames, and correspondingly stores the obtained target video. The cloud server can send the target video or the target storage address to the interactive object through the network. And the target storage address sent to the interactive article is used for indicating the second terminal to play the target video on line based on the target storage address when third interactive operation occurs between the second terminal and the interactive article.
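The cloud server's handling of the two instructions described above can be sketched as a small session object: the video acquisition instruction starts recording, game frames accumulate while recording is active, and the video sharing instruction stops recording, stores the clip, and yields a storage address. This is an illustrative sketch; the names and the `cloud://` address scheme are assumptions, not the application's actual protocol.

```python
class CloudRecordingSession:
    """Sketch of the cloud server's recording flow for one cloud-game run."""

    def __init__(self):
        self._recording = False
        self._frames = []
        self._store = {}  # stand-in for the server's video storage

    def on_video_acquisition_instruction(self):
        # First interactive operation occurred: begin collecting frames.
        self._recording = True
        self._frames = []

    def on_game_frame(self, frame):
        if self._recording:
            self._frames.append(frame)

    def on_video_sharing_instruction(self):
        # Second interactive operation occurred: stop, store, and return
        # the target storage address to send to the interactive item.
        self._recording = False
        address = f"cloud://videos/{len(self._store)}"  # hypothetical scheme
        self._store[address] = list(self._frames)
        return address


session = CloudRecordingSession()
session.on_video_acquisition_instruction()
session.on_game_frame("f1")
session.on_game_frame("f2")
address = session.on_video_sharing_instruction()
```

The address returned here is what the server would push to the interactive item, so that the second terminal can later play the target video online from it.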
In one embodiment, when a second interactive operation occurs between the first terminal and the interactive object, the cloud server may send the target video or the target storage address to the first terminal through the network. And when the interactive operation occurs again between the first terminal and the interactive object, the target video or the target storage address is transmitted to the interactive object through the first terminal.
In this embodiment, through the interaction between the first terminal and the interactive item, the cloud server can be triggered to automatically record and capture a video, and the target video or the target storage address is sent to the interactive item by the cloud server. Therefore, when the third interactive operation occurs between the second terminal and the interactive item, the second terminal can obtain the target video or the target storage address from the interactive item. When the second terminal acquires the target storage address, it can play the corresponding target video online based on that address, thereby improving the sharing efficiency of the target video.
In one embodiment, when a first interaction operation occurs between the first terminal and the interactive object, the displaying of the interactive object interactive interface includes: displaying an application home page of the interactive application logged in by the first application account; the application home page displays a bound article bound with the first application account; when a first terminal running an interactive application logged in by a first application account and an interactive article are subjected to first interactive operation, determining whether the interactive article belongs to a bound article; and when the interactive object belongs to the bound object, displaying the interactive object interactive interface through the interactive application.
Specifically, an interactive application runs in the first terminal, and the first user can log in the interactive application through the first application account and interact with the interactive object through the interactive application. When the first user desires to interact with the interactive article, the first user can start the interactive application, so that the first terminal responds to the application starting operation and displays the application home page. And the application home page displays a bound article bound with the first application account. When a first interactive operation occurs between the first terminal and the interactive object, the first terminal can determine whether the interactive object with the first interactive operation is a bound object according to the pre-stored object identifier of the bound object and the first application account, and display an interactive object interactive interface when the interactive object is determined to be the bound object.
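The binding check described here (comparing the item identifier of the interacting item against the items pre-bound to the first application account) reduces to a membership test. The sketch below assumes a simple mapping from account identifiers to sets of bound item identifiers; that data model and all names are illustrative.

```python
def is_bound_item(item_id: str, account_id: str, bindings: dict) -> bool:
    """Return True when the interactive item that triggered the first
    interactive operation is among the items bound to the logged-in
    application account; only then is the interaction interface shown."""
    return item_id in bindings.get(account_id, set())


# Assumed bindings: the first application account has two bound items.
bindings = {"first_account": {"item-001", "item-002"}}
```

In practice the item identifier would be read from the item's electronic tag during the first interactive operation, and the bindings would be kept by the interactive application or the server.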
In one embodiment, referring to fig. 4, when the first user starts the interactive application, the first terminal may present an application home page as shown in fig. 4. Where 402 is the displayed bound item. FIG. 4 is a diagram of a page of an application home page in one embodiment.
In one embodiment, the application home page may also show a first user avatar 404 of the first user and an item image 406 of the interactive item. The item image may be a two-dimensional image of the interactive item, or a CG (Computer Graphics) three-dimensional animation displaying the interactive item.
In one embodiment, the application home page is exposed with an interaction mode initiation control 408 to initiate an interaction mode. When the first user starts the interactive mode by clicking the interactive mode starting control, the first terminal can respond to the interactive operation with the interactive object, for example, the interactive object interactive interface can be displayed in response to the first interactive operation. Wherein, the interactive mode refers to a mode allowing interaction with the interactive object. When the interactive mode is in the open state, the first terminal can respond to the interactive operation with the interactive object.
In one embodiment, the application home page shows a target application starting control 410, and when it is determined that the first user clicks the target application starting control 410, the first terminal starts the target application and records an application picture of the target application to obtain a target video.
In one embodiment, the first user may bind the interactive item to be bound with the first application account of the first user through the interactive application, so as to obtain a bound item.
In the embodiment, when the interactive object is determined to be the bound object, the interactive object interactive interface is displayed, so that the interactive safety between the first terminal and the interactive object can be improved, and the probability that the target video is uploaded to the interactive object by the non-bound terminal is reduced.
In one embodiment, in response to a video capture operation triggered on the interactive object interaction interface, performing video capture to obtain corresponding target video information includes: responding to the acquisition starting operation triggered on the interactive article interactive interface, displaying a video acquisition setting interface, and displaying a video template set through the video acquisition setting interface; in response to a selection operation for the video template set, determining at least one target video template selected by the selection operation; and responding to the video acquisition operation triggered by the video acquisition setting interface, and acquiring the video based on the target video template to obtain the target video information.
Specifically, when the first user desires to acquire the target video, the first user can trigger acquisition starting operation through the interactive article interaction interface and start the video acquisition mode, so that the first terminal can respond to the acquisition starting operation and display a video acquisition setting interface. The video acquisition mode refers to a mode allowing acquisition of a target video, and when the video acquisition mode is in an open state, the first terminal can respond to a video acquisition operation triggered in a video acquisition setting interface to acquire a video to be acquired.
Further, a video template set is displayed in the video acquisition setting interface. The first terminal can respond to the selection operation of the video template set displayed on the video setting interface, determine at least one target video template selected by the first user, and edit the acquired video acquired through the determined target video template to obtain the target video.
In one embodiment, referring to fig. 5, the first terminal may present the video template 502 through a video capture setting interface, and the first user may select a target video template in the set of video templates through the video capture setting interface. For example, the first user may select the target video template by swiping left or right. FIG. 5 illustrates an interface diagram of a video capture settings interface in one embodiment.
In one embodiment, the first terminal can send the target video template selected by the first user to the cloud server, so that the cloud server collects the video to be collected based on the target video template to obtain the target video and the target storage address. For example, the cloud server may capture the game progress of the cloud game based on the target video template.
In one embodiment, the target video information comprises a target video, and the target video template comprises a video presentation effect and a video interception mode. In response to the video acquisition operation triggered through the video capture setting interface, performing video capture based on the target video template to obtain the target video information includes: in response to the video acquisition operation triggered through the video capture setting interface, capturing video to obtain a captured video; performing video interception processing on the captured video through the video interception mode to obtain a video clip; and editing the video frames in the video clip through the video presentation effect to obtain the target video to be shared.
Specifically, the first terminal responds to a video acquisition operation triggered in a video acquisition setting interface to obtain an acquired video, and carries out video interception processing on the acquired video through a video interception mode in a target video template to obtain a video clip. For example, when the video capture mode is to capture the last two minutes of the captured video, the first terminal may capture a video clip of the last two minutes from the captured video.
When the video clip is obtained, the first terminal can edit each video frame in the video clip through the video display effect in the target video template to obtain the target video to be shared. For example, the first terminal may add filters to video frames in the video clip based on the video presentation effect. For another example, the first terminal may adjust the brightness and color saturation of the respective video frame in the video clip based on the video presentation effect.
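The two template components above form a small pipeline: the interception mode selects a clip from the captured video, then the presentation effect edits each remaining frame. The sketch below assumes a "keep the last N seconds" interception mode, integer stand-ins for frame data, and a brightness effect; all of these are illustrative choices, not the application's actual template format.

```python
def apply_template(frames, fps, keep_last_seconds, effect):
    """Intercept the tail of the captured video per the template's
    interception mode, then apply its presentation effect per frame."""
    keep = int(keep_last_seconds * fps)
    clip = frames[-keep:] if 0 < keep < len(frames) else list(frames)
    return [effect(frame) for frame in clip]


def brighten(pixel_value):
    # Stand-in presentation effect: raise brightness, clamped at 255.
    return min(255, pixel_value + 10)


# Ten "frames" (ints standing in for frame data) at 2 fps; keep last 2 s.
target_video = apply_template(list(range(10)), fps=2,
                              keep_last_seconds=2, effect=brighten)
```

A filter or saturation adjustment would slot in as a different `effect` function, and "capture the last two minutes" from the text corresponds to `keep_last_seconds=120`.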
In one embodiment, referring to fig. 6, when the target video is obtained, the first terminal may display the target video through a video preview interface, so that the first user previews the acquired target video through the video preview interface. FIG. 6 illustrates a schematic diagram of a video preview interface in one embodiment.
In one embodiment, the target video template may also contain only a video presentation effect. When the captured video is obtained, the first terminal can edit the video frames of the captured video through the video presentation effect in the template to obtain an edited video. The video preview interface may contain a video interception control for intercepting the edited video. When the first terminal previews the edited video through the video preview interface, the first user can perform video interception processing on the previewed edited video through the video interception control to obtain the target video. For example, the video interception control may be a progress drag bar, and the first user may determine a capture start point and a capture end point of the video by dragging the progress drag bar, so that the first terminal performs video interception processing on the edited video according to the determined start and end points.
In one embodiment, the first terminal may display second prompt information through the video preview interface to prompt the first user to trigger the second interactive operation with the interactive item. For example, the first terminal can display prompt information indicating that the target video can be transmitted to the interactive item by touching the interactive item.
It is easy to understand that, when receiving a video acquisition instruction sent by the first terminal, the cloud server can also capture video based on the video presentation effect and the video interception mode in the target video template to obtain the target video.
In the embodiment, by selecting the target video template, the display effect of the target video can be flexibly configured based on the video template in the video template set, so that the user experience is greatly enhanced.
In one embodiment, in response to a video capture operation triggered through a video capture setting interface, performing video capture to obtain a captured video includes: starting the target game application in response to the starting operation aiming at the target game application triggered on the video acquisition setting interface, and displaying a corresponding game picture through the target game application; and carrying out video recording on the displayed game picture to obtain a collected video.
Specifically, the target video may be a video clip captured for a highlight moment during the game. The first terminal may start the target game application in response to a start operation for the target game application triggered in the video capture setting interface, so that the first user may enter the game through the started target game application. When the first user performs game interaction, the first terminal can record the game process of the game interaction in real time to obtain a collected video.
In one embodiment, referring to fig. 5, a target game application launch control 504 may also be displayed in the video capture setting interface, and the first user may launch the target game application by clicking on the target game application launch control to enter the target game.
In one embodiment, referring to fig. 7, when the target game application is launched and the game is entered, the first terminal may present a video recording end control 702 in the game interface. When the first terminal determines that the first user clicks the video recording end control, the first terminal ends the recording of the game process to obtain the collected video. For example, when the first user defeats the opposing player with a decisive move, the first user can click the video recording end control to end the recording and obtain the collected video, and capture the highlight game process of defeating the opposing player from the collected video through the selected target video template to obtain the target video. FIG. 7 illustrates an interface diagram of a target gaming application in one embodiment.
In one embodiment, the game interface may further display a continuous recording control, and when the first user clicks the video recording ending control to obtain the target video and wants to acquire the next segment of the target video, the first user may click the continuous recording control, so that the first terminal responds to the click operation of the first user to record the game picture again to obtain the next segment of the target video.
In one embodiment, when video recording is performed on the target game application, the first terminal may display a video recording prompt 704 as shown in fig. 7 in an application interface of the target game application to prompt the first user that the recording is being performed on the game process.
In the above embodiment, the first user can record a video of the game process simply by triggering the start operation for the target game application, so that the efficiency of video recording is greatly improved.
In one embodiment, when a third interactive operation occurs between the second terminal and the interactive item and the target video is played, the second terminal may send meeting invitation information to the first terminal, where the process of the first terminal receiving and responding to the meeting invitation information includes: when the fifth interactive operation is carried out between the first terminal and the interactive article, the meeting invitation information transmitted by the interactive article is displayed; the meeting invitation information is generated by the second terminal responding to the meeting invitation operation and is transmitted to the interactive article when the fourth interactive operation occurs between the second terminal and the interactive article; the meeting invitation information is used for requesting to meet with the uploading party of the target video; and triggering the response information specified by the response operation to be transmitted to the second terminal in response to the response operation occurring aiming at the meeting invitation information.
Specifically, after the second user corresponding to the second terminal watches the target video, the second user may select whether to meet the first user. When a second user desires to meet the first user, the second terminal can respond to the meeting invitation operation of the second user to generate meeting invitation information, and when a fourth interactive operation is performed between the second terminal and the interactive object, the meeting invitation information is transmitted to the interactive object bound with the first application account.
Further, when a fifth interactive operation occurs between the first terminal and the interactive article, the interactive article can send the meeting invitation information transmitted by the second terminal to the first terminal, so that the first terminal displays the meeting invitation information. The first user can select whether to answer the meeting invitation, and when determining to answer the meeting invitation, the first terminal responds to the answering operation occurring aiming at the meeting invitation information, generates answering information and transmits the answering information to the second terminal.
In one embodiment, when the fifth interactive operation occurs between the first terminal and the interactive article, the first terminal may display the meeting invitation information in the form of a pop-up window. As shown in fig. 8, the first terminal may present meeting invitation information 804 through a session popup 802. The meeting invitation information 804 includes a second user avatar 806 corresponding to the second application account of the second terminal and a meeting signal 808. By displaying the second user avatar and the meeting signal, the first user and the second user can confirm each other's identity based on them, thereby reducing the probability of meeting the wrong person. FIG. 8 illustrates a diagram showing meeting invitation information in one embodiment.
In one embodiment, referring to fig. 8, the first terminal may further display a response control 810 through the session popup 802, and when it is determined that the first user clicks the "accept" control, the first terminal generates corresponding response information according to a click operation of the first user, and sends the response information to the second terminal.
In one embodiment, when the second terminal generates the meeting invitation information, the second user may touch the third electronic tag in the second terminal against the second electronic tag of the interactive article, so as to transmit the meeting invitation information to the interactive article through the touch operation. Correspondingly, the first user can touch the first electronic tag in the first terminal against the second electronic tag of the interactive article, so as to receive the meeting invitation information transmitted by the interactive article through the touch operation. Because only a simple touch operation needs to be performed, the meeting invitation information can be transmitted to the first terminal through the interactive article, so that the efficiency of issuing meeting invitations is greatly improved.
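The invitation relay described above can be sketched as a small buffer held by the interactive article: a touch from the second terminal deposits the invitation, and a touch from the first terminal collects it. The class and method names are illustrative assumptions.

```python
# Illustrative sketch: the interactive article buffers a meeting invitation
# between the fourth interactive operation (deposit) and the fifth (delivery).
class InteractiveArticle:
    def __init__(self):
        self._pending = {}  # first application account -> meeting invitation

    def on_touch_from_second_terminal(self, invitation):
        """Fourth interactive operation: store the invitation addressed to
        the account bound to this article."""
        self._pending[invitation["to_account"]] = invitation

    def on_touch_from_first_terminal(self, first_account):
        """Fifth interactive operation: hand the buffered invitation to the
        first terminal, if one is waiting."""
        return self._pending.pop(first_account, None)
```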
In the above embodiment, by generating the meeting invitation information, a meeting with the uploader of the target video can be requested based on the meeting invitation information, making it convenient for user objects in the same location to get acquainted and communicate face to face.
In one embodiment, when a fifth interactive operation occurs between the first terminal and the interactive item, the displaying of the meeting invitation information transmitted by the interactive item includes: when a fifth interactive operation occurs between the first terminal and the interactive object, displaying a special reminding mark in an application home page of the interactive application; and displaying the meeting invitation information corresponding to the special reminding mark in response to the information reading operation aiming at the special reminding mark.
Specifically, when a fifth interactive operation occurs between the first terminal and the interactive article, the first terminal may display a preset special reminding mark in an application home page of the interactive application, and when it is determined that the first user triggers an information reading operation for the special reminding mark, the first terminal displays the corresponding meeting invitation information in response to the information reading operation. For example, the first terminal may display the special reminder mark in the application home page in a form of a popup window, and when it is determined that the first user clicks the popup window, display the meeting invitation information corresponding to the special reminder mark.
In this embodiment, by displaying the special reminding mark, the first user can be reminded that meeting invitation information has been received, so that the first user can choose whether to view the meeting invitation information according to their own needs, and thus the user experience is greatly improved.
In one embodiment, the application home page shows a first user head portrait corresponding to a first application account logged in to a first terminal; displaying a special reminder mark in an application home page of the interactive application, comprising: displaying a special reminding mark at a position corresponding to the first user head portrait; responding to the information reading operation aiming at the special reminding mark, and displaying the meeting invitation information corresponding to the special reminding mark, wherein the meeting invitation information comprises the following steps: when a preset touch operation aiming at the head portrait of the first user occurs, displaying meeting invitation information corresponding to the special reminding mark; the meeting invitation information comprises a second user head portrait corresponding to a second application account logged in the second terminal.
Specifically, the application home page may further display a first user avatar bound to the first application account, and when meeting invitation information transmitted by the interactive article is received, the first terminal may display a special reminding mark at a position corresponding to the first user avatar, so that the first user may touch the first user avatar through a preset touch operation to trigger the first terminal to display the meeting invitation information. For example, the first user may trigger the first terminal to present the meeting invitation information by clicking on the first user avatar.
In one embodiment, referring to fig. 9, the first terminal may display a special reminder 902 at the avatar of the first user, so that when the first user touches the avatar of the first user, the corresponding meeting invitation information is displayed. The meeting invitation information includes a second user avatar 904 corresponding to a second application account logged in to the second terminal. FIG. 9 illustrates a schematic diagram of a special reminder flag in one embodiment.
In this embodiment, the first user can trigger the first terminal to display the meeting invitation information simply by performing the preset touch operation, so that the display efficiency of the meeting invitation information is improved.
In one embodiment, the meeting invitation information comprises a second application account number logged in to the second terminal; triggering transmission of response information specified by the response operation to the second terminal in response to the response operation occurring with respect to the meeting invitation information, including: responding to a response operation triggered by the meeting invitation information, and displaying a response information input area; acquiring meeting time input through a response information input area, and generating corresponding response information according to the meeting time; and sending the response information to the second terminal according to the second application account.
Specifically, the first terminal responds to a response operation triggered by the meeting invitation message, displays a response information input area, and extracts a second application account number logged in the second terminal from the meeting invitation message. Further, the first user can input meeting time in the response information input area, so that the first terminal can generate corresponding response information according to the meeting time input by the first user, and send the response information to the second terminal through the second application account. For example, the interactive application running in the first terminal may send the response information to the second terminal through the network according to the second application account.
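The response step above can be sketched as two small functions: one builds the response information from the meeting time entered in the response input area, and one routes it by the second application account. The field names and the dict-based "outbox" are illustrative assumptions, not the patent's actual protocol.

```python
# Illustrative sketch of generating response information and routing it by
# the second application account extracted from the meeting invitation.
def build_response(meeting_time, invitation):
    """Create response information addressed to the inviter's account."""
    return {
        "to_account": invitation["second_app_account"],
        "meeting_time": meeting_time,
    }

def send_over_network(response, outbox):
    """Deliver the response via the interactive application's network channel."""
    outbox.setdefault(response["to_account"], []).append(response)
```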
In one embodiment, referring to fig. 10, when the second terminal receives the response message, the second terminal may present a response reading reminding mark 1002 at the second user avatar in the application home page, and present a corresponding response message 1004 when it is determined that the second user clicks the response reading reminding mark. FIG. 10 is a diagram that illustrates the presentation of response information in one embodiment.
In this embodiment, the response information is sent to the second terminal according to the second application account, so that the second terminal can receive the response information returned by the first terminal in time, and the timeliness of the response information is improved.
In one embodiment, in response to a response operation occurring with respect to the meeting invitation information, triggering transmission of response information specified by the response operation to the second terminal includes: responding to a response operation triggered by the meeting invitation information, and displaying a response information input area; responding to a response input operation triggered in the response information input area, and displaying response information input through the response input operation; when sixth interactive operation occurs between the first terminal and the interactive object, response information is triggered to be transmitted to the interactive object; and the response information is transmitted to the interactive article and used for indicating the second terminal to display the response information when seventh interactive operation occurs between the second terminal and the interactive article.
Specifically, the first terminal presents the response information input area in response to a response operation triggered for the meeting invitation information, so that the first user can input response information, for example, input response time and leave a message, in the response information input area. Further, when it is determined that sixth interactive operation occurs between the first terminal and the interactive object, the first terminal can transmit the response information to the interactive object, so that when seventh interactive operation occurs between the second terminal and the interactive object, the interactive object can transmit the response information to the second terminal.
In one embodiment, referring to fig. 11, when the response message is generated, the first terminal may display the generated response message 1102 on the response message preview interface, and prompt the first user to transmit the response message to the interactive item through a sixth interactive operation through the third prompt message 1104. FIG. 11 is a diagram that illustrates an interface of a response message preview interface in one embodiment.
In the above embodiment, the response information is transmitted to the second terminal by means of the interactive article, so that the second terminal can still obtain the response information from the interactive article through interactive operation when the network fails.
In one embodiment, a first electronic tag is mounted in a first terminal, a second electronic tag is mounted in an interactive article, and a third electronic tag is mounted in a second terminal; when a first interactive operation occurs between a first terminal and an interactive object, triggering a video acquisition operation to obtain corresponding target video information, comprising: when a first electronic tag of a first terminal touches a second electronic tag of an interactive article for the first time, triggering video acquisition operation to obtain corresponding target video information; when a second interactive operation occurs between the first terminal and the interactive object, the target video information is triggered to be transmitted to the interactive object, and the method comprises the following steps: when the first electronic tag of the first terminal touches the second electronic tag of the interactive article again, transmitting the target video information to the interactive article; and the target video information transmitted to the interactive article is used for indicating the second terminal to play the target video corresponding to the target video information when the third electronic tag of the second terminal touches the second electronic tag of the interactive article.
Specifically, when the first user desires to acquire the target video, the first user can touch the first electronic tag carried in the first terminal against the second electronic tag carried in the interactive article to trigger the first terminal to display the interactive article interaction interface, so that the target video is acquired through the interactive article interaction interface. When the first user desires to upload the acquired target video to the interactive article, the first user can touch the first electronic tag carried in the first terminal against the second electronic tag of the interactive article again to establish near field communication, so that the first terminal uploads the target video to the interactive article through the near field communication. When the second user desires to view the target video uploaded by the first user, the second user can touch the third electronic tag carried in the second terminal against the second electronic tag of the interactive article to establish near field communication, so that the interactive article can transmit the target video to the second terminal through the near field communication according to the touch operation of the second terminal.
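The first terminal's touch sequence can be sketched as a tiny state machine: the first tag touch triggers video acquisition, and a later touch uploads the target video information to the article. The class and method names are illustrative; the real exchange would run over NFC rather than method calls.

```python
# Illustrative sketch of the first/second interactive operations on the
# first terminal's side of the tag-touch protocol.
class FirstTerminal:
    def __init__(self):
        self.touch_count = 0
        self.target_video_info = None

    def acquire_video(self):
        # Stand-in for the capture flow described in the embodiments above.
        return {"video_id": "v1", "uploader": "user_a"}

    def on_tag_touch(self, article):
        self.touch_count += 1
        if self.touch_count == 1:
            # First interactive operation: trigger video acquisition.
            self.target_video_info = self.acquire_video()
            return "acquired"
        # Second interactive operation: transmit the info to the article.
        article.store(self.target_video_info)
        return "uploaded"

class Article:
    def __init__(self):
        self.stored = None
    def store(self, info):
        self.stored = info
```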
In one embodiment, when a first user touches a first electronic tag carried in a first terminal with a second electronic tag carried in an interactive object, the first terminal can trigger a cloud server to perform video acquisition, obtain a target video and a target storage address, and send the target storage address to the first terminal. When the first user touches the first electronic tag carried in the first terminal with the second electronic tag carried in the interactive object again, the first terminal can send the target storage address to the interactive object. When a second user touches a third electronic tag carried in a second terminal to a second electronic tag of the interactive article, the interactive article can send the target storage address to the second terminal so as to trigger the second terminal to play the target video online.
In one embodiment, when a first user touches a first electronic tag carried in a first terminal with a second electronic tag carried in an interactive object, the first terminal can trigger a cloud server to perform video acquisition, so that a target video and a target storage address are obtained. When the first user touches the first electronic tag carried in the first terminal with the second electronic tag carried in the interactive object again, the first terminal can trigger the cloud server to send the target storage address or the target video to the interactive object. When a second user touches a third electronic tag carried in a second terminal to a second electronic tag of the interactive article, the interactive article can send a target storage address or a target video to the second terminal so as to trigger the second terminal to play the target video online or offline.
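The second terminal's choice between online and offline playback, depending on whether the article handed over a storage address or the full video, can be sketched as follows. The payload keys are assumptions for illustration.

```python
# Illustrative sketch: decide the playback mode from the payload received
# from the interactive article.
def choose_playback(payload):
    """Return how the target video should be played."""
    if "video_bytes" in payload:
        return "offline"            # full video received: play locally
    if "storage_address" in payload:
        return "online"             # only an address: stream from the cloud
    raise ValueError("payload carries neither a video nor an address")
```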
In one embodiment, the first electronic tag, the second electronic tag, and the third electronic tag may be NFC electronic chips.
In this embodiment, the interactive object can become a fast uploading channel of the target video, so that the uploaded target video has higher timeliness. Because the interactive object can become a rapid viewing channel of the target video, the second user can rapidly find the first user nearby in real life, and the communication negotiation between the first user and the second user is promoted.
In one embodiment, as shown in fig. 12, a video playing method based on an interactive article is provided, which is described by taking the method as an example for being applied to the second terminal in fig. 1, and includes the following steps:
step S1202, when a third interactive operation occurs between the second terminal and the interactive object, displaying a video playing interface.
Specifically, the interactive article can monitor interactive operations in real time, and when the interactive article determines that the third interactive operation occurs between the interactive article and the second terminal, the interactive article transmits the target video information to the second terminal to trigger the second terminal to display a video playing interface corresponding to the target video information, that is, a video playing page for playing the target video. In one embodiment, the target video information includes the target video, and when a plurality of target videos are stored in the interactive article, the interactive article can send the plurality of target videos to the second terminal at the same time, or can send only the target video most recently uploaded by the first terminal. The present embodiment is not limited thereto.
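The two delivery strategies mentioned above can be sketched with a single helper: send every stored target video, or only the most recently uploaded one. The field names are illustrative.

```python
# Illustrative sketch: choose which stored target videos to transmit to the
# second terminal on the third interactive operation.
def videos_to_send(stored, latest_only=False):
    if not stored:
        return []
    if latest_only:
        # Only the target video most recently uploaded by the first terminal.
        return [max(stored, key=lambda v: v["uploaded_at"])]
    return list(stored)
```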
Step S1204, responding to the video playing operation triggered in the video playing interface, and playing a corresponding target video based on the target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
Specifically, the second terminal may play the target video corresponding to the target video information through the video play interface in response to a video play operation triggered in the video play interface. For example, when the target video is expected to be played, the second user may click a video playing control in the video playing interface, and trigger the second terminal to play the target video corresponding to the target video information online or locally by clicking the video playing control. The target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
In one embodiment, when the second terminal receives a plurality of target videos, the second terminal may display a video list of the target videos, and the second user may select a target video to be played from the video list through a selection operation, so that the second terminal may play the target video to be played selected by the second user.
According to the above video playing method based on the interactive article, in response to the third interactive operation occurring between the second terminal and the interactive article, the video playing interface can be displayed, and when the video playing operation is triggered, the target video corresponding to the target video information is played. Because the second terminal only needs to pull the target video information from the interactive article through a simple interactive operation, the interactive article becomes a convenient viewing channel, and the video sharing efficiency is improved.
In one embodiment, when a third interactive operation occurs between the second terminal and the interactive object, the displaying the video playing interface includes: when a third interactive operation occurs between the second terminal and the interactive object, pulling target video information and comment information from the interactive object, and displaying a video playing interface corresponding to the target video information; responding to a video playing operation triggered in a video playing interface, and playing a corresponding target video based on target video information received from an interactive article, wherein the video playing operation comprises the following steps: and responding to the video playing operation triggered in the video playing interface, playing the target video corresponding to the target video information through the video playing interface, and displaying the comment information of the target video.
Specifically, when a third interactive operation occurs between the second terminal and the interactive object, the second terminal can pull target video information and comment information generated by commenting on the target video from the interactive object, and play the target video corresponding to the target video information and display the comment information of the target video through the video play interface when the video play operation is determined to be triggered.
In one embodiment, referring to fig. 13, when obtaining the target video information and the comment information of the target video, the second terminal may display the target video corresponding to the target video information through a video playing area 1302 in the video playing interface, and display the comment information of the target video in the video playing area 1302 in a bullet-screen (danmaku) manner. FIG. 13 illustrates a schematic diagram of a video playing interface in one embodiment.
In this embodiment, by displaying the comment information, the user's sense of participation can be greatly enhanced, and the user experience is improved.
In one embodiment, the video playing interface includes a message input area, and when viewing the target video, the second user can also post a comment on the target video. The publishing of the comment information includes: responding to a comment input operation triggered in the message input area, and displaying the input to-be-published comment; and responding to a comment publishing operation triggered in the message input area, and uploading the to-be-published comment to the interactive article when the eighth interactive operation occurs between the second terminal and the interactive article, so as to publish the to-be-published comment.
Specifically, the video playing interface further comprises a message input area for inputting comment information, and in the process of watching the target video, a second user can input comment information for the target video through the message input area, so that the second terminal displays the to-be-sent comment through comment input operation of the second user. When the second user desires to publish the input to-be-published comments, the second user can trigger comment publishing operation, for example, the second user can click the comment information publishing control, and when the eighth interaction operation occurs between the second terminal and the interactive object, the second terminal uploads the to-be-published comments to the interactive object so as to publish the to-be-published comments.
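This deferred publishing flow can be sketched as follows: the comment is drafted locally and only uploaded to the interactive article on the next tag touch (the eighth interactive operation). The class and method names are illustrative assumptions.

```python
# Illustrative sketch: a drafted comment is held on the second terminal and
# published to the interactive article on the next touch operation.
class SecondTerminal:
    def __init__(self):
        self.draft = None

    def input_comment(self, text):
        self.draft = text          # shown in the message input area

    def on_tag_touch(self, article, video_id):
        """Eighth interactive operation: upload the pending comment."""
        if self.draft is not None:
            article.publish_comment(video_id, self.draft)
            self.draft = None

class ArticleComments:
    def __init__(self):
        self.comments = {}         # video id -> list of published comments
    def publish_comment(self, video_id, text):
        self.comments.setdefault(video_id, []).append(text)
```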
In one embodiment, when the ninth interactive operation occurs between the third terminal and the interactive object, the interactive object can send the target video uploaded by the first terminal and the comment information published by the second terminal to the third terminal, so that the third terminal can display the comment information while playing the target video.
In one embodiment, the second user can also like the displayed comment information and transmit the like information to the interactive article through the second terminal.
In one embodiment, referring to fig. 13, the second terminal may display a message input area 1304 through the video playing interface and display a comment information posting control 1306 through the message input area.
In this embodiment, by publishing the to-be-published comment, other users can view the published comment information, thereby enabling communication between users and enhancing user participation.
In one embodiment, the step of transmitting the meeting invitation information includes: generating the meeting invitation information in response to a meeting invitation operation triggered through the video playing interface; transmitting the meeting invitation information to the interactive article when the fourth interactive operation occurs between the second terminal and the interactive article; and the meeting invitation information transmitted to the interactive article is used for inviting the first user corresponding to the first terminal to meet the second user corresponding to the second terminal.
Specifically, when the second user desires to meet the first user, the second user may trigger a meeting invitation operation, and the second terminal may generate meeting invitation information in response to the meeting invitation operation. For example, referring to fig. 13, the video playing interface further includes a meeting invitation control 1308, and when the second user clicks the meeting invitation control 1308, the second terminal may generate meeting invitation information correspondingly, and perform preview display through a meeting invitation information preview interface shown in fig. 14. FIG. 14 illustrates a meeting invitation information preview interface in an embodiment.
Further, when the second terminal determines that the fourth interactive operation occurs with the interactive article, for example, when the third electronic tag in the second terminal touches the second electronic tag in the interactive article, the second terminal may send the generated meeting invitation information to the interactive article, so that when the fifth interactive operation occurs between the first terminal and the interactive article, the meeting invitation information in the interactive article is transmitted to the first terminal, thereby inviting the first user corresponding to the first terminal to meet the second user corresponding to the second terminal.
In this embodiment, by generating the meeting invitation information, the first user corresponding to the first terminal can be invited to meet through the meeting invitation information, which increases the interactivity among users and the entertainment value of the interactive application.
In one embodiment, the target video information includes a target video, the target video is obtained by recording a game screen of a target game application, and the target video includes a first game account corresponding to the first user. When the second terminal plays the target video, a second user corresponding to the second terminal may also initiate a team formation request to the first user corresponding to the first terminal. The step of performing team formation based on the team formation request further includes: in response to a team formation request operation generated in the video playing interface, initiating a team formation request to the first terminal based on the first game account; the team formation request is used for instructing the first terminal, when it enters the target game application based on the first game account, to perform team formation interaction with a second game account logged in on the second terminal.
Specifically, when the target application is a target game application, the second terminal may play a target video uploaded by the first terminal and acquired for a game process of the first user, so that the second user may determine whether to perform team interaction with the first user through the viewed target video. When a second user desires to perform team formation interaction with a first user, the second user may trigger a request for a team formation operation through the video playing interface, for example, the second user may click a team formation control in the video playing interface, so that the second terminal, in response to the request for the team formation operation, extracts a first game account corresponding to the first user from the target video, and generates a team formation request according to the first game account. The first game account refers to an account used by the first user to log in the target game application.
When the team formation request is generated, the second terminal can send the team formation request to the first terminal according to the first game account, so that when the first user enters the target game application through the first game account, team formation interaction can be performed with the second game account corresponding to the second user according to the team formation request.
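The team formation flow above can be sketched as a small helper that derives the request from the target video's metadata. The field names (`first_game_account`, `game_app`) are illustrative assumptions about how the account is carried in the video information:

```python
def build_team_request(target_video_meta: dict, second_game_account: str) -> dict:
    """Sketch: derive a team-formation request from the target video's metadata.

    Assumes the target video information carries the first user's game account,
    as described in the disclosure; field names are hypothetical.
    """
    first_account = target_video_meta["first_game_account"]
    return {
        "to": first_account,                 # routed via the first game account
        "from": second_game_account,
        "game_app": target_video_meta["game_app"],
        "action": "team_up",                 # "battle" would model the battle variant
    }
```

The battle request described below would differ only in the `action` field.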
In one embodiment, the second user can trigger a battle request operation through the video playing interface, so that the second terminal can, in response to the battle request operation, generate a battle request and send the battle request to the first terminal. The battle request is used for instructing the first terminal, when it enters the target game application based on the first game account, to perform battle interaction with the second game account logged in on the second terminal.
In this embodiment, by generating the team formation request and performing team formation interaction based on the team formation request, the interactivity of the interactive application can be greatly increased, thereby promoting game interaction for users of the interactive application.
In one embodiment, an interactive article is provided, and a communication module is deployed in the interactive article, wherein the communication module is used for triggering execution of a video acquisition operation when a first interactive operation occurs between a first terminal and the interactive article; the communication module is also used for receiving target video information generated based on video acquisition operation when a second interactive operation is carried out between the first terminal and the interactive object; the communication module is also used for transmitting the target video information to the second terminal when third interactive operation is carried out between the second terminal and the interactive object; the transmitted target video information is used for indicating the second terminal to play the target video corresponding to the target video information.
Specifically, a communication module is deployed in the interactive article, and near field communication can be triggered through the communication module. In a specific embodiment, the communication module may specifically be an NFC electronic chip. When a first interactive operation occurs between the first terminal and the interactive article, the interactive article can trigger a video acquisition operation through the communication module. For example, the interactive article may trigger the first terminal to perform video acquisition, or the interactive article may trigger the first terminal to generate a video acquisition instruction and send the video acquisition instruction to the cloud server. When a second interactive operation occurs between the first terminal and the interactive article, the interactive article can receive, through the communication module, target video information generated based on the video acquisition operation. For example, when a second interactive operation occurs between the first terminal and the interactive article, the interactive article may receive a target video sent by the first terminal, or receive a target storage address sent by the cloud server. When a third interactive operation occurs between the second terminal and the interactive article, the interactive article can transmit the target video information to the second terminal through the communication module, so that the second terminal plays the target video corresponding to the target video information. For example, when the third interactive operation occurs, the interactive article may send the target video or the target storage address to the second terminal through the communication module, so that the second terminal plays the target video online or locally.
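The three roles of the communication module described above can be sketched as an event handler per interactive operation. This is a hedged illustration of the flow only; the class name, method names, and the assumption that the first terminal exposes a `start_capture` entry point are all hypothetical:

```python
class CommunicationModule:
    """Sketch of the NFC communication module's three roles (names are assumptions)."""

    def __init__(self):
        self.target_video_info = None

    def on_first_interaction(self, first_terminal):
        # First touch: trigger the first terminal to start video capture.
        return first_terminal.start_capture()

    def on_second_interaction(self, video_info):
        # Second touch: store the target video info (a video, or a cloud storage address).
        self.target_video_info = video_info

    def on_third_interaction(self):
        # Second terminal's touch: hand over the stored target video info for playback.
        return self.target_video_info
```

Note that `video_info` may be either the video itself or a target storage address, matching the two transmission variants in the paragraph above.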
In a specific embodiment, as shown in fig. 15, the video sharing method based on interactive articles provided by the present application includes the following steps:
S1502, displaying an application home page of the interactive application logged in with a first application account; the application home page displays a bound article bound with the first application account.
S1504, when a first terminal running the interactive application logged in with the first application account performs a first interactive operation with the interactive article, determining whether the interactive article belongs to the bound article, and when the interactive article belongs to the bound article, displaying an interactive article interaction interface through the interactive application.
And S1506, responding to the acquisition starting operation triggered on the interactive article interactive interface, displaying a video acquisition setting interface, and displaying a video template set through the video acquisition setting interface.
S1508, in response to a selection operation for the video template set, determining at least one target video template selected by the selection operation; the target video template includes a video display effect and a video interception mode.
And S1510, responding to the starting operation aiming at the target game application triggered on the video acquisition setting interface, starting the target game application, displaying a corresponding game picture through the target game application, and performing video recording on the displayed game picture to obtain an acquired video.
S1512, performing video interception processing on the acquired video in a video interception mode to obtain a video clip, and performing editing processing on a video frame in the video clip through a video display effect to obtain target video information.
S1514, when a second interactive operation occurs between the first terminal and the interactive object, triggering to transmit the target video information to the interactive object; and the target video information is transmitted to the interactive object and used for indicating the second terminal to play the target video corresponding to the target video information when third interactive operation occurs between the second terminal and the interactive object.
S1516, when a fifth interactive operation occurs between the first terminal and the interactive article, displaying a special reminding mark in the application home page of the interactive application; in response to an information reading operation for the special reminding mark, displaying meeting invitation information corresponding to the special reminding mark. The meeting invitation information is generated by the second terminal in response to a meeting invitation operation and is transmitted to the interactive article when a fourth interactive operation occurs between the second terminal and the interactive article; the meeting invitation information is used for requesting a meeting with the uploader of the target video.
S1518, the meeting invitation information includes a second application account logged in the second terminal; and presenting the response information input area in response to a response operation triggered by the meeting invitation information.
And S1520, acquiring the meeting time input through the response information input area, generating corresponding response information according to the meeting time, and sending the response information to the second terminal according to the second application account.
According to the video sharing method based on the interactive article, the first interactive operation generated between the first terminal and the interactive article is responded, the video acquisition operation can be triggered based on the first interactive operation, and the video clip to be acquired is automatically acquired according to the video acquisition operation, so that the target video information is obtained. The target video information can be triggered to be transmitted to the interactive object by responding to the second interactive operation between the first terminal and the interactive object, so that the second terminal can play the target video corresponding to the target video information through the third interactive operation between the second terminal and the interactive object, and the target video can be shared. Because the first terminal only needs to upload the automatically acquired target video to the interactive object through simple interactive operation, the interactive object becomes a quick uploading channel of the target video information, the uploading efficiency of uploading the video is improved, and the sharing efficiency of sharing the video is improved. Because the second terminal also only needs to pull the target video information from the interactive object through simple interactive operation, the interactive object becomes a convenient viewing channel, and the video sharing efficiency is further improved.
In a specific embodiment, as shown in fig. 16, the method for playing a video based on an interactive article provided by the present application includes the following steps:
and S1602, when a third interactive operation occurs between the second terminal and the interactive object, pulling the target video information and the comment information from the interactive object, and displaying a video playing interface corresponding to the target video information.
And S1604, responding to the video playing operation triggered in the video playing interface, playing the target video corresponding to the target video information through the video playing interface, and displaying the comment information of the target video.
S1606, in response to a video playing operation triggered in the video playing interface, playing a target video received from the interactive article; the target video is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
S1608, the video playing interface includes a message input area; in response to a comment input operation triggered in the message input area, displaying the input to-be-published comment.
And S1610, responding to the comment publishing operation triggered by the message input area, and uploading the to-be-published comment to the interactive object when the eighth interactive operation occurs between the second terminal and the interactive object, so as to publish the to-be-published comment.
S1612, generating meeting invitation information in response to the meeting invitation operation triggered by the video play interface.
S1614, when the fourth interactive operation is carried out between the second terminal and the interactive article, the meeting invitation information is transmitted to the interactive article; and the meeting invitation information transmitted to the interactive article is used for inviting a first user corresponding to the first terminal to meet a second user corresponding to the second terminal.
S1616, recording a game picture of the target game application to obtain a target video, wherein the target video comprises a first game account corresponding to a first user, responding to a request for team formation operation in a video playing interface, and initiating a team formation request to the first terminal based on the first game account; the team formation request is used for indicating the first terminal to perform team formation interaction with a second game account logged in the second terminal when the first terminal enters the target game application based on the first game account.
According to the video playing method based on the interactive article, by responding to the third interactive operation between the second terminal and the interactive article, the video playing interface can be displayed through the third interactive operation, so that when a video playing operation is triggered, the target video information received from the interactive article can be played through the video playing operation. Because the second terminal only needs to pull the target video information from the interactive article through a simple interactive operation, the interactive article becomes a convenient viewing channel, and the video sharing efficiency is improved.
It should be understood that although the steps in the flowcharts of fig. 2, 12 and 15-16 are shown in sequence as indicated by the arrows, the steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in fig. 2, 12 and 15-16 may include multiple sub-steps or multiple stages, which are not necessarily completed at the same moment but may be performed at different moments, and are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
The application further provides an application scene, and the application scene applies the interactive article-based video sharing method. Specifically, the video sharing method based on the interactive articles is applied to the application scene as follows:
referring to fig. 17, the first user may touch the first electronic tag of the first terminal with the second electronic tag of the bound interactive object to prompt the first terminal to automatically record a game screen of the target game application, so as to obtain a target video. When the target video is obtained, the first user can touch the first electronic tag carried by the first terminal and the second electronic tag carried by the interactive object again, so that the first terminal can upload the target video to the bound interactive object through near field communication.
When a second user passes by the interactive article of the first user, the second user can touch the third electronic tag of the second terminal against the second electronic tag of the interactive article, so that the interactive article can send the target video uploaded by the first user to the second terminal through near field communication, prompting the second terminal to play the target video. When the first user is near the second user, the second user can communicate face to face with the first user while watching the target video, which greatly increases the interactivity between users. FIG. 17 illustrates a scene diagram for face-to-face communication based on a target video in one embodiment.
The application further provides an application scenario applying the video playing method based on the interactive object. Specifically, the application of the video playing method based on the interactive object in the application scene is as follows:
referring to fig. 18, the second user may touch the third electronic tag of the second terminal with the second electronic tag of the interactive item to obtain and view the target video uploaded by the first user. The second user can leave a message of praise for the played target video through the video playing interface and initiate the meeting invitation to the first user while watching the target video, so that the second terminal can respond to the meeting invitation operation triggered by the second user, generate meeting invitation information and send the meeting invitation information to the interactive article. The sent meeting invitation information is used for inviting the first user to carry out meeting communication with the second user. FIG. 18 is a diagram illustrating a scene in which a target video is played in one embodiment.
The application further provides an application scenario applying the video playing method based on the interactive object. Specifically, the application of the video playing method based on the interactive object in the application scene is as follows:
for convenience of description, a target video acquired by a first terminal is referred to as a first target video, and a target video acquired by a second terminal is referred to as a second target video; the interactive article bound with the first application account is called a first interactive article, and the interactive article bound with the second application account is called a second interactive article. Referring to fig. 19, when the second user watches the first target video uploaded by the first user and desires to perform meeting communication with the first user, but the first user has not accepted the meeting invitation information, the second user may enter the target game application and perform video recording on the game process of the second user through the second terminal to obtain the second target video. The second user can determine a second interactive object bound with the second application account, touch a third electronic tag carried by the second terminal with a fourth electronic tag carried by the second interactive object, and upload a second target video to the second interactive object through touch operation. When the first user passes through the second interactive object, the first user can touch the first electronic tag carried by the first terminal and the fourth electronic tag carried by the second interactive object so as to play the second target video. FIG. 19 is a diagram illustrating a scene in which a target video is played in one embodiment.
In one embodiment, as shown in fig. 20, there is provided an interactive article-based video sharing apparatus 2000, which may be a part of a computer device using a software module or a hardware module, or a combination of the two, and the apparatus specifically includes: a video capture module 2002 and a video transmission module 2004, wherein:
the video capture module 2002 is configured to trigger a video capture operation to obtain corresponding target video information when a first interaction operation occurs between the first terminal and the interactive object.
The video transmission module 2004 is configured to trigger transmission of the target video information to the interactive object when a second interactive operation occurs between the first terminal and the interactive object; and the target video information is transmitted to the interactive object and used for indicating the second terminal to play the target video corresponding to the target video information when third interactive operation occurs between the second terminal and the interactive object.
In one embodiment, the video capture module 2002 is further configured to trigger the cloud server to perform a video capture operation through the first terminal when a first interaction operation occurs between the first terminal and the interactive item, so as to obtain corresponding target video information; when a second interactive operation occurs between the first terminal and the interactive object, the cloud server is triggered by the first terminal to transmit the target video information to the interactive object.
In one embodiment, the video capture module 2002 is further configured to display an interactive article interaction interface when a first interaction operation occurs between the first terminal and the interactive article; and responding to the video acquisition operation triggered on the interactive object interactive interface, and acquiring the corresponding target video information by video acquisition.
In one embodiment, the interactive article based video sharing apparatus 2000 further comprises an interactive interface display module 2006 further configured to display an application home page of the interactive application logged in with the first application account; the application home page displays a bound article bound with the first application account; when a first terminal running an interactive application logged in by a first application account and an interactive article are subjected to first interactive operation, determining whether the interactive article belongs to a bound article; and when the interactive object belongs to the bound object, displaying the interactive object interactive interface through the interactive application.
In one embodiment, the video capture module 2002 further comprises a template selection module 2021, configured to display a video capture setting interface in response to a capture start operation triggered on the interactive article interaction interface, and display a video template set through the video capture setting interface; in response to a selection operation for the video template set, determining at least one target video template selected by the selection operation; and responding to the video acquisition operation triggered by the video acquisition setting interface, and acquiring the video based on the target video template to obtain the target video information.
In one embodiment, the target video information comprises a target video, and the target video template comprises a video display effect and a video interception mode; the template selection module 2021 is further configured to perform video acquisition to obtain an acquired video in response to a video acquisition operation triggered through the video acquisition setting interface; perform video interception processing on the acquired video in the video interception mode to obtain a video clip; and edit the video frames in the video clip through the video display effect to obtain the target video to be shared.
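The template-driven capture pipeline above — intercept a clip per the video interception mode, then apply the video display effect per frame — can be sketched as follows. The template's structure (`clip_range`, `display_effect`) is an illustrative assumption:

```python
def make_target_video(captured_frames, template):
    """Sketch: apply a target video template to captured frames.

    The template structure is hypothetical: 'clip_range' models the video
    interception mode, 'display_effect' models a per-frame display effect.
    """
    start, end = template["clip_range"]       # video interception mode
    clip = captured_frames[start:end]         # intercept a video clip
    effect = template["display_effect"]       # e.g. a per-frame filter
    return [effect(frame) for frame in clip]  # edited target video frames
```

With a real capture source, `captured_frames` would be decoded frames of the recorded game screen rather than plain values.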
In one embodiment, the video capture module 2002 is further configured to start the target game application in response to a start operation for the target game application triggered on the video capture setting interface, and display a corresponding game screen through the target game application; and carrying out video recording on the displayed game picture to obtain a collected video.
In one embodiment, the video sharing apparatus 2000 based on an interactive article further includes a meeting receiving module 2008, configured to display meeting invitation information transmitted by the interactive article when a fifth interaction operation occurs between the first terminal and the interactive article; the meeting invitation information is generated by the second terminal responding to the meeting invitation operation and is transmitted to the interactive article when the fourth interactive operation occurs between the second terminal and the interactive article; the meeting invitation information is used for requesting to meet with the uploading party of the target video; and triggering the response information specified by the response operation to be transmitted to the second terminal in response to the response operation occurring aiming at the meeting invitation information.
In one embodiment, the meeting receiving module 2008 is further configured to display a special reminding mark in an application home page of the interactive application when a fifth interactive operation occurs between the first terminal and the interactive article; and displaying the meeting invitation information corresponding to the special reminding mark in response to the information reading operation aiming at the special reminding mark.
In one embodiment, the application home page shows a first user head portrait corresponding to a first application account logged in to a first terminal; the meeting receiving module 2008 is further configured to display a special reminding mark at a position corresponding to the first user avatar; when a preset touch operation aiming at the head portrait of the first user occurs, displaying meeting invitation information corresponding to the special reminding mark; the meeting invitation information comprises a second user head portrait corresponding to a second application account logged in the second terminal.
In one embodiment, the meeting invitation information comprises a second application account number logged in to the second terminal; the meeting receiving module 2008 is further configured to display a response information input area in response to a response operation triggered by the meeting invitation information; acquiring meeting time input through a response information input area, and generating corresponding response information according to the meeting time; and sending the response information to the second terminal according to the second application account.
In one embodiment, the meeting receiving module 2008 is further configured to present a response information input area in response to a response operation triggered by the meeting invitation information; responding to a response input operation triggered in the response information input area, and displaying response information input through the response input operation; when sixth interactive operation occurs between the first terminal and the interactive object, response information is triggered to be transmitted to the interactive object; and the response information is transmitted to the interactive article and used for indicating the second terminal to display the response information when seventh interactive operation occurs between the second terminal and the interactive article.
In one embodiment, a first electronic tag is mounted in a first terminal, a second electronic tag is mounted in an interactive article, and a third electronic tag is mounted in a second terminal; the video sharing device 2000 based on the interactive article is further configured to display an interactive article interaction interface when the first electronic tag of the first terminal first touches the second electronic tag of the interactive article; when the first electronic tag of the first terminal touches the second electronic tag of the interactive article again, transmitting the target video to the interactive article; and the target video transmitted to the interactive article is used for indicating the second terminal to play the target video when the third electronic tag of the second terminal touches the second electronic tag of the interactive article.
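The tag-touch behavior in this embodiment — first touch of the first terminal shows the interaction interface, a repeat touch uploads the target video, and a touch by the second terminal triggers playback — can be sketched as a small dispatcher. All names here are illustrative assumptions:

```python
class TagTouchDispatcher:
    """Sketch of dispatching on which electronic tag touches the article's
    second electronic tag (tag identifiers are hypothetical)."""

    def __init__(self):
        self.first_touch_seen = False
        self.stored_video = None

    def on_touch(self, tag_id, video=None):
        if tag_id == "first_terminal_tag":
            if not self.first_touch_seen:
                # First touch: display the interactive article interaction interface.
                self.first_touch_seen = True
                return "show_interaction_interface"
            # Repeat touch: upload the target video to the article.
            self.stored_video = video
            return "video_uploaded"
        if tag_id == "second_terminal_tag":
            # Second terminal's touch: instruct it to play the stored video.
            return ("play_video", self.stored_video)
        return "ignored"
```

In practice the first/second distinction could instead be derived from whether a capture session is already in progress, but the dispatch shape stays the same.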
In one embodiment, as shown in fig. 21, there is provided an interactive article-based video playing apparatus 2100, which may be a part of a computer device using software modules or hardware modules, or a combination of the two, and specifically includes: a play interface display module 2102 and a video play module 2104, wherein:
a playing interface display module 2102, configured to display a video playing interface when a third interactive operation occurs between the second terminal and the interactive object;
the video playing module 2104 is configured to, in response to a video playing operation triggered in the video playing interface, play a corresponding target video based on target video information received from the interactive object; the target video information is obtained by video acquisition after a first interactive operation is carried out between the first terminal and the interactive object, and is transmitted to the interactive object when a second interactive operation is carried out between the first terminal and the interactive object.
In one embodiment, the playing interface displaying module 2102 is further configured to pull the target video information and the comment information from the interactive object and display a video playing interface corresponding to the target video information when a third interactive operation occurs between the second terminal and the interactive object.
In one embodiment, the video playing module 2104 is further configured to play, in response to a video playing operation triggered in the video playing interface, a target video corresponding to the target video information through the video playing interface, and display comment information of the target video.
In one embodiment, the interactive article based video playing apparatus 2100 further includes a comment posting module 2106 for presenting the input pending posting comment in response to a comment input operation triggered in the message input area; and responding to comment publishing operation triggered by the message input area, and uploading the to-be-published comments to the interactive articles when eighth interactive operation occurs between the second terminal and the interactive articles so as to publish the to-be-published comments.
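The two-stage comment flow above — a comment is staged locally on input, then uploaded to the interactive article when the eighth interactive operation occurs — can be sketched like this. The class and method names are assumptions for illustration:

```python
class CommentBuffer:
    """Sketch: stage a to-be-published comment locally, then publish it to the
    interactive article on the eighth interactive operation (names assumed)."""

    def __init__(self):
        self.pending = None
        self.published = []

    def on_comment_input(self, text):
        # Comment input operation: display/stage the to-be-published comment.
        self.pending = text

    def on_eighth_interaction(self):
        # Second terminal touches the article: upload the staged comment.
        if self.pending is not None:
            self.published.append(self.pending)
            self.pending = None
        return list(self.published)
```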
In one embodiment, the video playing apparatus 2100 based on the interactive article further includes a meeting invitation module 2108, configured to generate meeting invitation information in response to a meeting invitation operation triggered through the video playing interface; when a fourth interactive operation occurs between the second terminal and the interactive article, transmit the meeting invitation information to the interactive article; the meeting invitation information transmitted to the interactive article is used for inviting a first user corresponding to the first terminal to meet a second user corresponding to the second terminal.
In one embodiment, the target video information includes a target video, the target video is obtained by recording a game screen of a target game application, and the target video includes a first game account corresponding to the first user. The interactive article-based video playing apparatus 2100 is further configured to, in response to a team formation request operation in the video playing interface, initiate a team formation request to the first terminal based on the first game account; the team formation request is used for instructing the first terminal, when it enters the target game application based on the first game account, to perform team formation interaction with a second game account logged in on the second terminal.
For specific limitations of the video sharing apparatus based on interactive articles and the video playing apparatus based on interactive articles, reference may be made to the above limitations of the video sharing method based on interactive articles and the video playing method based on interactive articles, which are not repeated here. All or some of the modules in the video sharing apparatus based on interactive articles and the video playing apparatus based on interactive articles can be implemented through software, hardware, or a combination thereof. The modules can be embedded in, or independent of, a processor in the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call and execute the operations corresponding to the modules.
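To make the division of labor among the apparatus modules concrete, the following is a minimal, purely illustrative sketch (not part of the patent disclosure) of how the capture/transmit/play flow driven by the three interactive operations could be modeled in software. All class and method names here are assumptions invented for illustration; the patent does not prescribe any implementation.

```python
# Hypothetical sketch of the interaction flow described above.
# First touch  -> first interactive operation  -> capture target video info
# Second touch -> second interactive operation -> transmit info to the article
# Viewer touch -> third interactive operation  -> play the stored video

class InteractiveArticle:
    """Models the tag-equipped article that relays target video information."""
    def __init__(self):
        self.stored_video_info = None  # populated by the sharing terminal

class FirstTerminal:
    """Sharing side: captures on the first touch, transmits on the second."""
    def __init__(self):
        self.touch_count = 0
        self.captured_info = None

    def touch(self, article, capture_fn):
        self.touch_count += 1
        if self.touch_count == 1:
            # first interactive operation: trigger video acquisition
            self.captured_info = capture_fn()
        else:
            # second interactive operation: transmit info to the article
            article.stored_video_info = self.captured_info

class SecondTerminal:
    """Playing side: pulls the video info from the article on its own touch."""
    def touch(self, article):
        # third interactive operation: the returned info would be handed
        # to a video playing interface for playback
        return article.stored_video_info

# usage: share a hypothetical game clip through the article
article = InteractiveArticle()
sharer, viewer = FirstTerminal(), SecondTerminal()
sharer.touch(article, capture_fn=lambda: {"video": "game_clip.mp4"})
sharer.touch(article, capture_fn=lambda: None)  # second touch only transmits
played = viewer.touch(article)
```

The sketch only demonstrates the touch-driven state transitions; real embodiments would involve NFC tag reads, a cloud server, and an actual player, as the specification describes.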
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure may be as shown in fig. 22. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected through a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication can be implemented through Wi-Fi, an operator network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements the video sharing method based on interactive articles and the video playing method based on interactive articles. The display screen of the computer device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device can be a touch layer covering the display screen, a key, a trackball, or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad, or mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 22 is merely a block diagram of some of the structures related to the disclosed aspects and does not limit the computer device to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, or combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, in which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the steps in the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the above methods. Any reference to the memory, storage, database, or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. The volatile memory can include Random Access Memory (RAM) or an external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to be within the scope of this specification.
The above embodiments merely express several implementations of the present application, and the descriptions thereof are relatively specific and detailed, but they should not be construed as limiting the scope of the invention patent. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (20)

1. A video sharing method based on interactive articles is characterized by comprising the following steps:
when a first interactive operation occurs between a first terminal and an interactive article, triggering a video acquisition operation to obtain corresponding target video information;
when a second interactive operation occurs between the first terminal and the interactive article, triggering transmission of the target video information to the interactive article; wherein the target video information transmitted to the interactive article is used for indicating a second terminal to play a target video corresponding to the target video information when a third interactive operation occurs between the second terminal and the interactive article.
2. The method according to claim 1, wherein the triggering, when the first interactive operation occurs between the first terminal and the interactive article, the video acquisition operation to obtain the corresponding target video information comprises:
when the first interactive operation occurs between the first terminal and the interactive article, triggering, through the first terminal, a cloud server to execute the video acquisition operation to obtain the corresponding target video information;
the triggering, when the second interactive operation occurs between the first terminal and the interactive article, the transmission of the target video information to the interactive article comprises:
when the second interactive operation occurs between the first terminal and the interactive article, triggering, through the first terminal, the cloud server to transmit the target video information to the interactive article.
3. The method according to claim 1, wherein the method is executed by a first terminal, and when a first interaction operation occurs between the first terminal and an interactive article, a video capture operation is triggered to obtain corresponding target video information, including:
when the first interactive operation occurs between the first terminal and the interactive article, displaying an interactive article interactive interface;
and in response to a video acquisition operation triggered on the interactive article interactive interface, performing video acquisition to obtain the corresponding target video information.
4. The method according to claim 3, wherein the displaying the interactive article interactive interface when the first interactive operation occurs between the first terminal and the interactive article comprises:
displaying an application home page of an interactive application logged in with a first application account; wherein the application home page displays a bound article bound to the first application account;
when the first interactive operation occurs between the first terminal running the interactive application logged in with the first application account and the interactive article, determining whether the interactive article belongs to the bound article;
and when the interactive article belongs to the bound article, displaying the interactive article interactive interface through the interactive application.
5. The method according to claim 3, wherein the performing video acquisition to obtain the corresponding target video information in response to the video acquisition operation triggered on the interactive article interactive interface comprises:
responding to the acquisition starting operation triggered on the interactive article interactive interface, displaying a video acquisition setting interface, and displaying a video template set through the video acquisition setting interface;
in response to a selection operation for the video template set, determining at least one target video template selected by the selection operation;
and responding to the video acquisition operation triggered by the video acquisition setting interface, and acquiring the video based on the target video template to obtain target video information.
6. The method of claim 5, wherein the target video information comprises a target video, and the target video template comprises a video display effect and a video interception mode; the performing, in response to the video acquisition operation triggered through the video acquisition setting interface, video acquisition based on the target video template to obtain the target video information comprises:
responding to a video acquisition operation triggered by the video acquisition setting interface, and acquiring a video to obtain an acquired video;
performing video interception processing on the acquired video in the video interception mode to obtain a video clip;
and editing the video frames in the video clips through the video display effect to obtain a target video to be shared.
7. The method of claim 6, wherein the performing a video capture to obtain a captured video in response to a video capture operation triggered via the video capture settings interface comprises:
responding to starting operation aiming at the target game application triggered on the video acquisition setting interface, starting the target game application, and displaying a corresponding game picture through the target game application;
and carrying out video recording on the displayed game picture to obtain a collected video.
8. The method of claim 1, further comprising:
when a fifth interactive operation occurs between the first terminal and the interactive article, displaying meeting invitation information transmitted by the interactive article; the meeting invitation information is generated by the second terminal responding to the meeting invitation operation and is transmitted to the interactive article when a fourth interactive operation occurs between the second terminal and the interactive article; the meeting invitation information is used for requesting to meet with the uploading party of the target video;
and responding to a response operation which occurs aiming at the meeting invitation information, and triggering to transmit response information which is specified by the response operation to the second terminal.
9. The method according to claim 8, wherein the displaying the meeting invitation information transmitted by the interactive article when the fifth interactive operation occurs between the first terminal and the interactive article comprises:
when the fifth interactive operation occurs between the first terminal and the interactive article, displaying a special reminding mark in an application home page of the interactive application;
and responding to the information reading operation aiming at the special reminding mark, and displaying the meeting invitation information corresponding to the special reminding mark.
10. The method according to claim 9, wherein the application home page displays a first user avatar corresponding to a first application account logged in to the first terminal; the displaying of the special reminding mark in the application home page of the interactive application comprises the following steps:
displaying the special reminding mark at a position corresponding to the first user avatar;
the displaying of the meeting invitation information corresponding to the special reminding mark in response to the information reading operation aiming at the special reminding mark comprises:
when a preset touch operation aiming at the head portrait of the first user occurs, displaying meeting invitation information corresponding to the special reminding mark; and the meeting invitation information comprises a second user head portrait corresponding to a second application account logged in the second terminal.
11. The method of claim 8, wherein the meeting invitation information includes a second application account number logged into the second terminal; the triggering, in response to a response operation occurring with respect to the meeting invitation information, transmission of response information specified by the response operation to the second terminal includes:
responding to a response operation triggered by the meeting invitation information, and displaying a response information input area;
acquiring meeting time input through the response information input area, and generating corresponding response information according to the meeting time;
and sending the response information to the second terminal according to the second application account.
12. The method of claim 8, wherein the triggering, in response to the response operation occurring with respect to the meeting invitation information, transmission of response information specified by the response operation to the second terminal comprises:
responding to a response operation triggered by the meeting invitation information, and displaying a response information input area;
responding to a response input operation triggered in the response information input area, and displaying response information input through the response input operation;
when a sixth interactive operation occurs between the first terminal and the interactive article, triggering transmission of the response information to the interactive article; wherein the response information transmitted to the interactive article is used for indicating the second terminal to display the response information when a seventh interactive operation occurs between the second terminal and the interactive article.
13. The method according to any one of claims 1 to 12, wherein the first terminal is equipped with a first electronic tag, the interactive article is equipped with a second electronic tag, and the second terminal is equipped with a third electronic tag;
the triggering, when the first interactive operation occurs between the first terminal and the interactive article, the video acquisition operation to obtain the corresponding target video information comprises:
when a first electronic tag of the first terminal touches a second electronic tag of the interactive article for the first time, triggering video acquisition operation to obtain corresponding target video information;
the triggering, when the second interactive operation occurs between the first terminal and the interactive article, the transmission of the target video information to the interactive article comprises:
when the first electronic tag of the first terminal touches the second electronic tag of the interactive article again, transmitting the target video information to the interactive article;
and the target video information transmitted to the interactive article is used for indicating the second terminal to play the target video corresponding to the target video information when the third electronic tag of the second terminal touches the second electronic tag of the interactive article.
14. A video playing method based on an interactive article is characterized by being applied to a second terminal, and the method comprises the following steps:
when a third interactive operation occurs between the second terminal and an interactive article, displaying a video playing interface;
and in response to a video playing operation triggered in the video playing interface, playing a corresponding target video based on target video information received from the interactive article; wherein the target video information is obtained through video acquisition after a first interactive operation occurs between a first terminal and the interactive article, and is transmitted to the interactive article when a second interactive operation occurs between the first terminal and the interactive article.
15. The method according to claim 14, wherein the displaying the video playing interface when the third interactive operation occurs between the second terminal and the interactive article comprises:
when the third interactive operation occurs between the second terminal and the interactive article, pulling target video information and comment information from the interactive article, and displaying a video playing interface corresponding to the target video information;
the playing, in response to the video playing operation triggered in the video playing interface, the corresponding target video based on the target video information received from the interactive article comprises:
and responding to a video playing operation triggered in the video playing interface, playing a target video corresponding to the target video information through the video playing interface, and displaying comment information of the target video.
16. The method of claim 14, wherein the video playback interface includes a message entry area, the method further comprising:
in response to a comment input operation triggered in the message input area, displaying an input to-be-published comment;
and in response to a comment publishing operation triggered in the message input area, uploading the to-be-published comment to the interactive article when an eighth interactive operation occurs between the second terminal and the interactive article, so as to publish the to-be-published comment.
17. The method of claim 14, further comprising:
generating meeting invitation information in response to meeting invitation operation triggered by the video playing interface;
when a fourth interactive operation occurs between the second terminal and the interactive article, transmitting the meeting invitation information to the interactive article;
and the meeting invitation information transmitted to the interactive article is used for inviting a first user corresponding to the first terminal to meet a second user corresponding to the second terminal.
18. The method according to any one of claims 14 to 17, wherein the target video information includes a target video, the target video is a video obtained by recording a game screen of a target game application, and the target video includes a first game account corresponding to a first user, and the method further includes:
responding to a request for grouping operation in the video playing interface, and initiating a grouping request to the first terminal based on the first game account; the team formation request is used for indicating the first terminal to perform team formation interaction with a second game account logged in the second terminal when entering a target game application based on the first game account.
19. An interactive article, wherein a communication module is deployed in the interactive article, wherein:
the communication module is used for triggering execution of a video acquisition operation when a first interactive operation occurs between a first terminal and the interactive article;
the communication module is further used for receiving target video information generated based on the video acquisition operation when a second interactive operation occurs between the first terminal and the interactive article;
the communication module is further used for transmitting the target video information to a second terminal when a third interactive operation occurs between the second terminal and the interactive article; and the transmitted target video information is used for indicating the second terminal to play a target video corresponding to the target video information.
20. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 18.
CN202110406610.1A 2021-04-15 2021-04-15 Video sharing method based on interactive article and interactive article Active CN113179445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110406610.1A CN113179445B (en) 2021-04-15 Video sharing method based on interactive article and interactive article

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110406610.1A CN113179445B (en) 2021-04-15 Video sharing method based on interactive article and interactive article

Publications (2)

Publication Number Publication Date
CN113179445A true CN113179445A (en) 2021-07-27
CN113179445B CN113179445B (en) 2023-07-14

Family

ID=76925031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110406610.1A Active CN113179445B (en) 2021-04-15 Video sharing method based on interactive article and interactive article

Country Status (1)

Country Link
CN (1) CN113179445B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140065960A1 (en) * 2012-08-31 2014-03-06 Pantech Co., Ltd. Device and method for sharing content
US20140113550A1 (en) * 2012-10-22 2014-04-24 Samsung Electronics Co. Ltd. Apparatus, system, and method for transferring data across mobile terminals paired using near field communication (nfc)
WO2015157910A1 (en) * 2014-04-15 2015-10-22 华为技术有限公司 Application information sharing method and apparatus
CN106231432A (en) * 2016-07-29 2016-12-14 北京小米移动软件有限公司 The method and device of sharing video frequency
US20170050110A1 (en) * 2015-08-19 2017-02-23 Sony Computer Entertainment America Llc Local application quick start with cloud transitioning
WO2018119630A1 (en) * 2016-12-27 2018-07-05 华为技术有限公司 Interaction method, and terminal
CN109167950A (en) * 2018-10-25 2019-01-08 腾讯科技(深圳)有限公司 Video recording method, video broadcasting method, device, equipment and storage medium
CN110059516A (en) * 2019-04-24 2019-07-26 云南泰科瑟夫科技服务有限公司 Stranger's interaction and personal belongings management system based on label technique
CN110324672A (en) * 2019-05-30 2019-10-11 腾讯科技(深圳)有限公司 A kind of video data handling procedure, device, system and medium
CN110536155A (en) * 2019-09-09 2019-12-03 北京为快科技有限公司 A kind of method and device improving VR video interactive efficiency
US20200077137A1 (en) * 2018-08-31 2020-03-05 Beijing Youku Technology Co., Ltd. Video interaction method and apparatus
CN111541951A (en) * 2020-05-08 2020-08-14 腾讯科技(深圳)有限公司 Video-based interactive processing method and device, terminal and readable storage medium
WO2021023208A1 (en) * 2019-08-08 2021-02-11 华为技术有限公司 Data sharing method, graphical user interface, related device, and system
CN112423087A (en) * 2020-11-17 2021-02-26 北京字跳网络技术有限公司 Video interaction information display method and terminal equipment

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140065960A1 (en) * 2012-08-31 2014-03-06 Pantech Co., Ltd. Device and method for sharing content
US20140113550A1 (en) * 2012-10-22 2014-04-24 Samsung Electronics Co. Ltd. Apparatus, system, and method for transferring data across mobile terminals paired using near field communication (nfc)
WO2015157910A1 (en) * 2014-04-15 2015-10-22 华为技术有限公司 Application information sharing method and apparatus
EP3125509A1 (en) * 2014-04-15 2017-02-01 Huawei Technologies Co., Ltd Application information sharing method and apparatus
US20170050110A1 (en) * 2015-08-19 2017-02-23 Sony Computer Entertainment America Llc Local application quick start with cloud transitioning
CN106231432A (en) * 2016-07-29 2016-12-14 北京小米移动软件有限公司 The method and device of sharing video frequency
WO2018119630A1 (en) * 2016-12-27 2018-07-05 华为技术有限公司 Interaction method, and terminal
US20200077137A1 (en) * 2018-08-31 2020-03-05 Beijing Youku Technology Co., Ltd. Video interaction method and apparatus
CN109167950A (en) * 2018-10-25 2019-01-08 腾讯科技(深圳)有限公司 Video recording method, video broadcasting method, device, equipment and storage medium
CN110059516A (en) * 2019-04-24 2019-07-26 云南泰科瑟夫科技服务有限公司 Stranger's interaction and personal belongings management system based on label technique
CN110324672A (en) * 2019-05-30 2019-10-11 腾讯科技(深圳)有限公司 A kind of video data handling procedure, device, system and medium
WO2021023208A1 (en) * 2019-08-08 2021-02-11 华为技术有限公司 Data sharing method, graphical user interface, related device, and system
CN110536155A (en) * 2019-09-09 2019-12-03 北京为快科技有限公司 A kind of method and device improving VR video interactive efficiency
CN111541951A (en) * 2020-05-08 2020-08-14 腾讯科技(深圳)有限公司 Video-based interactive processing method and device, terminal and readable storage medium
CN112423087A (en) * 2020-11-17 2021-02-26 北京字跳网络技术有限公司 Video interaction information display method and terminal equipment

Also Published As

Publication number Publication date
CN113179445B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN110784752B (en) Video interaction method and device, computer equipment and storage medium
US11412307B2 (en) Interaction information processing method, client, service platform, and storage medium
CN111711560B (en) Resource message generation and resource acquisition method, device, equipment and storage medium
US20080229215A1 (en) Interaction In A Virtual Social Environment
US10904608B2 (en) Display control method, terminal, and non-transitory computer readable recording medium storing a computer program
WO2018196733A1 (en) Data sharing method and device, storage medium and electronic device
CN113965811A (en) Play control method and device, storage medium and electronic device
CN113573129B (en) Commodity object display video processing method and device
CN112770135B (en) Live broadcast-based content explanation method and device, electronic equipment and storage medium
CN113573092B (en) Live broadcast data processing method and device, electronic equipment and storage medium
CN113518240B (en) Live interaction, virtual resource configuration and virtual resource processing method and device
US20220407734A1 (en) Interaction method and apparatus, and electronic device
CN111314204A (en) Interaction method, device, terminal and storage medium
KR20220090411A (en) Method, apparatus and device of live game broadcasting
CN114430494B (en) Interface display method, device, equipment and storage medium
CN108288152B (en) Interaction method, terminal and storage medium for sharing information
CN114071171A (en) Resource acquisition method and device, computer equipment and storage medium
CN111669658B (en) Virtual article issuing method and device, computer equipment and storage medium
CN114205633A (en) Live broadcast interaction method and device, storage medium and electronic equipment
CN109819341B (en) Video playing method and device, computing equipment and storage medium
CN110719426B (en) Video message leaving method, related device and storage medium
US11165596B2 (en) System and method for inviting users to participate in activities based on interactive recordings
CN113179445B (en) 2023-07-14 Video sharing method based on interactive article and interactive article
CN111585865A (en) Data processing method, data processing device, computer readable storage medium and computer equipment
CN115695355A (en) Data sharing method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40047522

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant