CN106303733B - Method and device for playing live special effect information - Google Patents

Method and device for playing live special effect information

Info

Publication number
CN106303733B
CN106303733B (granted publication of application CN201610658817.7A)
Authority
CN
China
Prior art keywords
special effect
effect information
live
terminal
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610658817.7A
Other languages
Chinese (zh)
Other versions
CN106303733A (en)
Inventor
陈雪琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201610658817.7A priority Critical patent/CN106303733B/en
Publication of CN106303733A publication Critical patent/CN106303733A/en
Application granted granted Critical
Publication of CN106303733B publication Critical patent/CN106303733B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and a device for playing live special effect information, and belongs to the field of internet technology. The method comprises: obtaining, in a live broadcast interface displayed by a first terminal, a touch operation of a first user touching the first terminal screen; acquiring live special effect information matched with the touch operation according to the touch operation; and sending the live special effect information to a second terminal corresponding to a second user, the second terminal playing the live special effect information in a live interface displayed on the second terminal, wherein the second user is a user watching the live broadcast of the first user. The device comprises a first acquisition module, a second acquisition module and a playing module. With the method and the device, the first user can play live special effect information by performing a single touch operation on the first terminal screen, which simplifies the process of playing live special effect information and greatly improves playing efficiency.

Description

Method and device for playing live special effect information
Technical Field
The invention relates to the technical field of internet, in particular to a method and a device for playing live special-effect information.
Background
In recent years, live broadcast applications have become popular with users. Through a live broadcast application, a user (for ease of distinction, referred to as an anchor) can broadcast his or her performance live to viewers who are currently watching the anchor. During the live broadcast, in order to liven up the atmosphere, the anchor can play live special effect information to the viewers through the live broadcast application. For example, when the anchor tells a joke, the anchor may play a "crowd laughter" special effect to the viewers through the live application.
Currently, a live broadcast interface displayed on the anchor-side terminal includes an interface for triggering display of a special effect interface. When the anchor needs to play live special effect information, the anchor clicks this interface; the anchor-side terminal then displays the special effect interface, which includes a plurality of pieces of live special effect information; the anchor selects live special effect information in the special effect interface; the anchor-side terminal then plays the selected live special effect information in the live broadcast interface it displays and sends the live special effect information to the viewer-side terminal, and the viewer-side terminal plays the live special effect information in the live broadcast interface it displays.
In the process of implementing the invention, the inventor finds that the prior art has at least the following problems:
with this method, the anchor can play live special effect information to viewers only after a plurality of trigger operations, such as clicking the interface and selecting the live special effect information, so the efficiency of playing live special effect information is low.
Disclosure of Invention
In order to solve the problems in the prior art, embodiments of the present invention provide a method and an apparatus for playing live special effect information. The technical scheme is as follows:
in one aspect, an embodiment of the present invention provides a method for playing live special effect information, where the method includes:
obtaining, in a live broadcast interface displayed by a first terminal, a touch operation of a first user touching the first terminal screen;
acquiring live special effect information matched with the touch operation according to the touch operation;
and sending the live special effect information to a second terminal corresponding to a second user, and playing the live special effect information in a live interface displayed on the second terminal by the second terminal, wherein the second user is a user watching the live broadcast of the first user.
On the other hand, an embodiment of the present invention provides a device for playing live special effect information, where the device includes:
the first obtaining module is used for obtaining touch operation of a first user touching a first terminal screen in a live broadcast interface displayed by a first terminal;
the second acquisition module is used for acquiring live special effect information matched with the touch operation according to the touch operation;
and the playing module is used for sending the live special effect information to a second terminal corresponding to a second user, and the second terminal plays the live special effect information in a live interface displayed on the second terminal, wherein the second user is a user watching the live broadcast of the first user.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
in the embodiment of the invention, when a first user plays certain live special effect information to a second user, the first user performs touch operation matched with the live special effect information on a first terminal screen; the first terminal acquires the live special effect information according to the touch operation, sends the live special effect information to a second terminal corresponding to a second user, and plays the live special effect information in a live interface displayed on the second terminal through the second terminal, so that the first user can play the live special effect information only by executing one touch operation on a screen of the first terminal, the process of playing the live special effect information is simplified, and the playing efficiency is greatly improved.
Drawings
Fig. 1 is a flowchart of a method for playing live special effect information according to an embodiment of the present invention;
fig. 2-1 is a flowchart of a method for playing live special effect information according to an embodiment of the present invention;
fig. 2-2 is a schematic view of a live interface provided in an embodiment of the present invention;
fig. 2-3 is a schematic diagram of a special effect information page provided by an embodiment of the invention;
fig. 2-4 is a diagram illustrating selection of live special effect information in a special effect information page according to an embodiment of the present invention;
fig. 3-1 is a flowchart of a method for playing live special effect information according to an embodiment of the present invention;
fig. 3-2 is a schematic diagram of a region divided by a live interface according to an embodiment of the present invention;
fig. 3-3 is a diagram of a first user touching a first terminal screen with one finger according to an embodiment of the present invention;
fig. 3-4 is a diagram of a first user touching a first terminal screen with two fingers according to an embodiment of the present invention;
fig. 3-5 is a diagram of a first user touching a first terminal screen with three fingers according to an embodiment of the present invention;
fig. 3-6 is a diagram of a first user touching a first terminal screen with four fingers according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an apparatus for playing live special effect information according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a first terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
In a live broadcast application, a first user (the anchor) may broadcast his or her own performance live to a second user (a viewer) who is currently watching the first user; during the live broadcast, in order to liven up the atmosphere, the first terminal corresponding to the first user can play live special effect information to the second terminal corresponding to the second user through the method provided by the embodiment of the invention.
Before the first terminal plays the live special effect information to the second terminal, the first terminal can set a corresponding relation between touch operation and the live special effect information in advance; or the corresponding relation between the configuration information of the touch operation and the live special effect information is set in advance; when a first user plays certain live special effect information to a second user, the first user executes touch operation matched with the live special effect information or executes touch operation of which the configuration information is matched with the configuration information of the live special effect information on a first terminal screen; the first terminal acquires the live special effect information from the corresponding relation between the touch operation and the live special effect information according to the touch operation, or acquires the live special effect information from the corresponding relation between the configuration information and the live special effect information according to the configuration information of the touch operation, sends the live special effect information to the second terminal, and plays the live special effect information in a live interface displayed on the second terminal by the second terminal, so that a first user can trigger the first terminal to send the live special effect information to the second terminal only by executing one touch operation on a screen of the first terminal, and the playing efficiency of playing the live special effect information is improved.
The execution main body of the embodiment of the invention is a first terminal; the first terminal can be any terminal with a touch screen, such as a mobile phone or a tablet computer; the first terminal may include a processor, a memory, and a transmitter; the memory can be used for storing the corresponding relation between the touch operation and the live special effect information, and can also be used for storing the corresponding relation between the configuration information of the touch operation and the live special effect information; the processor can be used for acquiring touch operation of a first user touching a first terminal screen in a live broadcast interface displayed by the first terminal; the processor can also be used for determining live special effect information matched with the touch operation according to the corresponding relation between the touch operation and the stored touch operation and live special effect information; or determining live special effect information matched with the touch operation according to the corresponding relation between the touch operation and the stored configuration information and live special effect information of the touch operation; the transmitter may be configured to send the live special effect information to the second terminal, and play the live special effect information in a live interface displayed on the second terminal by the second terminal. In addition, the first terminal may further include an input device; the input device may be used to record live information of the first user, and the input device may be a microphone or a camera; the first terminal may further include an output device, where the output device may be used to play the live special effect information, and the output device may be a speaker or a player, etc.; the first terminal may further include components such as a power source and a sensor.
An embodiment of the present invention provides a method for playing live special effect information, see fig. 1, where the method includes:
step 101: and acquiring touch operation of a first user touching a first terminal screen in a live interface displayed by the first terminal.
Step 102: and acquiring live special effect information matched with the touch operation according to the touch operation.
Step 103: and sending the live special effect information to a second terminal corresponding to a second user, and playing the live special effect information in a live interface displayed on the second terminal by the second terminal, wherein the second user is a user watching the live broadcast of the first user.
In the embodiment of the invention, when a first user plays certain live special effect information to a second user, the first user performs touch operation matched with the live special effect information on a first terminal screen; the first terminal acquires the live special effect information according to the touch operation, sends the live special effect information to a second terminal corresponding to a second user, and plays the live special effect information in a live interface displayed on the second terminal through the second terminal, so that the first user can play the live special effect information only by executing one touch operation on a screen of the first terminal, the process of playing the live special effect information is simplified, and the playing efficiency is greatly improved.
Optionally, step 102 comprises:
and according to the touch operation, acquiring live special effect information matched with the touch operation from the corresponding relation between the touch operation and the live special effect information.
Optionally, the method further comprises:
acquiring live special effect information selected by a first user, and acquiring touch operation which is set by the first user and is matched with the selected live special effect information;
and establishing a corresponding relation between the selected live special effect information and the touch operation matched with the selected live special effect information.
Optionally, step 102 further includes:
and acquiring configuration information of the touch operation, and acquiring live special effect information matched with the touch operation from the corresponding relation between the configuration information and the live special effect information according to the configuration information.
Optionally, the method further comprises:
acquiring live special effect information selected by a first user, and acquiring configuration information of touch operation matched with the selected live special effect information, which is set by the first user;
and establishing a corresponding relation between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information.
Optionally, acquiring the live special effect information selected by the first user includes:
displaying a special effect information page, wherein the special effect information page comprises a plurality of live special effect information and a selection interface of each live special effect information in the plurality of live special effect information;
acquiring a selection interface selected by a first user in the special effect information page;
and acquiring the live special effect information corresponding to the selected selection interface.
Optionally, the configuration information of the touch operation includes one or more of the number of contact points, the touch trajectory, and the touch position of the touch operation.
Optionally, before step 101, further comprising:
acquiring an interface starting state for triggering the playing of the live special effect information; and when the interface starting state is the opening state, executing the step of acquiring the touch operation of the first user touching the first terminal screen.
Referring to fig. 2-1, an embodiment of the present invention provides a method for playing live special effect information, where an execution subject of the method is a first terminal; in the embodiment of the present invention, a description is given by taking an example of a correspondence relationship between a touch operation and live special effect information stored in advance in a first terminal.
Before playing live special effect information, a first terminal can establish a corresponding relation between a touch operation and live special effect information through the following steps 201 and 202; when live special effect information is played subsequently, steps 201 and 202 do not need to be executed, and steps 203 to 205 can be executed directly.
Step 201: the method comprises the steps that a first terminal obtains live special effect information selected by a first user, and obtains touch operation which is set by the first user and matched with the selected live special effect information.
The first terminal includes a plurality of pieces of live special effect information. The first user may select live special effect information in the first terminal as candidate live special effect information through the following steps 2011 to 2013, so that live special effect information can subsequently be selected from the candidates for playing. The live special effect information may be one or more of a music special effect, a live animation, and an animated broadcast.
2011: the first terminal displays a special effect information page, and the special effect information page comprises a plurality of live special effect information and a selection interface of each live special effect information in the plurality of live special effect information.
The live broadcast interface displayed by the first terminal includes a setting interface, which is used for triggering the first terminal to display the special effect information page. When the first user wants to select live special effect information, the first user clicks the setting interface; when the first terminal detects that the setting interface is triggered, it displays the special effect information page, which includes a plurality of pieces of live special effect information and a selection interface for each piece of live special effect information. The default state of each selection interface is the closed state. When the first user selects a certain piece of live special effect information, the first user can click the selection interface of that live special effect information; when the first terminal detects that the selection interface of the live special effect information is triggered and its current state is the closed state, it changes the current state of the selection interface from the closed state to the open state.
Certainly, the first user may also close some live special effect information in the special effect information page, at this time, the first user selects live special effect information to be closed, and clicks a selection interface of the live special effect information; and when the first terminal detects that the selection interface of the live special effect information is triggered and the current state of the selection interface of the live special effect information is the open state, the current state of the selection interface of the live special effect information is changed from the open state to the closed state.
For example, referring to fig. 2-2, a setting interface is included in the live interface displayed by the first terminal; when the first user clicks the setting interface, the first terminal displays a special effect information page. Referring to fig. 2-3, the special effect information page includes four pieces of live special effect information, namely crowd laughter, enthusiastic applause, acting cute, and Oh no, together with the selection interface corresponding to each of them, and the states of the selection interfaces default to the closed state.
2012: and the first terminal acquires a selection interface selected by the first user in the special effect information page.
The first terminal acquires the selection interfaces whose state is the open state in the special effect information page, and determines them as the selection interfaces selected by the first user in the special effect information page. The number of selected selection interfaces may be one or more.
For example, referring to fig. 2-4, the selection interfaces that the first terminal determines the first user has selected in the special effect information page are the selection interfaces corresponding to crowd laughter, enthusiastic applause, acting cute, and Oh no.
2013: and the first terminal acquires the live special effect information corresponding to the selected selection interface.
And the first terminal acquires the live special effect information corresponding to the selected selection interface from the corresponding relation between the selection interface and the live special effect information according to the selected selection interface.
For example, according to the selection interfaces selected by the first user in the special effect information page, namely the selection interfaces corresponding to crowd laughter, enthusiastic applause, acting cute, and Oh no, the first terminal obtains the corresponding live special effect information: crowd laughter, enthusiastic applause, acting cute, and Oh no.
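A minimal sketch of how the selection interfaces of the special effect information page could be tracked; the `EffectEntry` type and helper names are assumptions for illustration only.

```kotlin
// Each entry pairs one piece of live special effect information with the open/closed
// state of its selection interface (closed by default, as in fig. 2-3).
data class EffectEntry(val name: String, var selected: Boolean = false)

class EffectInfoPage {
    private val entries = listOf(
        EffectEntry("crowd laughter"),
        EffectEntry("enthusiastic applause"),
        EffectEntry("acting cute"),
        EffectEntry("Oh no")
    )

    // Clicking a selection interface toggles it between the closed and the open state.
    fun onSelectionInterfaceClicked(name: String) {
        entries.firstOrNull { it.name == name }?.let { it.selected = !it.selected }
    }

    // Steps 2012/2013: the live special effect information whose selection interface is open.
    fun selectedEffects(): List<String> = entries.filter { it.selected }.map { it.name }
}
```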
After acquiring the live special effect information selected by the first user, the first terminal displays a touch operation setting interface, and the first user can set touch operation in the touch operation setting interface; and the first terminal acquires the touch operation set in the touch operation setting interface by the first user and confirms the touch operation as the touch operation matched with the selected live special effect information.
If the selected live special effect information comprises a plurality of pieces, the touch operation setting interface displayed by the first terminal includes the plurality of pieces of live special effect information, and the first user can set, in sequence, the touch operation matched with each piece of live special effect information. The specific process can be as follows:
the method comprises the steps that a first user selects one live special effect information from a plurality of live special effect information, then a touch operation is set, the first terminal obtains the touch operation set by the first user, and the touch operation is determined to be the touch operation matched with the live special effect information; then, the first user can reselect one live special effect information from the plurality of live special effect information and reset a touch operation; and the first terminal acquires the touch operation reset by the first user, and determines the reset touch operation as the touch operation matched with the newly selected live special effect information until the touch operation corresponding to the plurality of live special effect information is set to be completed.
For example, the touch operations set by the first user to match crowd laughter, enthusiastic applause, acting cute, and Oh no are a click operation, a double-click operation, a leftward slide operation, and a rightward slide operation, respectively.
Step 202: and the first terminal establishes the corresponding relation between the selected live special effect information and the touch operation matched with the selected live special effect information.
The first terminal may establish a corresponding relationship between the selected live special effect information and the touch operation matched with the selected live special effect information in a form of a table, a database, or the like.
For example, the first terminal establishes the correspondence between the selected live special effect information (crowd laughter, enthusiastic applause, acting cute, and Oh no) and the matched touch operations (a click operation, a double-click operation, a leftward slide operation, and a rightward slide operation), as shown in Table 1 below:
TABLE 1
Live special effect information    Touch operation
Crowd laughter                     Click operation
Enthusiastic applause              Double-click operation
Acting cute                        Leftward slide operation
Oh no                              Rightward slide operation
......                             ......
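Step 202 only requires that the correspondence be persisted in some form (a table, a database, etc.). As one possible sketch, assuming an Android terminal, the mapping could be kept in memory and mirrored to SharedPreferences; the preference file name and key prefix below are illustrative assumptions.

```kotlin
import android.content.Context
import android.content.SharedPreferences

// Sketch of step 202: storing the "touch operation -> live special effect information" mapping.
class EffectMappingStore(context: Context) {

    private val prefs: SharedPreferences =
        context.getSharedPreferences("live_effect_mapping", Context.MODE_PRIVATE)
    private val mapping = mutableMapOf<String, String>()

    // Establish the correspondence between a touch operation key and the selected effect.
    fun put(operationKey: String, effectName: String) {
        mapping[operationKey] = effectName
        prefs.edit().putString("op_$operationKey", effectName).apply()
    }

    // Step 204 later reads the effect matched with a touch operation from this mapping.
    fun effectFor(operationKey: String): String? =
        mapping[operationKey] ?: prefs.getString("op_$operationKey", null)
}
```

Populated with the entries of Table 1, this would map "CLICK" to crowd laughter, "DOUBLE_CLICK" to enthusiastic applause, "SLIDE_LEFT" to acting cute, and "SLIDE_RIGHT" to Oh no.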
Further, steps 201 and 202 describe a process in which the first user sets, in the first terminal, the touch operation corresponding to live special effect information; the touch operation corresponding to live special effect information may also be configured by a developer, and the specific process may be as follows:
for each live special effect information in all live special effect information included by the first terminal, the first terminal acquires touch operation matched with the live special effect information and set by a developer, and establishes a corresponding relation between the live special effect information and the touch operation.
If the touch operation corresponding to the live special effect information is configured by a developer, the steps 201 and 202 may be replaced by:
the method comprises the steps that a first terminal obtains live special effect information selected by a first user; and according to the selected live special effect information, acquiring the touch operation corresponding to the selected live special effect information from the corresponding relation between the live special effect information and the touch operation configured by the developer, and displaying the touch operation corresponding to the selected live special effect information.
In order to prevent misoperation by the first user, the first terminal can be provided with an interface; when the starting state of the interface is the open state, live special effect information is played according to the method provided by the embodiment of the invention, and when the starting state of the interface is the closed state, the process ends. The specific process can be as follows:
The first terminal obtains the starting state of the interface used for triggering the playing of live special effect information; when the starting state of the interface is the open state, step 203 is executed; when the starting state of the interface is the closed state, the process ends.
The live interface displayed by the first terminal includes this interface, and the first user can click the interface to set the interface starting state to the closed state or the open state. The specific process can be as follows:
When the first terminal detects that the interface is triggered, the first terminal acquires the starting state of the interface; if the starting state of the interface is the open state, the first terminal sets it from the open state to the closed state; if the starting state of the interface is the closed state, the first terminal sets it from the closed state to the open state.
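A minimal sketch of the anti-misoperation switch described above, assuming the interface starting state is simply a boolean flag on the first terminal:

```kotlin
// Touch operations are interpreted as live special effect triggers only while the
// interface starting state is "open"; clicking the interface toggles the state.
class EffectTriggerSwitch {
    var open: Boolean = false
        private set

    fun toggle() {
        open = !open
    }
}

// In the touch handler:
//   if (!effectTriggerSwitch.open) return false  // ignore, fall back to normal live UI handling
```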
Step 203: in a live broadcast interface displayed by a first terminal, the first terminal obtains touch operation of a first user touching a first terminal screen.
The first terminal records the first user in real time to obtain live broadcast information, plays the live broadcast information in the live broadcast interface it displays, and sends the live broadcast information to the second terminal corresponding to the second user in real time; the second terminal receives the live broadcast information sent by the first terminal and plays it in the live broadcast interface displayed by the second terminal, where the second user is a user watching the live broadcast of the first user.
In the live broadcast process, when the first user wants to play certain live special effect information, the first user performs the touch operation matched with that live special effect information in the live broadcast interface displayed by the first terminal; the first terminal obtains, in the live broadcast interface it displays, the touch operation of the first user touching the first terminal screen.
For example, when the first user wants to play the crowd laughter effect, the first user clicks the first terminal screen in the live interface displayed by the first terminal; the first terminal obtains the click operation of the first user clicking the first terminal screen.
Step 204: and the first terminal acquires live special effect information matched with the touch operation from the corresponding relation between the touch operation and the live special effect information according to the touch operation.
For example, according to the click operation, the first terminal acquires from Table 1 the live special effect information matched with the click operation, namely crowd laughter.
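As an illustration of how the first terminal might reduce a raw touch to one of the operation keys used in Table 1 (click, double click, slide left, slide right). The thresholds and the simplified classification below are assumptions of this sketch, not something prescribed by the embodiment; in particular, a real implementation would normally defer the single click briefly so that it is not reported before a double click.

```kotlin
import android.view.MotionEvent
import kotlin.math.abs

// Very small classifier mapping a completed touch gesture to a Table-1 operation key.
class TouchOperationClassifier(
    private val swipeThresholdPx: Float = 100f,
    private val doubleClickWindowMs: Long = 300L
) {
    private var downX = 0f
    private var lastTapTime = 0L

    /** Returns "CLICK", "DOUBLE_CLICK", "SLIDE_LEFT", "SLIDE_RIGHT", or null while undecided. */
    fun classify(event: MotionEvent): String? {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> downX = event.x
            MotionEvent.ACTION_UP -> {
                val dx = event.x - downX
                if (abs(dx) >= swipeThresholdPx) {
                    return if (dx < 0) "SLIDE_LEFT" else "SLIDE_RIGHT"
                }
                val isDoubleClick = event.eventTime - lastTapTime <= doubleClickWindowMs
                lastTapTime = event.eventTime
                return if (isDoubleClick) "DOUBLE_CLICK" else "CLICK"
            }
        }
        return null
    }
}
```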
Step 205: and the first terminal sends the live special effect information to a second terminal corresponding to a second user, and the second terminal plays the live special effect information in a live interface displayed on the second terminal.
The first terminal sends the live special effect information to the second terminal; the second terminal receives the live special effect information sent by the first terminal and plays the live special effect information in a live interface displayed on the second terminal; and when the first terminal sends the live special effect information to the second terminal, the first terminal can also play the live special effect information in a live interface displayed on the first terminal.
The first terminal may send the live special effect information to the second terminal through the live server, and the specific process may be as follows:
the first terminal sends the live special effect information and the user identification of the first user to a live server; the live broadcast server receives the live broadcast special effect information and the user identification of the first user sent by the first terminal, acquires the user identification of a second user watching the live broadcast of the first user currently according to the user identification of the first user, and sends the live broadcast special effect information to a second terminal corresponding to the second user according to the user identification of the second user; and the second terminal receives the live special effect information sent by the live server and plays the live special effect information in a currently displayed live interface.
The user identification of the first user may be a user account of the first user registered in the live server, and the user identification of the second user may be a user account of the second user registered in the live server.
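A rough sketch of the relay just described: the first terminal sends the live special effect information together with the first user's identification, and the live server forwards it to every second user currently watching that first user. The message shape and the `viewersOf`/`deliver` hooks are hypothetical; the patent does not specify a wire format.

```kotlin
// Payload sent from the first terminal to the live server (shape is an assumption).
data class EffectMessage(val anchorUserId: String, val effectId: String)

// Server-side sketch: look up the viewers of the anchor and forward the effect to each one.
class LiveServerRelay(
    private val viewersOf: (anchorUserId: String) -> List<String>,       // second users' IDs
    private val deliver: (viewerUserId: String, effectId: String) -> Unit // push to a terminal
) {
    fun onEffectReceived(message: EffectMessage) {
        for (viewerId in viewersOf(message.anchorUserId)) {
            deliver(viewerId, message.effectId) // each second terminal then plays the effect
        }
    }
}
```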
For example, the first terminal sends the crowd laughter effect to the second terminal; the second terminal receives the crowd laughter effect sent by the first terminal and plays it in the live interface displayed on the second terminal; the first terminal also plays the crowd laughter effect in the live interface displayed on the first terminal.
In the embodiment of the invention, when a first user plays certain live special effect information to a second user, the first user performs touch operation matched with the live special effect information on a first terminal screen; the first terminal acquires the live special effect information according to the touch operation, sends the live special effect information to a second terminal corresponding to a second user, and plays the live special effect information in a live interface displayed on the second terminal through the second terminal, so that the first user can play the live special effect information only by executing one touch operation on a screen of the first terminal, the process of playing the live special effect information is simplified, and the playing efficiency is greatly improved.
Referring to fig. 3-1, an embodiment of the present invention provides a method for playing live special effect information, where an execution subject of the method is a first terminal; in the embodiment of the present invention, a description is given by taking an example of a correspondence relationship between configuration information of a touch operation and live special effect information stored in advance in a first terminal.
Before playing live special effect information, a first terminal can establish a corresponding relation between the configuration information of a touch operation and live special effect information through the following steps 301 and 302, where the configuration information of the touch operation comprises one or more of the number of contact points, the touch trajectory, and the touch position of the touch operation; when live special effect information is played subsequently, steps 301 and 302 do not need to be executed, and steps 303 to 305 can be executed directly.
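For illustration only, the configuration information named here (number of contact points, touch trajectory, touch position) could be modeled as a small value type; the field and type names are assumptions of this sketch.

```kotlin
// Hypothetical model of the configuration information of a touch operation.
// A field is null when that dimension is not used for matching.
data class TouchConfiguration(
    val contactPoints: Int? = null,     // number of fingers touching the screen
    val trajectory: Trajectory? = null, // recognized shape of the touch track
    val region: Int? = null             // index of the screen region that was touched
)

enum class Trajectory { CIRCLE, TRIANGLE, RECTANGLE, SQUARE }
```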
Step 301: the method comprises the steps that a first terminal obtains live special effect information selected by a first user, and obtains configuration information of touch operation matched with the selected live special effect information and set by the first user.
The step of acquiring the live special effect information selected by the first user by the first terminal is the same as the process of acquiring the live special effect information selected by the first user in step 201, and details are not repeated here.
The step of the first terminal obtaining the configuration information of the touch operation set by the first user and matched with the selected live special effect information may be:
after acquiring the live special effect information selected by the first user, the first terminal displays a configuration information setting interface, and the first user can configure configuration information of touch operation in the configuration information setting interface; and the first terminal acquires the configuration information configured in the configuration information setting interface by the first user, and determines the configuration information as the configuration information of the touch operation matched with the selected live special effect information.
If the selected live special effect information comprises a plurality of pieces, the configuration information setting interface displayed by the first terminal includes the plurality of pieces of live special effect information, and the first user can set, in sequence, the configuration information of the touch operation matched with each piece of live special effect information. The specific process can be as follows:
the method comprises the steps that a first user selects one piece of live special effect information from a plurality of pieces of live special effect information, then one piece of configuration information is set, a first terminal obtains the configuration information set by the first user, and the configuration information is determined to be the configuration information matched with the live special effect information; then, the first user can reselect one piece of live special effect information from the plurality of pieces of live special effect information and then reset one piece of configuration information; and the first terminal acquires the configuration information reset by the first user, and determines the reset configuration information as the configuration information matched with the reselected live special effect information until the configuration information corresponding to the plurality of live special effect information is set.
For example, when the configuration information is the number of contact points of the touch operation, the configuration information of the touch operations set by the first user for crowd laughter, enthusiastic applause, acting cute, and Oh no is one contact point (one finger), two contact points (two fingers), three contact points (three fingers), and four contact points (four fingers), respectively.
For another example, when the configuration information is the touch trajectory, the touch trajectories set by the first user for crowd laughter, enthusiastic applause, acting cute, and Oh no are a circle, a triangle, a rectangle, and a square, respectively.
For another example, when the configuration information is the touch position, the first terminal equally divides the first terminal screen, from top to bottom, into four regions, namely a first region, a second region, a third region, and a fourth region. The touch positions set by the first user for crowd laughter, enthusiastic applause, acting cute, and Oh no are the first region, the second region, the third region, and the fourth region of the first terminal screen, respectively, as shown in fig. 3-2.
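A minimal sketch of the touch-position variant: the screen is divided evenly into four horizontal regions from top to bottom (fig. 3-2), and the region index is derived from the y coordinate of the touch. The function names and the example mapping mirror Table 4 and are illustrative assumptions.

```kotlin
import android.view.MotionEvent
import android.view.View

// Maps a touch to one of four equal horizontal regions (0 = top region ... 3 = bottom region).
fun regionOf(event: MotionEvent, liveInterface: View): Int {
    val regionHeight = liveInterface.height / 4f
    return (event.y / regionHeight).toInt().coerceIn(0, 3)
}

// Example correspondence from Table 4 (region index -> live special effect information).
val effectByRegion = mapOf(
    0 to "crowd laughter",
    1 to "enthusiastic applause",
    2 to "acting cute",
    3 to "Oh no"
)
```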
Step 302: and the first terminal establishes the corresponding relation between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information.
The first terminal may establish a corresponding relationship between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information in a table form or a database form, and in the embodiment of the present invention, a form in which the first terminal establishes a corresponding relationship between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information is not particularly limited.
For example, when the configuration information is the number of contact points of the touch operation, the first terminal establishes the correspondence between the selected live special effect information (crowd laughter, enthusiastic applause, acting cute, and Oh no) and the configuration information of the matched touch operations (one finger, two fingers, three fingers, and four fingers), as shown in Table 2 below:
TABLE 2
Live special effect information    Configuration information (number of contact points)
Crowd laughter                     One (one finger)
Enthusiastic applause              Two (two fingers)
Acting cute                        Three (three fingers)
Oh no                              Four (four fingers)
......                             ......
For another example, when the configuration information is the touch trajectory, the first terminal establishes the correspondence between the selected live special effect information (crowd laughter, enthusiastic applause, acting cute, and Oh no) and the configuration information of the matched touch operations (a circle, a triangle, a rectangle, and a square), as shown in Table 3 below:
TABLE 3
Live special effect information    Configuration information (touch trajectory)
Crowd laughter                     Circle
Enthusiastic applause              Triangle
Acting cute                        Rectangle
Oh no                              Square
......                             ......
For another example, when the configuration information is the touch position, the first terminal establishes the correspondence between the selected live special effect information (crowd laughter, enthusiastic applause, acting cute, and Oh no) and the configuration information of the matched touch operations (the first region, the second region, the third region, and the fourth region), as shown in Table 4 below:
TABLE 4
Live special effect information    Configuration information (touch position)
Crowd laughter                     First region
Enthusiastic applause              Second region
Acting cute                        Third region
Oh no                              Fourth region
......                             ......
Further, steps 301 and 302 describe a process in which the first user sets, in the first terminal, the configuration information of the touch operation corresponding to live special effect information; the configuration information of the touch operation corresponding to live special effect information may also be configured by a developer, and the specific process may be as follows:
for each live special effect information in all live special effect information included by the first terminal, the first terminal acquires configuration information of touch operation matched with the live special effect information and set by a developer, and establishes a corresponding relation between the live special effect information and the configuration information of the touch operation.
If the configuration information corresponding to the live special effect information is configured by a developer, step 301 and step 302 may be replaced by:
the first terminal obtains live special effect information selected by a first user, obtains configuration information corresponding to the selected live special effect information from a corresponding relation between the live special effect information configured by developers and the configuration information according to the selected live special effect information, and displays the configuration information corresponding to the selected live special effect information.
In order to prevent misoperation by the first user, the first terminal can be provided with an interface; when the starting state of the interface is the open state, live special effect information is played according to the method provided by the embodiment of the invention, and when the starting state of the interface is the closed state, the process ends. The specific process can be as follows:
The first terminal obtains the starting state of the interface used for triggering the playing of live special effect information; when the starting state of the interface is the open state, step 303 is executed; when the starting state of the interface is the closed state, the process ends.
The live interface displayed by the first terminal includes this interface, and the first user can click the interface to set the interface starting state to the closed state or the open state. The specific process can be as follows:
When the first terminal detects that the interface is triggered, the first terminal acquires the starting state of the interface; if the starting state of the interface is the open state, the first terminal sets it from the open state to the closed state; if the starting state of the interface is the closed state, the first terminal sets it from the closed state to the open state.
Step 303: the method comprises the steps that a first terminal obtains touch operation of a first user touching a first terminal screen in a live broadcast interface displayed by the first terminal.
In the live broadcast process, when the first user wants to play certain live special effect information, the first user performs, in the live broadcast interface displayed by the first terminal, a touch operation whose configuration information matches that live special effect information; the first terminal obtains, in the live broadcast interface it displays, the touch operation of the first user touching the first terminal screen.
For example, when the first user wants to play the crowd laughter effect, the first user touches the first terminal screen with one finger, see fig. 3-3; the first terminal obtains the touch operation of the first user touching the first terminal screen with one finger.
For another example, when the first user wants to play the enthusiastic applause effect, the first user touches the first terminal screen with two fingers, referring to fig. 3-4; the first terminal obtains the touch operation of the first user touching the first terminal screen with two fingers.
For another example, when the first user wants to play the acting cute effect, the first user touches the first terminal screen with three fingers, referring to fig. 3-5; the first terminal obtains the touch operation of the first user touching the first terminal screen with three fingers.
For another example, when the first user wants to play the Oh no effect, the first user touches the first terminal screen with four fingers, referring to fig. 3-6; the first terminal obtains the touch operation of the first user touching the first terminal screen with four fingers.
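A sketch of how steps 303 and 304 could work for the contact-point variant: the first terminal tracks the maximum number of pointers seen during the gesture and, when the last finger lifts, uses that count to look up the correspondence of Table 2. Class and parameter names are illustrative assumptions.

```kotlin
import android.view.MotionEvent

// Tracks how many fingers touched the screen during one gesture and resolves the matched
// live special effect information from a "contact points -> effect" correspondence (Table 2).
class ContactPointMatcher(private val effectByFingerCount: Map<Int, String>) {

    private var maxPointers = 0

    /** Feed every MotionEvent; returns the matched effect when the gesture ends, else null. */
    fun onTouchEvent(event: MotionEvent): String? {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> maxPointers = 1
            MotionEvent.ACTION_POINTER_DOWN ->
                maxPointers = maxOf(maxPointers, event.pointerCount)
            MotionEvent.ACTION_UP -> {
                val effect = effectByFingerCount[maxPointers]
                maxPointers = 0
                return effect
            }
        }
        return null
    }
}

// Example from Table 2:
//   ContactPointMatcher(mapOf(1 to "crowd laughter", 2 to "enthusiastic applause",
//                             3 to "acting cute", 4 to "Oh no"))
```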
Step 304: and the first terminal acquires the configuration information of the touch operation and acquires live special effect information matched with the touch operation from the corresponding relation between the configuration information and the live special effect information according to the configuration information.
For example, from the touch operation of the first user touching the first terminal screen with one finger, the first terminal obtains the configuration information of the touch operation as one contact point (one finger), and according to this configuration information acquires the live special effect information matched with the touch operation from Table 2, namely crowd laughter.
For another example, from the touch operation of the first user touching the first terminal screen with two fingers, the first terminal obtains the configuration information of the touch operation as two contact points (two fingers), and according to this configuration information acquires the live special effect information matched with the touch operation from Table 2, namely enthusiastic applause.
For another example, from the touch operation of the first user touching the first terminal screen with three fingers, the first terminal obtains the configuration information of the touch operation as three contact points (three fingers), and according to this configuration information acquires the live special effect information matched with the touch operation from Table 2, namely acting cute.
For another example, from the touch operation of the first user touching the first terminal screen with four fingers, the first terminal obtains the configuration information of the touch operation as four contact points (four fingers), and according to this configuration information acquires the live special effect information matched with the touch operation from Table 2, namely Oh no.
Step 305: and the first terminal sends the live special effect information to a second terminal corresponding to a second user, and the second terminal plays the live special effect information in a live interface displayed on the second terminal.
This step is the same as step 205, and is not described herein again.
In the embodiment of the invention, when a first user plays certain live special effect information to a second user, the first user performs touch operation on a first terminal screen, wherein the configuration information is matched with the live special effect information; the first terminal acquires the live special effect information according to the configuration information, sends the live special effect information to a second terminal corresponding to a second user, and plays the live special effect information in a live interface displayed on the second terminal through the second terminal, so that the first user can play the live special effect information only by executing one touch operation on a screen of the first terminal, the process of playing the live special effect information is simplified, and the playing efficiency is greatly improved.
The embodiment of the invention provides a device for playing live special effect information, which can be a first terminal in the embodiment and is used for executing the method for playing the live special effect information.
Referring to fig. 4, wherein the apparatus comprises:
a first obtaining module 401, configured to obtain, in a live interface displayed by a first terminal, a touch operation that a first user touches a screen of the first terminal;
a second obtaining module 402, configured to obtain, according to the touch operation, live special effect information matched with the touch operation;
the playing module 403 is configured to send the live special effect information to a second terminal corresponding to a second user, and play the live special effect information in a live interface displayed on the second terminal by the second terminal, where the second user is a user watching a live broadcast of the first user.
Optionally, the second obtaining module 402 is configured to obtain, according to the touch operation, live special effect information matched with the touch operation from a corresponding relationship between the touch operation and the live special effect information.
Optionally, the apparatus further comprises:
the third acquisition module is used for acquiring the live special effect information selected by the first user and acquiring touch operation which is set by the first user and matched with the selected live special effect information;
and the first establishing module is used for establishing the corresponding relation between the selected live special effect information and the touch operation matched with the selected live special effect information.
Optionally, the second obtaining module 402 is configured to obtain configuration information of the touch operation, and obtain live special effect information matched with the touch operation from a corresponding relationship between the configuration information and the live special effect information according to the configuration information.
Optionally, the apparatus further comprises:
the fourth acquisition module is used for acquiring live special effect information selected by the first user and acquiring configuration information of touch operation matched with the selected live special effect information, wherein the configuration information is set by the first user;
and the second establishing module is used for establishing the corresponding relation between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information.
Optionally, the third obtaining module or the fourth obtaining module is further configured to display a special effect information page, where the special effect information page includes a plurality of live special effect information and a selection interface of each live special effect information in the plurality of live special effect information; acquiring a selection interface selected by a first user in the special effect information page; and acquiring the live special effect information corresponding to the selected selection interface.
Optionally, the configuration information includes one or more of a number of contact points of the touch operation, a touch trajectory, and a touch position.
Optionally, the apparatus further comprises:
the fifth acquisition module is used for acquiring an interface starting state for triggering the playing of the live special effect information; when the interface start state is an open state, the first obtaining module 401 is configured to obtain a touch operation of a first user touching a first terminal screen.
In the embodiment of the invention, when a first user plays certain live special effect information to a second user, the first user performs touch operation matched with the live special effect information on a first terminal screen; the first terminal acquires the live special effect information according to the touch operation, sends the live special effect information to a second terminal corresponding to a second user, and plays the live special effect information in a live interface displayed on the second terminal through the second terminal, so that the first user can play the live special effect information only by executing one touch operation on a screen of the first terminal, the process of playing the live special effect information is simplified, and the playing efficiency is greatly improved.
Fig. 5 is a schematic structural diagram of a first terminal according to an embodiment of the present invention. The first terminal may be configured to implement the functions executed by the first terminal in the method for playing live special effect information shown in the foregoing embodiments. Specifically:
the first terminal 500 may include RF (Radio Frequency) circuitry 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, audio circuitry 160, a transmission module 170, a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the structure shown in Fig. 5 does not limit the first terminal, which may include more or fewer components than shown, combine some components, or arrange the components differently. Wherein:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, it receives downlink information from a base station and delivers it to the one or more processors 180 for processing, and it transmits uplink data to the base station. In general, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 110 may also communicate with a network and other terminals through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), and the like.
The memory 120 may be used to store software programs and modules, such as the software programs and modules corresponding to the first terminal shown in the above exemplary embodiments, and the processor 180 executes various functional applications and data processing, such as video-based interaction, by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the first terminal 500. Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or accessory) and drive a corresponding connection apparatus according to a preset program. Optionally, the touch-sensitive surface 131 may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 180, and receives and executes commands sent by the processor 180. The touch-sensitive surface 131 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave type. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
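As a rough, assumed illustration of the detection-then-conversion split described above, the sketch below reduces a sequence of raw touch samples to touch point coordinates plus a coarse gesture label; the thresholds and names are invented for the example.

```kotlin
import kotlin.math.abs

// Assumed raw sample reported by the touch detection part, and a tiny
// controller that reduces a sequence of samples to coordinates plus a label.
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

fun classify(samples: List<TouchSample>): Pair<Pair<Float, Float>, String> {
    val first = samples.first()
    val last = samples.last()
    val dx = last.x - first.x
    val dy = last.y - first.y
    val label = when {
        dx * dx + dy * dy < 25f -> "tap"                       // barely moved: treat as a tap
        abs(dx) > abs(dy) -> if (dx > 0) "swipe-right" else "swipe-left"
        else -> if (dy > 0) "swipe-down" else "swipe-up"
    }
    return (last.x to last.y) to label
}

fun main() {
    val samples = listOf(TouchSample(100f, 500f, 0), TouchSample(400f, 520f, 120))
    println(classify(samples))   // ((400.0, 520.0), swipe-right)
}
```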
The display unit 140 may be used to display information input by or provided to the user, as well as various graphical user interfaces of the first terminal 500, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 140 may include a display panel 141, and optionally the display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141; when a touch operation is detected on or near the touch-sensitive surface 131, it is transmitted to the processor 180 to determine the type of the touch event, and the processor 180 then provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in Fig. 5 the touch-sensitive surface 131 and the display panel 141 are shown as two separate components to implement the input and output functions, in some embodiments the touch-sensitive surface 131 may be integrated with the display panel 141 to implement the input and output functions.
The first terminal 500 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 141 and/or the backlight when the first terminal 500 moves to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be further configured in the first terminal 500, detailed descriptions thereof are omitted.
The audio circuit 160, the speaker 161, and the microphone 162 may provide an audio interface between the user and the first terminal 500. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. The audio data is then output to the processor 180 for processing and sent through the RF circuit 110 to, for example, another terminal, or output to the memory 120 for further processing. The audio circuit 160 may also include an earphone jack to provide communication between a peripheral headset and the first terminal 500.
The transmission module 170 provides the user with wireless or wired broadband Internet access, through which the first terminal 500 may help the user send and receive e-mails, browse web pages, access streaming media, and so on. Although Fig. 5 shows the transmission module 170, it is understood that the module is not an essential part of the first terminal 500 and may be omitted as needed without departing from the essence of the invention.
The processor 180 is the control center of the first terminal 500. It links the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the first terminal 500 and processes data by running or executing the software programs and/or modules stored in the memory 120 and calling the data stored in the memory 120, thereby monitoring the terminal as a whole. Optionally, the processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 180.
The first terminal 500 further includes a power supply 190 (e.g., a battery) for supplying power to the various components. Preferably, the power supply is logically connected to the processor 180 via a power management system, so that charging, discharging, and power consumption are managed through the power management system. The power supply 190 may also include one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the first terminal 500 may further include a camera, a Bluetooth module, and the like, which are not described herein. Specifically, in this embodiment, the display unit of the first terminal is a touch screen display, and the first terminal further includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
acquiring, in a live broadcast interface displayed by the first terminal, a touch operation of a first user touching the first terminal screen;
acquiring live special effect information matched with the touch operation according to the touch operation;
and sending the live special effect information to a second terminal corresponding to a second user, and playing the live special effect information in a live interface displayed on the second terminal by the second terminal, wherein the second user is a user watching the live broadcast of the first user.
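Purely as an assumed stand-in (the patent does not specify a transmission format), the sketch below shows one way the matched effect could be encoded for sending to the second terminal and decoded there; the message type, field names, and encoding are all hypothetical.

```kotlin
// Illustrative only: a simple text encoding for sending the matched effect to
// the second terminal. The real transmission format is not specified here.
data class EffectMessage(val effectId: String, val fromUser: String)

fun encode(msg: EffectMessage): String = "effect=${msg.effectId};from=${msg.fromUser}"

fun decode(line: String): EffectMessage {
    val fields = line.split(";").associate {
        val (k, v) = it.split("=", limit = 2)
        k to v
    }
    return EffectMessage(fields.getValue("effect"), fields.getValue("from"))
}

fun main() {
    val wire = encode(EffectMessage("confetti", "anchor-1"))
    println(wire)            // effect=confetti;from=anchor-1
    println(decode(wire))    // EffectMessage(effectId=confetti, fromUser=anchor-1)
}
```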
Optionally, the obtaining, according to the touch operation, live special effect information matched with the touch operation includes:
and according to the touch operation, acquiring live special effect information matched with the touch operation from the corresponding relation between the touch operation and the live special effect information.
Optionally, the method further comprises:
acquiring live special effect information selected by the first user, and acquiring touch operation which is set by the first user and matched with the selected live special effect information;
and establishing a corresponding relation between the selected live special effect information and the touch operation matched with the selected live special effect information.
Optionally, the obtaining, according to the touch operation, live special effect information matched with the touch operation includes:
and acquiring configuration information of the touch operation, and acquiring live special effect information matched with the touch operation from the corresponding relation between the configuration information and the live special effect information according to the configuration information.
Optionally, the method further comprises:
acquiring live special effect information selected by the first user, and acquiring configuration information of touch operation matched with the selected live special effect information, which is set by the first user;
and establishing a corresponding relation between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information.
Optionally, the obtaining of the live special effect information selected by the first user includes:
displaying a special effect information page, wherein the special effect information page comprises a plurality of live special effect information and a selection interface of each live special effect information in the plurality of live special effect information;
acquiring a selection interface selected by the first user in the special effect information page;
and acquiring the live special effect information corresponding to the selected selection interface.
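The following sketch illustrates this selection flow with an assumed model of the special effect information page, where each live effect has its own selection switch and the selected effect is the one whose switch the first user turned on; the names are illustrative only.

```kotlin
// Assumed model of the special effect information page: each live effect has
// its own selection interface, and the selected effect is the one whose
// interface the first user turned on.
data class EffectEntry(val effectId: String, var selected: Boolean = false)

class EffectPage(private val entries: List<EffectEntry>) {
    fun select(effectId: String) {
        entries.forEach { it.selected = it.effectId == effectId }   // single selection
    }
    fun selectedEffect(): String? = entries.firstOrNull { it.selected }?.effectId
}

fun main() {
    val page = EffectPage(listOf(EffectEntry("hearts"), EffectEntry("confetti"), EffectEntry("fireworks")))
    page.select("confetti")                 // first user taps the selection interface of "confetti"
    println(page.selectedEffect())          // confetti
}
```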
Optionally, the configuration information of the touch operation includes one or more of the number of contact points, the touch trajectory, and the touch position of the touch operation.
Optionally, before the obtaining of the touch operation of the first user touching the first terminal screen, the method further includes:
acquiring an interface starting state for triggering the playing of the live special effect information; and when the interface starting state is the opening state, executing the step of acquiring the touch operation of the first user touching the first terminal screen.
In the embodiment of the invention, when a first user wants to play certain live special effect information to a second user, the first user performs, on the first terminal screen, a touch operation matched with that live special effect information. The first terminal acquires the live special effect information according to the touch operation, sends it to a second terminal corresponding to the second user, and the second terminal plays it in the live interface displayed on the second terminal. The first user can therefore play live special effect information by performing a single touch operation on the first terminal screen, which simplifies the process of playing the live special effect information and greatly improves playing efficiency.
It should be noted that, when the apparatus for playing live special effect information provided by the above embodiment plays live special effect information, the division into the above functional modules is merely an example; in practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for playing live special effect information provided by the above embodiment and the embodiments of the method for playing live special effect information belong to the same concept; the specific implementation process is described in detail in the method embodiments and is not repeated here.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. A method of playing live special effect information, the method comprising:
acquiring an operation of a first user on a selection interface in a special effect information page displayed by a first terminal, wherein the special effect information page comprises a plurality of live special effect information and a selection interface of each live special effect information;
determining a selection interface in an open state according to the operation on the selection interface;
determining the live special effect information corresponding to the selection interface in the open state as the live special effect information selected by the first user;
acquiring configuration information of touch operation, which is set by the first user and matched with the selected live special effect information, in a touch operation setting interface displayed by the first terminal;
establishing a corresponding relation between the selected live special effect information and configuration information of touch operation matched with the selected live special effect information;
the method comprises the steps that touch operation of a first user touching a first terminal screen is obtained in a live broadcast interface displayed by a first terminal;
acquiring configuration information of the touch operation, and acquiring live special effect information matched with the touch operation from a corresponding relation between the configuration information and live special effect information according to the configuration information, wherein the configuration information of the touch operation comprises touch positions of the touch operation, each piece of live special effect information corresponds to one touch position, and the live special effect information matched with the touch operation is one of the live special effect information selected by the user;
and sending the live special effect information to a second terminal corresponding to a second user, and playing the live special effect information in a live interface displayed on the second terminal by the second terminal, wherein the second user is a user watching the live broadcast of the first user.
2. The method of claim 1, wherein the configuration information of the touch operation comprises one or more of a number of contact points and a touch trajectory of the touch operation.
3. The method according to claim 1, wherein before the obtaining of the touch operation of the first user touching the first terminal screen, further comprising:
acquiring an interface starting state for triggering the playing of the live special effect information; and when the interface starting state is the opening state, executing the step of acquiring the touch operation of the first user touching the first terminal screen.
4. An apparatus for playing live special effect information, the apparatus comprising:
a module for obtaining an operation of a first user on a selection interface in a special effect information page displayed by a first terminal, wherein the special effect information page comprises a plurality of live special effect information and a selection interface of each live special effect information; a module for determining the selection interface in the open state according to the operation on the selection interface;
a module for determining live special effect information corresponding to the selection interface in the open state as the live special effect information selected by the first user;
a fourth obtaining module, configured to obtain, in a touch operation setting interface displayed by the first terminal, configuration information of a touch operation that is set by the first user and matches the selected live special effect information;
the second establishing module is used for establishing the corresponding relation between the selected live special effect information and the configuration information of the touch operation matched with the selected live special effect information;
the first obtaining module is used for obtaining touch operation of a first user touching a first terminal screen in a live broadcast interface displayed by a first terminal;
the second obtaining module is used for obtaining configuration information of the touch operation, and obtaining live special effect information matched with the touch operation from a corresponding relation between the configuration information and the live special effect information according to the configuration information, wherein the configuration information of the touch operation comprises touch positions of the touch operation, each piece of live special effect information corresponds to one touch position, and the live special effect information matched with the touch operation is one of the live special effect information selected by the user;
and the playing module is used for sending the live special effect information to a second terminal corresponding to a second user, and the second terminal plays the live special effect information in a live interface displayed on the second terminal, wherein the second user is a user watching the live broadcast of the first user.
5. The apparatus of claim 4, wherein the configuration information of the touch operation comprises one or more of a number of contact points and a touch trajectory of the touch operation.
6. The apparatus of claim 4, further comprising:
the fifth acquisition module is used for acquiring an interface starting state for triggering the playing of the live special effect information; and when the interface starting state is the starting state, the first acquisition module is used for acquiring the touch operation of a first user touching the first terminal screen.
7. A computer-readable storage medium having instructions stored therein; the instructions, when executed on a processing component, cause the processing component to perform the method of playing live effects information of any of claims 1 to 3.
CN201610658817.7A 2016-08-11 2016-08-11 Method and device for playing live special effect information Active CN106303733B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610658817.7A CN106303733B (en) 2016-08-11 2016-08-11 Method and device for playing live special effect information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610658817.7A CN106303733B (en) 2016-08-11 2016-08-11 Method and device for playing live special effect information

Publications (2)

Publication Number Publication Date
CN106303733A CN106303733A (en) 2017-01-04
CN106303733B true CN106303733B (en) 2020-06-26

Family

ID=57668462

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610658817.7A Active CN106303733B (en) 2016-08-11 2016-08-11 Method and device for playing live special effect information

Country Status (1)

Country Link
CN (1) CN106303733B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878825B (en) * 2017-01-09 2021-07-06 腾讯科技(深圳)有限公司 Live broadcast-based sound effect display method and device
CN107124658B (en) * 2017-05-02 2019-10-11 北京小米移动软件有限公司 Net cast method and device
CN107105315A (en) * 2017-05-11 2017-08-29 广州华多网络科技有限公司 Live broadcasting method, the live broadcasting method of main broadcaster's client, main broadcaster's client and equipment
CN107302716B (en) * 2017-05-15 2019-03-15 武汉斗鱼网络科技有限公司 A kind of method of live game, Zhu Boduan, direct broadcast server and client
CN107800871A (en) * 2017-09-27 2018-03-13 光锐恒宇(北京)科技有限公司 The display of special efficacy and querying method and device, terminal device and cloud server
CN108024134B (en) * 2017-11-08 2020-01-21 北京密境和风科技有限公司 Live broadcast-based data analysis method and device and terminal equipment
CN107948667B (en) * 2017-12-05 2020-06-30 广州酷狗计算机科技有限公司 Method and device for adding display special effect in live video
CN108897597B (en) * 2018-07-20 2021-07-13 广州方硅信息技术有限公司 Method and device for guiding configuration of live broadcast template
CN111246232A (en) * 2020-01-17 2020-06-05 广州华多网络科技有限公司 Live broadcast interaction method and device, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104994421A (en) * 2015-06-30 2015-10-21 广州华多网络科技有限公司 Interaction method, device and system of virtual goods in live channel
CN105828091A (en) * 2016-03-28 2016-08-03 广州华多网络科技有限公司 Method and system for video program playing in network broadcast

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160150284A1 (en) * 2014-11-20 2016-05-26 Squaredon Co Ltd Dynamic channel selection for live and previously broadcast content
CN104618797B (en) * 2015-02-06 2018-02-13 腾讯科技(北京)有限公司 Information processing method, device and client
CN105373306B (en) * 2015-10-13 2018-10-30 广州酷狗计算机科技有限公司 Virtual objects presentation method and device
CN105335050A (en) * 2015-10-30 2016-02-17 广州华多网络科技有限公司 Special effect displaying method and device
CN105487762A (en) * 2015-12-22 2016-04-13 武汉斗鱼网络科技有限公司 Method and device for triggering virtual product in live scene

Also Published As

Publication number Publication date
CN106303733A (en) 2017-01-04

Similar Documents

Publication Publication Date Title
CN106303733B (en) Method and device for playing live special effect information
US10659844B2 (en) Interaction method and system based on recommended content
CN106686396B (en) Method and system for switching live broadcast room
CN109905754B (en) Virtual gift receiving method and device and storage equipment
TWI565315B (en) Method of interactions based on video, terminal, server and system thereof
WO2016169465A1 (en) Method, apparatus and system for displaying screen information
CN111294638B (en) Method, device, terminal and storage medium for realizing video interaction
CN108920084B (en) Visual field control method and device in game
CN106803993B (en) Method and device for realizing video branch selection playing
CN106331826B (en) A kind of methods, devices and systems of setting live streaming template and video mode
CN106973330B (en) Screen live broadcasting method, device and system
CN106210755B (en) A kind of methods, devices and systems playing live video
CN106254910B (en) Method and device for recording image
CN106231433B (en) A kind of methods, devices and systems playing network video
CN107333162B (en) Method and device for playing live video
CN106375179B (en) Method and device for displaying instant communication message
CN108616771B (en) Video playing method and mobile terminal
CN108958606B (en) Split screen display method and device, storage medium and electronic equipment
CN110673770B (en) Message display method and terminal equipment
CN107276984B (en) Game live broadcast method and device and mobile terminal
CN111491197A (en) Live content display method and device and storage medium
CN108958629B (en) Split screen quitting method and device, storage medium and electronic equipment
CN106791916B (en) Method, device and system for recommending audio data
CN107566909B (en) Barrage-based video content searching method and user terminal
WO2017084289A1 (en) Method, apparatus and system for presenting information

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant