CN110677685A - Network live broadcast display method and device - Google Patents


Info

Publication number
CN110677685A
CN110677685A
Authority
CN
China
Prior art keywords
user
emotion
emotional
live
effect
Prior art date
Legal status
Granted
Application number
CN201910844058.7A
Other languages
Chinese (zh)
Other versions
CN110677685B (en)
Inventor
张振伟
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201910844058.7A
Publication of CN110677685A
Application granted
Publication of CN110677685B
Active legal status
Anticipated expiration


Classifications

    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
                    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
                        • H04N21/21 Server components or server architectures
                            • H04N21/218 Source of audio or video content, e.g. local disk arrays
                                • H04N21/2187 Live feed
                    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
                        • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                            • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
                                • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
                            • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
                                • H04N21/44213 Monitoring of end-user related data
                                    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
                        • H04N21/47 End-user applications
                            • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
                                • H04N21/4788 Supplemental services communicating with other users, e.g. chatting

Abstract

The embodiments of the present application disclose a live webcast display method and device. The live webcast display method comprises the following steps: obtaining emotion information of at least one user in a live webcast room while the user watches live content; determining an emotional expression effect corresponding to the at least one user according to the emotion information; and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect. With this method and device, the anchor can perceive the emotional changes of viewing users in real time and adjust the live content promptly, which improves the appeal of the live content and, in turn, the viewers' interest in it.

Description

Network live broadcast display method and device
Technical Field
The present application relates to the field of communication technologies, and in particular, to a live webcast display method and device.
Background
With the development of internet technology, webcasts attract more and more viewers with their novel forms and rich content. Typically, a webcast presents live content from an anchor, and users can select live content of interest to watch.
However, in existing webcasts, the anchor can only obtain user feedback on the live content by checking information such as comments and gifts sent by users. Such information does not let the anchor perceive the emotional changes of all viewers in real time, so the live content cannot be adjusted in real time, which does little to raise users' interest in the live content.
Disclosure of Invention
The present application provides a live webcast display method and device that enable an anchor to perceive the emotional changes of viewing users in real time and adjust the live content promptly, improving the appeal of the live content and, in turn, the viewers' interest in it.
In a first aspect, the present application provides a live webcast display method, including:
obtaining emotion information of at least one user in a live webcast room when watching live content;
determining an emotional expression effect corresponding to the at least one user according to the emotional information;
and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
In some embodiments of the present application, the mood information comprises mood data for each of the at least one user;
the obtaining of the emotion information of at least one user watching the live content in the live webcast room specifically includes:
and respectively taking each user in at least one user in the live webcast room as a target user, and acquiring emotion data of the target user, which is acquired by a user terminal corresponding to the target user, wherein the emotion data is face change data of the target user when the target user watches the live content.
In some embodiments of the present application, the obtaining of emotion data of the target user, which is collected by a user terminal corresponding to the target user, specifically includes:
sending an emotion acquisition request to a user terminal corresponding to the target user, so that the user terminal starts a shooting function after responding to the emotion acquisition request to shoot the face of the target user, and obtaining emotion data of the target user;
and acquiring emotion data of the target user uploaded by the user terminal.
In some embodiments of the present application, the emotional performance effect comprises an emotional numerical effect;
the determining, according to the emotional information, an emotional performance effect corresponding to the at least one user specifically includes:
determining the emotion score value of the target user according to the emotion data of the target user;
counting the total value of the emotion scores of the at least one user according to the emotion score value of the target user;
and determining the emotional numerical effect corresponding to the total emotional score value.
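The scoring steps above can be sketched roughly as follows; the 0-100 per-user scale, the function names, and the use of mouth-corner rise as the only facial measurement are illustrative assumptions, not specified by the patent:

```python
def emotion_score(emotion_data):
    """Map one user's face change data to a per-user emotion score.

    emotion_data is assumed to be a dict of facial-change measurements,
    e.g. the rise amplitude of the mouth corners in [0, 1].
    The 0-100 score scale is an illustrative assumption.
    """
    rise = emotion_data.get("mouth_corner_rise", 0)
    return min(100, max(0, int(rise * 100)))

def total_emotion_score(all_emotion_data):
    """Sum the per-user emotion scores into the room's total score."""
    return sum(emotion_score(d) for d in all_emotion_data)

# The "emotional numerical effect" could then simply render this total.
total = total_emotion_score([
    {"mouth_corner_rise": 0.8},   # a user laughing broadly
    {"mouth_corner_rise": 0.3},   # a user smiling slightly
])
print(total)  # 110
```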
In some embodiments of the present application, the emotional expression effect further comprises an overall emotional expression package of the at least one user and a background effect of the emotional numerical effect;
the determining the emotional performance effect corresponding to the at least one user according to the emotional information further includes:
determining a target score interval in which the total emotion score value is located according to a plurality of preset score intervals;
acquiring an integral emotion expression packet corresponding to the target score interval from a preset expression packet database;
and acquiring a background effect corresponding to the target score interval from a preset background effect database.
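A minimal sketch of the interval lookup described above; the score intervals, expression-pack names, and background names are invented here for illustration:

```python
# Preset score intervals, each paired with an overall expression pack and
# a background effect. All interval bounds and names are hypothetical.
SCORE_INTERVALS = [
    (0, 100, "calm_pack", "plain_background"),
    (100, 300, "happy_pack", "sparkle_background"),
    (300, float("inf"), "ecstatic_pack", "fireworks_background"),
]

def lookup_effects(total_score):
    """Find the target score interval containing the total emotion score
    and return its overall expression pack and background effect."""
    for low, high, pack, background in SCORE_INTERVALS:
        if low <= total_score < high:
            return pack, background
    raise ValueError("total score outside all preset intervals")

print(lookup_effects(110))  # ('happy_pack', 'sparkle_background')
```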
In some embodiments of the present application, the emotional expression effect comprises an emotional expression package of the target user;
the determining, according to the emotional information, an emotional performance effect corresponding to the at least one user specifically includes:
according to the emotion data of the target user, recognizing the emotion level of the target user;
and acquiring the emotion expression packet corresponding to the emotion grade from a preset emotion expression packet database.
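The per-user branch above (recognizing an emotion level, then looking it up in a preset database) might look like this sketch; the level names, thresholds, and pack file names are all hypothetical:

```python
# Preset emotion expression pack "database", keyed by emotion level.
EXPRESSION_PACKS = {
    "laughing": "pack_laughing.gif",
    "smiling": "pack_smiling.gif",
    "neutral": "pack_neutral.gif",
}

def recognize_emotion_level(emotion_data):
    """Very rough stand-in for the recognition step: bucket the
    mouth-corner rise amplitude into an emotion level."""
    rise = emotion_data.get("mouth_corner_rise", 0)
    if rise > 0.6:
        return "laughing"
    if rise > 0.2:
        return "smiling"
    return "neutral"

def expression_pack_for(emotion_data):
    """Look up the expression pack for the recognized emotion level."""
    return EXPRESSION_PACKS[recognize_emotion_level(emotion_data)]

print(expression_pack_for({"mouth_corner_rise": 0.8}))  # pack_laughing.gif
```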
In some embodiments of the present application, the emotional performance effect further comprises a user avatar of the target user;
the sending of the emotional expression effect to a anchor terminal corresponding to an anchor of the live webcast room enables the anchor terminal to display the emotional expression effect, and the sending specifically includes:
and sending the emotional expression package and the user head portrait of the target user to a main broadcast terminal corresponding to a main broadcast of the network live broadcast room, so that the main broadcast terminal correspondingly displays the emotional expression package and the user head portrait of the target user, and at least part of the emotional expression package is overlapped and displayed on the corresponding user head portrait.
In some embodiments of the present application, the method further comprises:
acquiring an emotion display request sent by the anchor terminal;
and sending the emotional expression effect to a user terminal corresponding to the target user according to the emotional display request, so that the user terminal displays the emotional expression effect.
In some embodiments of the present application, the method further comprises:
acquiring live broadcast content adjusted by the anchor according to the emotional expression effect;
and sending the adjusted live broadcast content to a user terminal corresponding to the target user, so that the user terminal displays the adjusted live broadcast content.
In a second aspect, the present application provides a live webcast display method, including:
sending an emotion acquisition instruction to a server, wherein the emotion acquisition instruction is used for instructing the server to acquire emotion information of at least one user in a live webcast room when the user watches live content, and determining an emotional expression effect corresponding to the at least one user according to the emotion information;
acquiring the emotional expression effect fed back by the server;
and displaying the emotional expression effect.
In a third aspect, the present application provides a live webcast display apparatus, including:
the information acquisition module is used for acquiring emotion information of at least one user in a live webcast room when watching live content;
the determining module is used for determining the emotional expression effect corresponding to the at least one user according to the emotional information; and the number of the first and second groups,
and the sending module is used for sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
In a fourth aspect, the present application provides a live webcast display apparatus, including:
the system comprises an instruction sending module, an emotion acquisition module and a display module, wherein the instruction sending module is used for sending an emotion acquisition instruction to a server, the emotion acquisition instruction is used for instructing the server to acquire emotion information of at least one user in a live webcast room when the user watches live content, and the emotion expression effect corresponding to the at least one user is determined according to the emotion information;
the acquisition module is used for acquiring the emotional expression effect fed back by the server;
and the display module is used for displaying the emotional expression effect.
According to the embodiments of the present application, emotion information of at least one user in a live webcast room is obtained while the user watches live content, an emotional expression effect corresponding to the viewing user is determined according to the emotion information, and the emotional expression effect is sent for display to the anchor terminal corresponding to the anchor of the live webcast room. The anchor can thus perceive the emotional changes of viewing users in real time from the emotional expression effect displayed on the anchor terminal and adjust the live content promptly. This increases the interactivity between the anchor and the viewing users, improves the appeal of the live content, and in turn raises the viewers' interest in the live content.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of a scene of a live webcast display system provided in an embodiment of the present application;
fig. 2 is a flowchart illustrating an embodiment of a live webcast display method provided in an embodiment of the present application;
FIG. 3 is a diagram illustrating an example of an interface of a user terminal according to an embodiment of the present application;
FIG. 4 is a schematic diagram of interaction between a user and a corresponding user terminal in an embodiment of the present application;
fig. 5 is a diagram of a first example interface of an anchor terminal in an embodiment of the present application;
fig. 6 is a diagram of a second example interface of an anchor terminal in an embodiment of the present application;
fig. 7 is a diagram of a third example interface of the anchor terminal in the embodiment of the present application;
fig. 8 is a diagram of a fourth example interface of the anchor terminal in an embodiment of the present application;
fig. 9 is a diagram of a fifth example interface of the anchor terminal in the embodiment of the present application;
fig. 10 is a diagram of an example of a sixth interface of a anchor terminal in an embodiment of the present application;
fig. 11 is a schematic specific flowchart of a live webcast display method provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of an embodiment of a live webcast display apparatus provided in an embodiment of the present application;
fig. 13 is a schematic view of another scene of a live webcast display system provided in an embodiment of the present application;
fig. 14 is a flowchart illustrating another embodiment of a live webcast display method provided in an embodiment of the present application;
fig. 15 is a schematic structural diagram of another embodiment of a live webcast display apparatus provided in an embodiment of the present application;
fig. 16 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description that follows, specific embodiments of the present application will be described with reference to steps and symbols executed by one or more computers, unless otherwise indicated. Accordingly, these steps and operations will be referred to, several times, as being performed by a computer, the computer performing operations involving a processing unit of the computer in electronic signals representing data in a structured form. This operation transforms the data or maintains it at locations in the computer's memory system, which may be reconfigured or otherwise altered in a manner well known to those skilled in the art. The data maintains a data structure that is a physical location of the memory that has particular characteristics defined by the data format. However, while the principles of the application have been described in language specific to above, it is not intended to be limited to the specific form set forth herein, and it will be recognized by those of ordinary skill in the art that various of the steps and operations described below may be implemented in hardware.
The term "module" or "unit" as used herein may be considered a software object executing on the computing system. The various components, modules, engines, and services described herein may be viewed as objects implemented on the computing system. The apparatus and method described herein are preferably implemented in software, but may also be implemented in hardware, and are within the scope of the present application.
As used herein, the singular forms "a", "an", "the" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
The embodiment of the application provides a live network display method and device.
Referring to fig. 1, fig. 1 is a schematic view of a live webcast display system according to an embodiment of the present disclosure. The live webcast display system may include a user terminal 100, an anchor terminal 200, and a server 300, where the user terminal 100 and the anchor terminal 200 are each connected to the server 300 through a network, and a live webcast display apparatus is integrated in the server 300. In the embodiment of the present application, the server 300 is mainly used for obtaining emotion information of at least one user in a live webcast room while the user watches live content; determining an emotional expression effect corresponding to the at least one user according to the emotion information; and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
In this embodiment, the server 300 may be an independent server, or a server network or server cluster composed of servers. For example, the server 300 described in this embodiment includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud server composed of multiple servers, where a cloud server is constituted by a large number of computers or web servers based on cloud computing. In the embodiment of the present application, the server 300, the user terminal 100, and the anchor terminal 200 may communicate through any communication manner, including but not limited to mobile communication based on the 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), or Worldwide Interoperability for Microwave Access (WiMAX), or computer network communication based on the TCP/IP protocol suite or the User Datagram Protocol (UDP).
It is to be understood that the user terminal 100 and the anchor terminal 200 used in the embodiments of the present application may be understood as client devices, which include both devices of receiving and transmitting hardware, i.e., devices having receiving and transmitting hardware capable of performing two-way communication over a two-way communication link. Such a client device may include: a cellular or other communication device having a single line display or a multi-line display or a cellular or other communication device without a multi-line display. The user terminal 100 and the anchor terminal 200 may be specifically desktop terminals or mobile terminals, and the mobile terminals may be specifically one of a mobile phone, a tablet computer, a notebook computer, and the like.
Those skilled in the art will understand that the application environment shown in fig. 1 is only one application scenario related to the present embodiment and does not limit the application scenarios of this embodiment. Other application environments may include more or fewer servers than shown in fig. 1, or a different server network connection relationship. For example, fig. 1 shows only one server, one user terminal, and one anchor terminal; it is understood that the live webcast display system may further include one or more other servers, and/or one or more user terminals and anchor terminals connected to the server network, which is not limited herein.
In some embodiments of the present application, a live platform, such as Douyu, Huajiao, or a similar platform, may be loaded in the server 300. The live platform can provide live webcast rooms, and any user can create a live webcast room through an account on the live platform and become an anchor. The terminal corresponding to the anchor is the anchor terminal; the anchor terminal can access the live platform, and the anchor can publish live content in the created live webcast room through the anchor terminal. The terminal corresponding to a viewing user is a user terminal; the user terminal can access the live platform, and the user can enter a live webcast room through the user terminal to watch the live content published by the anchor.
In addition, as shown in fig. 1, the live webcast display system may further include a memory 400 for storing data, such as a database for storing object data. The object data may include application templates (e.g., various templates such as an approval template and a check-in template), document data (e.g., documents in various formats such as Word, Excel, and PPT), picture data (e.g., pictures in formats such as jpg, png, and bmp), and video data; correspondingly, the database may also be divided into multiple types, such as an application database, a file database, a picture database, and a video database.
It should be noted that the scene schematic diagram of the live webcast display system shown in fig. 1 is merely an example, and the live webcast display system and the scene described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not form a limitation on the technical solution provided in the embodiment of the present application.
The following is a detailed description of specific embodiments.
In the present embodiment, the description is given from the perspective of a live webcast display apparatus, which may be specifically integrated in the server 300.
The present application provides a live webcast display method, which comprises the following steps: obtaining emotion information of at least one user in a live webcast room while the user watches live content; determining an emotional expression effect corresponding to the at least one user according to the emotion information; and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
Please refer to fig. 2, which is a flowchart illustrating an embodiment of a live webcast display method in an embodiment of the present application, where the live webcast display method includes:
201. Obtain emotion information of at least one user in the live webcast room while the user watches the live content.
In the embodiments of the present application, a webcast refers to producing and publishing information synchronously, on site, as an event occurs and unfolds; it is a network publishing mode with a bidirectional flow of information. The live webcast room is a platform for publishing live content, where the live content mainly refers to the audio and video content collected by the anchor terminal. The anchor terminal is the terminal corresponding to the anchor, and the anchor is the person who publishes the live content. The anchor shoots and publishes the live content through the anchor terminal, and the anchor terminal can also play the live content so that the anchor can check it in real time and adjust the shooting angle in real time. A user is a viewer watching the live content; the terminal corresponding to the user is a user terminal, which plays the live content so that the user can watch it, and through which the user can discuss the live content, interact with the anchor, and so on. Webcasts promote live content effectively by exploiting the characteristics of the internet: intuitive, fast, expressive, rich in content, highly interactive, unrestricted by region, and able to reach segmented audiences.
The server is loaded with a live platform. The anchor and the users each have a unique account on the live platform; that is, they need to register before logging in, and registration generally requires uploading related information such as a user name, a password, gender, age, and a user avatar. The registration information is stored in a registration database.
The live platform can provide multiple live webcast rooms. An anchor can create a live webcast room on the live platform through his or her account, and each anchor can create only one live webcast room. For each broadcast, the anchor can publish different types of live content in the live webcast room through the anchor terminal, such as singing, eating shows, or commentary, and can also adjust the live content in real time through the anchor terminal during the broadcast. To meet the viewing needs of different users, different live webcast rooms publish different types of live content; users can enter a live webcast room of interest through their user terminals to watch the corresponding type of live content, and different users can enter the same live webcast room to watch the same live content.
While watching the live content, a user can send messages, tip the anchor, and so on according to the live content, and the messages, gifts, and other information sent by the user can be displayed in the live content in real time, realizing interaction between the user and the anchor. In the embodiments of the present application, at least one user watches the live content in a live webcast room, and the anchor of the live webcast room can further interact with the viewing users by learning their emotional states. The emotional state can be identified by obtaining emotion information of the at least one user, where the emotion information includes emotion data of each of the at least one user. Step 201 of obtaining the emotion information of at least one user watching the live content in the live webcast room specifically includes: taking each of the at least one user in the live webcast room as a target user, and obtaining the emotion data of the target user collected by the user terminal corresponding to the target user. The emotion data is face change data of the target user while watching the live content.
It should be noted that, while watching the live content, the user may show different facial expressions as the live content changes; that is, the user's facial data may change. The user's emotion data can therefore be obtained by collecting this face change data. For example, when seeing amusing live content a user may laugh, and laughing causes changes in the user's facial muscles, such as the mouth corners rising; the emotion data is then acquired by collecting face change data such as the rise amplitude of the mouth corners.
The user terminal may have a shooting function, for example, the user terminal may include a camera, and the user face is shot in real time by the camera of the user terminal to collect the change data of the user face, so as to obtain the emotion data of the user. Specifically, the acquiring of the emotion data of the target user, which is acquired by the user terminal corresponding to the target user, includes: sending an emotion acquisition request to a user terminal corresponding to the target user, so that the user terminal starts a shooting function after responding to the emotion acquisition request to shoot the face of the target user, and obtaining emotion data of the target user; and acquiring emotion data of the target user uploaded by the user terminal.
The anchor terminal can be provided with an emotion acquisition function button. When the anchor needs to know the emotion changes of a target user, the anchor clicks the emotion acquisition function button, the anchor terminal sends an emotion acquisition instruction to the server, and the server sends an emotion acquisition request to the user terminal corresponding to the target user according to the instruction. After receiving the emotion acquisition request, the user terminal corresponding to the target user pops up a prompt box over the live content it is playing; as shown in fig. 3, the content of the prompt box may be "allow the camera to identify your emotion", prompting the target user to decide whether to allow the user terminal to collect their emotion data. The prompt box is provided with two function buttons, "reject" and "allow". If the user clicks the "reject" function button, the user does not allow the user terminal to collect their emotion data, and the user terminal discards the emotion acquisition request sent by the server. If the user clicks the "allow" function button, the user allows the user terminal to collect their emotion data, that is, the user allows the camera to start the emotion acquisition function; as shown in fig. 4, the user terminal responds to the emotion acquisition request sent by the server, starts the shooting function, and collects the face change data of the user in real time through the camera to obtain the emotion data of the user.
Specifically, when collecting the emotion data of the user, the user terminal shoots the user's face through the camera. After acquiring the captured face image, the user terminal locates key parts in the face image; the key parts may be the facial features, such as the left and right mouth corners, the nose tip, the left and right eyebrows, and the left and right eyes. After the key parts are located, changes in the pixel positions of the image regions where the key parts are located are analyzed, and this change data is used as the face change data of the user, that is, the emotion data. After obtaining the emotion data of the user, the user terminal sends the emotion data to the server, and the server aggregates the emotion data sent by the user terminal corresponding to each user in the live webcast room, thereby obtaining the emotion information of the at least one user in the live webcast room.
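The key-part analysis above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the landmark names and pixel coordinates are hypothetical, and a real terminal would obtain them from a face landmark detector running on camera frames.

```python
# Derive "emotion data" as per-landmark pixel displacement between two
# frames. Landmark names and coordinates are illustrative assumptions.

def landmark_deltas(prev, curr):
    """Return per-landmark (dx, dy) displacement between two frames."""
    return {name: (curr[name][0] - prev[name][0],
                   curr[name][1] - prev[name][1])
            for name in prev}

prev_frame = {"left_mouth_corner": (120, 210), "right_mouth_corner": (180, 210)}
curr_frame = {"left_mouth_corner": (118, 204), "right_mouth_corner": (182, 203)}

deltas = landmark_deltas(prev_frame, curr_frame)
# Image y-axis points down, so a negative dy means the mouth corner moved
# up, which the text associates with a smile.
print(deltas["left_mouth_corner"])  # (-2, -6)
```

The resulting displacement dictionary is the sort of face change data the terminal would upload to the server.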
202. And determining the emotional expression effect corresponding to the at least one user according to the emotional information.
In the embodiment of the application, after the server acquires the emotion information, the emotion state of each user in the live webcast room can be identified by combining a machine learning algorithm, and the corresponding emotional expression effect of at least one user in the live webcast room is determined according to the identified emotion state. The emotional expression effect refers to the display effect of the emotional state on the anchor terminal.
The emotional state can be divided into a plurality of emotion levels in advance based on common expressions; for example, it can be divided into 5 emotion levels based on the degree of the user's happiness, namely laughing, very happy, smiling, normal, and not happy. The emotional state may further include other emotion levels, such as boredom, sadness, and anger; the emotion levels may be set according to actual needs, which is not limited herein.
After a plurality of emotion levels are set, an emotion recognition model can be constructed and a large number of face image samples collected; the face image samples should cover different facial expressions, that is, the collected samples cover the different emotion levels. The server extracts emotion data from the face image samples in the same way the user terminal does, and labels the extracted emotion data with the corresponding emotion levels. The server then inputs the labeled emotion data into the constructed emotion recognition model to train it. The trained emotion recognition model can be used to recognize the emotional state of a user.
Specifically, after obtaining the emotion information, the server respectively inputs emotion data of each user in the emotion information into a trained emotion recognition model, and obtains an emotion grade of each user in the live webcast room. And further, according to the emotion level of each user in the live webcast room, determining the emotional expression effect corresponding to at least one user in the live webcast room.
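The train-then-recognize flow of the two paragraphs above can be sketched as follows. The patent does not specify a model architecture, so a toy nearest-centroid classifier over a single hypothetical feature (upward mouth-corner displacement in pixels) stands in for the trained emotion recognition model; all feature values and labels are illustrative assumptions.

```python
# Toy stand-in for the emotion recognition model: nearest-centroid
# classification over one hypothetical feature.

def train_centroids(samples):
    """samples: list of (feature, grade) pairs -> {grade: mean feature}."""
    sums = {}
    for feat, grade in samples:
        total, count = sums.get(grade, (0.0, 0))
        sums[grade] = (total + feat, count + 1)
    return {g: total / count for g, (total, count) in sums.items()}

def predict_grade(centroids, feature):
    """Return the grade whose centroid is nearest to the feature value."""
    return min(centroids, key=lambda g: abs(centroids[g] - feature))

# Labeled emotion data extracted from face image samples (hypothetical).
labelled = [(0.5, "normal"), (1.0, "normal"),
            (4.0, "smiling"), (5.0, "smiling"),
            (9.0, "laughing"), (11.0, "laughing")]
model = train_centroids(labelled)
print(predict_grade(model, 4.6))  # smiling
```

A production system would replace this with a learned classifier over richer landmark features, but the interface (emotion data in, emotion level out) matches the text.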
It should be noted that the emotional expression effects may include multiple types, a type selection function button for the emotional expression effect may be set on the anchor terminal, and the anchor may select different types of emotional expression effects through the type selection function button. The different types of emotional performance effects are determined in different ways, as illustrated by the following examples:
(1) the emotional expression effect is the overall emotional expression effect of all users in the live webcast room, and the overall emotional expression effect is determined according to the emotional data of all the users.
The overall emotional expression effect also has a plurality of expression forms, such as emotional numerical effects, overall emotional expression packages of all users, and the like. After the anchor selects the type of emotional expression effect, the presentation form of the emotional expression effect may be further selected. The overall emotional performance of different manifestations is determined differently.
For example, when the overall emotional performance effect is an emotional numerical effect, the determining, according to the emotional information, the emotional performance effect corresponding to the at least one user in step 202 specifically includes: respectively taking each user of the at least one user as a target user, and determining the emotion score value of the target user according to the emotion data of the target user; counting the total value of the emotion scores of the at least one user according to the emotion score value of the target user; and determining the emotional numerical effect corresponding to the total emotional score value.
Specifically, after the emotion data of each user is obtained, the emotion level of each user may be identified, and different emotion levels may be preset with different level scores. For example, if the emotion levels include laughing, very happy, smiling, normal, and not happy, the corresponding level scores may be 5, 4, 3, 2, and 1 respectively. When the emotion level of the target user is recognized as very happy, the level score corresponding to very happy, namely 4, is obtained; this level score is the emotion score value of the target user, which is therefore 4.
After the emotion score value of each user is obtained, the total emotion score value of all users is counted. The statistical method may adopt averaging, that is, adding the emotion score values of all users and dividing by the number of users. The average value may be used directly as the total emotion score value of all users, or may first be converted to a percentile scale: since the level scores lie between 1 and 5, the average also lies between 1 and 5, so multiplying the average by 20 converts it to a percentile scale, and the rounded result is used as the total emotion score value of all users.
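The averaging-then-percentile statistic described above can be written out directly. The grade-to-score mapping follows the example in the text (laughing = 5 down to not happy = 1); the list of user grades is hypothetical.

```python
# Total emotion score: average the per-user level scores, then rescale
# the 1-5 average to a percentile scale by multiplying by 20.

GRADE_SCORE = {"laughing": 5, "very happy": 4, "smiling": 3,
               "normal": 2, "not happy": 1}

def total_emotion_score(user_grades):
    """Average the per-user scores and convert 1-5 to a 20-100 scale."""
    scores = [GRADE_SCORE[g] for g in user_grades]
    average = sum(scores) / len(scores)
    return round(average * 20)

print(total_emotion_score(["laughing", "very happy", "smiling", "very happy"]))  # 80
```

Note that with this rescaling the total always lies in 20-100 rather than 1-100; the score intervals below are an independent design choice.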
The emotional numerical effect means that the overall emotional expression effect is presented as the total emotion score value, and the presentation of the total emotion score value can be specifically set, for example its font, font size, and color. The presentation may differ according to the magnitude of the value: different score intervals are preset, and the fonts, font sizes, and colors corresponding to different score intervals differ. The target score interval in which the total emotion score value falls is determined, and the emotional numerical effect corresponding to the total emotion score value is then determined from the font, font size, and color corresponding to that target score interval.
In addition, the emotional numerical effect may further set a background effect, that is, the overall emotional performance effect may include an emotional numerical effect and a background effect thereof, and the determining, according to the emotional information, the emotional performance effect corresponding to the at least one user in step 202 further includes: determining a target score interval in which the total emotion score value is located according to a plurality of preset score intervals; and acquiring a background effect corresponding to the target score interval from a preset background effect database.
Specifically, the numerical range in which the total emotion score value may fall is obtained, and the range is divided into a plurality of score intervals. For example, if the range is 1 to 100, it may be divided into 4 score intervals: 1-25, 26-50, 51-75, and 76-100. Corresponding background effects are then set for the divided score intervals; one score interval may correspond to one background effect, and the background colors, background shapes, and the like of different background effects may differ. For example, the background color corresponding to the score interval 1-25 is gray, that for 26-50 is purple, that for 51-75 is pink, and that for 76-100 is red. Each score interval and its corresponding background effect are pre-stored in a background effect database, which may be a database stored in the memory 400 shown in fig. 1.
After obtaining the total value of the emotion scores, determining a score interval where the total value of the emotion scores is located, and then determining the background effect. For example, if the total emotion score value is 95, determining that a score interval corresponding to the total emotion score value is 76-100, further querying a background effect database, obtaining that a background effect corresponding to the score interval 76-100 is a red background, and taking 95 and the red background as a final overall emotion expression effect.
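The interval lookup in the example above can be sketched with a sorted list of interval upper bounds. The four bands and their colors are the ones from the text; the use of `bisect` is an implementation choice, not part of the patent.

```python
import bisect

# Upper bounds of the four score intervals and their background colors,
# taken from the example in the text (1-25, 26-50, 51-75, 76-100).
BANDS = [25, 50, 75, 100]
BACKGROUNDS = ["gray", "purple", "pink", "red"]

def background_for(score):
    """Map a 1-100 total emotion score to its background color."""
    return BACKGROUNDS[bisect.bisect_left(BANDS, score)]

print(background_for(95))  # red
print(background_for(35))  # purple
```

`bisect_left` returns the index of the first band whose upper bound is at least the score, so a total of 95 falls in the 76-100 band and maps to the red background, matching the worked example.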
For example, when the overall emotional performance effect is an overall expression package of all users in the webcast room, the determining, according to the emotional information, the emotional performance effect corresponding to the at least one user in step 202 specifically includes: determining a target score interval in which the total emotion score value is located according to a plurality of preset score intervals; and acquiring the whole emotion expression packet corresponding to the target score interval from a preset expression packet database.
Specifically, the numerical range in which the total emotion score value may fall is obtained and divided into a plurality of score intervals, for example 4 score intervals: 1-25, 26-50, 51-75, and 76-100. Corresponding overall emotional expression packages are set for the divided score intervals, and one score interval may correspond to one overall emotional expression package. For example, the overall emotional expression package corresponding to the score interval 1-25 is a not-happy expression package, that for 26-50 is a bored expression package, that for 51-75 is a smiling expression package, and that for 76-100 is a happy expression package. The style of the expression packages can be set freely as long as different emotions can be distinguished. Each score interval and its corresponding overall emotional expression package are pre-stored in an expression package database, which may be a database stored in the memory 400 shown in fig. 1.
After the total emotion score value is obtained, the score interval where the total emotion score value is located is determined, and then the whole emotion expression package is determined. For example, if the total emotion score value is 95, determining that a score interval corresponding to the total emotion score value is 76-100, further querying an expression packet database, acquiring that the whole emotion expression packet corresponding to the score interval of 76-100 is a happy expression packet, and taking the happy expression packet as a final whole emotion expression effect.
It should be noted that the above embodiment only takes the emotional numerical effect with its background effect, or the overall emotional expression package, as examples of the overall emotional expression effect. In practical application, the emotional numerical effect, its background effect, and the overall emotional expression package can each be used alone as the overall emotional expression effect, or used together in any combination. In addition, overall emotional expression effects of other forms can be set, as long as the overall emotion of all users in the live webcast room can be expressed, and details are not repeated here.
(2) The emotional expression effect is the emotional expression effect of each user in the live webcast room, and the emotional expression effect of each user is determined according to the emotional data of each user.
The emotional expression effect of each user also has a plurality of expression forms, such as an emotional expression package of each user, a combination of a user head portrait and the emotional expression package, and the like. After the anchor selects the type of emotional expression effect, the presentation form of the emotional expression effect may be further selected. The manner in which the emotional performance effects of different manifestations are determined also varies.
For example, when the emotional expression effect of each user is an emotional expression package, the determining the emotional expression effect corresponding to the at least one user according to the emotional information in step 202 specifically includes: respectively taking each user in at least one user in a live webcast room as a target user, and identifying the emotion level of the target user according to the emotion data of the target user; and acquiring the emotion expression packet corresponding to the emotion grade from a preset emotion expression packet database.
Specifically, after the emotion data of each user is obtained, the emotion level of each user can be identified, and different emotion levels can be preset with different emotional expression packages, with each emotion level corresponding to the emotion in its expression package. For example, if the emotion levels are divided into happy, smiling, bored, and not happy, the correspondingly set expression packages are a happy expression package, a smiling expression package, a bored expression package, and a not-happy expression package. The style of the expression packages can be set freely as long as different emotions can be distinguished. Each emotion level and its corresponding emotional expression package are pre-stored in an emotional expression package database, which may be a database stored in the memory 400 shown in fig. 1.
For example, when the emotional expression effect of each user is the combination of the user head portrait and the emotional expression package, the emotional expression package of each user can be obtained through the method, meanwhile, the corresponding user head portrait is obtained from the registration database according to the account number of each user, and the user head portrait and the emotional expression package are used as the final emotional expression effect.
It should be noted that the above embodiment only takes the emotional expression package or the combination of the avatar of the user and the emotional expression package as the emotional expression effect of each user. In practical applications, the emotional expression effect of each user may also have other expression forms as long as the emotion of each user in the webcast room can be expressed, and details are not described herein.
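Assembling the per-user effect described above amounts to two table lookups. This sketch assumes in-memory dictionaries standing in for the emotional expression package database and the registration database; the table contents, account names, and file names are hypothetical.

```python
# Per-user emotional expression effect: expression package chosen by
# emotion level, avatar fetched by account. Both tables are stand-ins
# for the databases described in the text.

EMOTICON_DB = {"happy": "happy_pack", "smiling": "smile_pack",
               "bored": "bored_pack", "not happy": "unhappy_pack"}
AVATAR_DB = {"user_a": "avatar_a.png", "user_b": "avatar_b.png"}

def per_user_effect(account, grade):
    """Combine the user's avatar with the expression package for their grade."""
    return {"avatar": AVATAR_DB[account],
            "emoticon": EMOTICON_DB[grade]}

print(per_user_effect("user_a", "happy"))
# {'avatar': 'avatar_a.png', 'emoticon': 'happy_pack'}
```

The server would send such a record to the anchor terminal, which renders the expression package at least partially overlaid on the avatar.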
203. And sending the emotional expression effect to a main broadcasting terminal corresponding to a main broadcasting of the network live broadcasting room, so that the main broadcasting terminal displays the emotional expression effect.
In the embodiment of the application, the server sends the emotional expression effect to the anchor terminal, and the anchor terminal loads and displays the emotional expression effect. The different types of emotional performance effects are displayed in different manners, wherein the display manners include display positions, display tracks and the like.
For example, when the emotional expression effect is the overall emotional expression effect of all users in the live webcast room, the overall emotional expression effect is displayed in the upper right corner of the live content, as shown in figs. 5 to 8. The overall emotional expression effect comprises the emotional numerical effect, its background effect, and the overall emotional expression package of all users; the emotional numerical effect and the overall emotional expression package are displayed side by side on the background effect. In addition, the label "audience emotion" can be added to the left of the emotional numerical effect, so that the anchor knows the meaning of the displayed value.
For example, when the emotional expression effect is the emotional expression effect of each user in the live webcast room, the emotional expression effect of each user is displayed on the right side of the live content, as shown in figs. 9 and 10. Specifically, when the emotional expression effect includes the emotional expression package of each user, as shown in fig. 9, the emotional expression packages may be displayed in sequence at the lower right corner of the live content and dynamically moved upward, disappearing after rising to a certain height; that is, an expression package is no longer displayed in the live content once it has moved past that height, so that too many expression packages on the screen do not block the live content.
When the emotional expression effect includes combination of the user avatar of each user and the emotional expression package, the sending of the emotional expression effect to the anchor terminal corresponding to the anchor in the live webcast room in step 203 causes the anchor terminal to display the emotional expression effect specifically includes: and sending the emotional expression package and the user head portrait of the target user to a main broadcast terminal corresponding to a main broadcast of the network live broadcast room, so that the main broadcast terminal correspondingly displays the emotional expression package and the user head portrait of the target user, and at least part of the emotional expression package is overlapped and displayed on the corresponding user head portrait.
The emotional expression effect can be displayed on the right side of the live content, and the emotional expression effects of the users in the live webcast room can be displayed in turn by setting the number of users displayed at one time and the display duration. The number and duration can be adjusted according to the number of users who have enabled the emotion acquisition function in the live webcast room, the recognition efficiency of the users' emotional states, and the like. As shown in fig. 10, the number of users displayed at one time is 4 and the duration is set to 2 seconds. The display area of the user avatar may be larger than that of the emotional expression package, so that the anchor can identify the user to whom the avatar corresponds. The emotional expression package can be partially overlaid on the user avatar, as shown in fig. 10, fully overlaid on it, or displayed side by side with it, so that the anchor can intuitively grasp the current emotional state of each user.
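Rotating through viewers in fixed-size groups, as described above, can be sketched as simple batching. The batch size of 4 is the example value from the text; the per-batch duration would be handled by the terminal's rendering timer and is omitted here.

```python
# Split the per-user effects into fixed-size batches to be shown in turn
# (e.g. 4 users at a time, each batch held for 2 seconds by the UI).

def display_batches(effects, batch_size=4):
    """Yield successive groups of per-user effects to show together."""
    for i in range(0, len(effects), batch_size):
        yield effects[i:i + batch_size]

users = [f"user_{n}" for n in range(10)]
batches = list(display_batches(users))
print(len(batches))  # 3
print(batches[0])    # ['user_0', 'user_1', 'user_2', 'user_3']
```

Tuning `batch_size` (and the hold duration) against the number of consenting users and recognition throughput corresponds to the adjustment described in the text.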
The anchor can know the whole emotional state or most of the emotional states of the user according to the emotional expression effect displayed on the anchor terminal, and if the whole emotional state or most of the emotional states of the user do not reach the expectation of the anchor, the anchor can timely adjust the live broadcast content. Specifically, the method further comprises: acquiring live broadcast content adjusted by the anchor according to the emotional expression effect; and sending the adjusted live broadcast content to a user terminal corresponding to the target user, so that the user terminal displays the adjusted live broadcast content.
For example, the anchor's current live content is singing, but during the live singing the emotional expression effect displayed on the anchor terminal is "audience emotion 35" together with a bored expression package, as shown in fig. 7. The anchor learns from the displayed emotional expression effect that the overall emotional state of the users is bored, that is, the users are not interested in the singing; the anchor can then stop singing and adjust the live content to dancing or the like. The adjusted live content is displayed on the user terminals in real time, and the anchor can choose to keep adjusting the live content according to the users' emotional state while they watch it, until the users' emotional state meets the anchor's expectations.
In addition, the anchor can also select whether to enable users in the live webcast room to watch the emotional expression effect through the anchor terminal. Specifically, the method further comprises: acquiring an emotion display request sent by the anchor terminal; and sending the emotional expression effect to a user terminal corresponding to the target user according to the emotional display request, so that the user terminal displays the emotional expression effect.
It should be noted that the anchor terminal is provided with an emotion display function button, and the anchor clicks the emotion display function button to open the emotion display function. And the anchor terminal sends an emotion display request to the server, and the server acquires the user terminal corresponding to each user in the live webcast room according to the emotion display request so as to send the emotion expression effect to the user terminal corresponding to each user for display. The emotional expression effect displayed by the user terminal is the same as the emotional expression effect displayed by the anchor terminal.
To sum up, according to the embodiment of the application, the emotion information of at least one user watching the live content in the live webcast room is obtained, the emotional expression effect corresponding to the watching users is determined according to the emotion information, and the emotional expression effect is sent to the anchor terminal corresponding to the anchor of the live webcast room for display. The anchor thus obtains the emotion changes of the watching users in real time from the emotional expression effect displayed on the anchor terminal and can adjust the live content in time, which increases the interactivity between the anchor and the watching users, improves the appeal of the live content, and further increases the watching users' interest in it.
The live webcast display method in the embodiment of the present application is described below with reference to a specific application scenario.
Please refer to fig. 11, which is a flowchart illustrating a live webcast display method according to another embodiment of the present application, where the live webcast display method is applied to a server, and the live webcast display method includes:
111. and acquiring emotion data of a user A in the live webcast room when watching the live content.
Since user A has allowed the emotion acquisition function to be enabled, the user terminal corresponding to user A starts the shooting function to shoot user A's face and obtains user A's face change data, which is the emotion data, for example, user A's mouth corners moving upward.
112. And determining the emotion grade of the user A according to the emotion data of the user A.
For example, the emotion data of user A is position change data in which the mouth corners move upward. The position change data is input into the pre-trained emotion recognition model, the corresponding emotion level output is happy, and the emotion level of user A can therefore be determined as happy.
113. And acquiring the emotion expression packet of the user A from a preset emotion expression packet database according to the emotion grade of the user A.
For example, if the emotion level of the user a is happy, the happy emotion expression packet corresponding to the happy emotion is inquired in the emotional expression packet database, so that the emotional expression packet of the user a is determined to be the happy expression packet.
114. And sending the emotional expression package of the user A to a anchor terminal corresponding to an anchor B of the network live broadcast room, so that the anchor terminal displays the emotional expression package of the user A in the live broadcast content.
For example, if the emotional expression package of user A is a happy expression package, the happy expression package is sent to the anchor terminal, and the anchor terminal displays it in the lower right corner of the live content. The happy expression package can also move from the lower right corner toward the upper right corner of the live content, disappearing after moving to a certain position or for a certain time. The anchor in the embodiment of the application can judge how interesting the current live content is from the users' emotional expression packages, so as to adjust the live content in time.
In order to better implement the live webcast display method provided by the embodiment of the present application, an embodiment of the present application further provides a device based on the live webcast display method. The meaning of the noun is the same as that in the above live network display method, and specific implementation details can refer to the description in the method embodiment.
Referring to fig. 12, fig. 12 is a schematic structural diagram of a live webcast display apparatus according to an embodiment of the present disclosure, where the live webcast display apparatus may include an information obtaining module 121, a determining module 122, and a sending module 123, where:
the information acquisition module 121 is configured to acquire emotion information of at least one user in the live webcast room when the user watches live content;
a determining module 122, configured to determine, according to the emotional information, an emotional performance effect corresponding to the at least one user; and the number of the first and second groups,
a sending module 123, configured to send the emotional performance effect to a anchor terminal corresponding to an anchor in the live webcast room, so that the anchor terminal displays the emotional performance effect.
In some embodiments of the present application, the emotion information includes emotion data of each user of the at least one user, and the information obtaining module 121 is specifically configured to:
and respectively taking each user in at least one user in the live webcast room as a target user, and acquiring emotion data of the target user, which is acquired by a user terminal corresponding to the target user, wherein the emotion data is face change data of the target user when the target user watches the live content.
In some embodiments of the present application, the information obtaining module 121 is further configured to:
sending an emotion acquisition request to a user terminal corresponding to the target user, so that the user terminal starts a shooting function after responding to the emotion acquisition request to shoot the face of the target user, and obtaining emotion data of the target user;
and acquiring emotion data of the target user uploaded by the user terminal.
In some embodiments of the present application, the emotional performance effect includes an emotional numerical effect, and the determining module 122 is specifically configured to:
determining the emotion score value of the target user according to the emotion data of the target user;
counting the total value of the emotion scores of the at least one user according to the emotion score value of the target user;
and determining the emotional numerical effect corresponding to the total emotional score value.
In some embodiments of the present application, the emotional expression effect further comprises an overall emotional expression package of the at least one user and a background effect of the emotional numerical effect, and the determining module 122 is further configured to:
determining a target score interval in which the total emotion score value is located according to a plurality of preset score intervals;
acquiring an integral emotion expression packet corresponding to the target score interval from a preset expression packet database;
and acquiring a background effect corresponding to the target score interval from a preset background effect database.
In some embodiments of the present application, the emotional performance effect includes an emotional expression package of the target user, and the determining module 122 is further configured to:
identifying the emotion level of the target user according to the emotion data of the target user;
and acquiring the emotional expression package corresponding to the emotion level from a preset emotional expression package database.
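The per-user level grading and package lookup might be sketched as below; the grading rule (peak value bucketed into thirds) and the package names stand in for the recognition model and the preset database, neither of which the patent specifies.

```python
def emotion_level(emotion_data):
    """Grade one user's emotion data into a discrete level; bucketing the
    peak value into thirds is only a placeholder rule."""
    peak = max(emotion_data)
    if peak < 1 / 3:
        return "low"
    if peak < 2 / 3:
        return "medium"
    return "high"

# Per-level expression packages standing in for the preset database.
PACK_DB = {"low": "neutral_pack", "medium": "smile_pack", "high": "laugh_pack"}

def pack_for_user(emotion_data):
    """Fetch the expression package matching the user's emotion level."""
    return PACK_DB[emotion_level(emotion_data)]

print(pack_for_user([0.1, 0.5, 0.4]))  # peak 0.5 → medium → smile_pack
```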
In some embodiments of the present application, the emotional performance effect further includes a user avatar of the target user, and the sending module 123 is specifically configured to:
and sending the emotional expression package and the user avatar of the target user to the anchor terminal corresponding to the anchor of the live webcast room, so that the anchor terminal displays the emotional expression package and the user avatar of the target user correspondingly, with the emotional expression package displayed at least partially overlaid on the corresponding user avatar.
In some embodiments of the present application, the apparatus further comprises a display module, the display module specifically configured to:
acquiring an emotion display request sent by the anchor terminal;
and sending the emotional expression effect to a user terminal corresponding to the target user according to the emotional display request, so that the user terminal displays the emotional expression effect.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
According to the embodiment of the application, emotion information of at least one user in the live webcast room watching the live content is obtained, the emotional expression effect corresponding to the watching users is determined according to the emotion information, and the emotional expression effect is sent to the anchor terminal corresponding to the anchor of the live webcast room for display. The anchor can thus learn the emotional changes of the watching users in real time from the emotional expression effect displayed on the anchor terminal and adjust the live content in time, which increases the interactivity between the anchor and the watching users, makes the live content more engaging, and raises the watching users' interest in the live content.
The embodiment of the application also provides a live network display method and device.
Referring to fig. 13, fig. 13 is a schematic view of a scene of a live webcast display system according to an embodiment of the present disclosure. The live webcast display system may include a user terminal 131, an anchor terminal 132, and a server 133, where the user terminal 131 and the anchor terminal 132 are each connected to the server 133 through a network, and a live webcast display apparatus is integrated in the anchor terminal 132. In the embodiment of the application, the anchor terminal 132 is mainly configured to send an emotion acquisition instruction to the server, where the emotion acquisition instruction instructs the server to obtain emotion information of at least one user in the live webcast room watching the live content and to determine the emotional expression effect corresponding to the at least one user according to the emotion information; to acquire the emotional expression effect fed back by the server; and to display the emotional expression effect.
The following is a detailed description of specific embodiments.
In the present embodiment, the description is made from the perspective of a live webcast display apparatus, which may be integrated in an anchor terminal.
Please refer to fig. 14, which is a flowchart illustrating an embodiment of a live webcast display method in an embodiment of the present application. The live webcast display method is performed by an anchor terminal and includes:
141. sending an emotion acquisition instruction to a server, wherein the emotion acquisition instruction is used for instructing the server to acquire emotion information of at least one user in a live webcast room when the user watches live content, and determining an emotional expression effect corresponding to the at least one user according to the emotion information.
In the embodiment of the application, the anchor terminal may be provided with an emotion acquisition function button. When the anchor needs to learn the emotional changes of a target user, the anchor clicks the emotion acquisition function button so that the anchor terminal sends an emotion acquisition instruction to the server. The server then sends an emotion acquisition request to the user terminal corresponding to each user in the live webcast room, and each user terminal collects the emotion data of its user according to the emotion acquisition request and feeds the emotion data back to the server. The server aggregates the emotion data of all users into emotion information, and an emotional expression effect database is preset in the server, so that the corresponding emotional expression effect can be determined from the emotional expression effect database according to the emotion information. The emotional expression effect may take a plurality of different forms.
142. And acquiring the emotional expression effect fed back by the server.
In the embodiment of the application, the server sends the determined emotional expression effect to the anchor terminal. In addition, the anchor terminal may also send a type selection instruction for the emotional expression effect to the server, and the server selects the emotional expression effect of the corresponding type according to the type selection instruction and feeds it back to the anchor terminal.
143. And displaying the emotional expression effect.
In the embodiment of the application, after receiving the emotional expression effect fed back by the server, the anchor terminal loads the emotional expression effect into the live content, so that the anchor can view the users' emotional states in real time.
According to the embodiment of the application, an emotion acquisition instruction is sent to the server, and the emotional expression effect fed back by the server is obtained and displayed. The emotional expression effect reflects the emotional state of at least one user in the live webcast room watching the live content, so the anchor can learn the emotional changes of the watching users in real time from the displayed emotional expression effect and adjust the live content in time, which increases the interactivity between the anchor and the watching users, makes the live content more engaging, and further raises the watching users' interest in the live content.
In order to better implement the live webcast display method provided by the embodiments of the present application, an embodiment of the present application further provides an apparatus based on the live webcast display method. The terms used herein have the same meanings as in the live webcast display method above, and for specific implementation details, reference may be made to the description in the method embodiments.
Referring to fig. 15, fig. 15 is a schematic structural diagram of a live webcast display apparatus according to an embodiment of the present disclosure, where the live webcast display apparatus may include an instruction sending module 151, an obtaining module 152, and a display module 153, where:
the instruction sending module 151 is configured to send an emotion acquisition instruction to a server, where the emotion acquisition instruction is used to instruct the server to acquire emotion information of at least one user in a live webcast room when the user watches live content, and determine an emotional performance effect corresponding to the at least one user according to the emotion information;
an obtaining module 152, configured to obtain the emotional performance effect fed back by the server;
and the display module 153 is used for displaying the emotional expression effect.
According to the embodiment of the application, an emotion acquisition instruction is sent to the server, and the emotional expression effect fed back by the server is obtained and displayed. The emotional expression effect reflects the emotional state of at least one user in the live webcast room watching the live content, so the anchor can learn the emotional changes of the watching users in real time from the displayed emotional expression effect and adjust the live content in time, which increases the interactivity between the anchor and the watching users, makes the live content more engaging, and further raises the watching users' interest in the live content.
The embodiment of the present application further provides a server, as shown in fig. 16, which shows a schematic structural diagram of the server according to the embodiment of the present application, specifically:
the server may include components such as a processor 161 of one or more processing cores, memory 162 of one or more computer-readable storage media, a power supply 163, and an input unit 164. Those skilled in the art will appreciate that the server architecture shown in FIG. 16 is not meant to be limiting, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
Wherein:
the processor 161 is a control center of the server, connects various parts of the entire server using various interfaces and lines, and performs various functions of the server and processes data by running or executing software programs and/or modules stored in the memory 162 and calling data stored in the memory 162, thereby performing overall monitoring of the server. Optionally, processor 161 may include one or more processing cores; preferably, the processor 161 may integrate an application processor, which mainly handles operations of storage media, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 161.
The memory 162 may be used to store software programs and modules, and the processor 161 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 162. The memory 162 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the server, and the like. In addition, the memory 162 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 162 may also include a memory controller to provide the processor 161 with access to the memory 162.
The server further includes a power supply 163 for supplying power to the various components. Preferably, the power supply 163 is logically connected to the processor 161 through a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system. The power supply 163 may further include one or more of a direct-current or alternating-current power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components.
The server may also include an input unit 164, and the input unit 164 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the server may further include a display unit and the like, which will not be described in detail herein. Specifically, in this embodiment, the processor 161 in the server loads the executable file corresponding to the process of one or more application programs into the memory 162 according to the following instructions, and the processor 161 runs the application programs stored in the memory 162, so as to implement various functions as follows:
obtaining emotion information of at least one user in a live webcast room when watching live content; determining an emotional expression effect corresponding to the at least one user according to the emotion information; and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
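The three functions listed above — obtaining the emotion information, determining the effect, and sending it to the anchor terminal — can be sketched end to end as follows. The data shapes and the modeling of the anchor terminal as a plain list are assumptions for illustration only.

```python
def obtain_emotion_info(room_users):
    # Step 1: gather each viewer's emotion data (synthetic values here;
    # a real server would collect these from the user terminals).
    return {u: [0.5, 0.7] for u in room_users}

def determine_effect(emotion_info):
    # Step 2: reduce the room's emotion information to one display effect
    # (here, a numerical effect built from the total of per-user averages).
    total = sum(sum(v) / len(v) for v in emotion_info.values())
    return {"type": "numeric", "total_score": round(total, 2)}

def send_to_anchor(effect, anchor_inbox):
    # Step 3: push the effect to the anchor terminal, modeled as a list.
    anchor_inbox.append(effect)

anchor_inbox = []
info = obtain_emotion_info(["u1", "u2"])
send_to_anchor(determine_effect(info), anchor_inbox)
print(anchor_inbox)  # → [{'type': 'numeric', 'total_score': 1.2}]
```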
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a storage medium, where a plurality of instructions are stored, where the instructions can be loaded by a processor to perform steps in any one of the live webcast display methods provided in the embodiments of the present application. For example, the instructions may perform the steps of:
obtaining emotion information of at least one user in a live webcast room when watching live content; determining an emotional expression effect corresponding to the at least one user according to the emotion information; and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any live webcast display method provided in the embodiment of the present application, beneficial effects that can be achieved by any live webcast display method provided in the embodiment of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The live webcast display method, apparatus, server, and storage medium provided by the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (12)

1. A live network display method is characterized by comprising the following steps:
obtaining emotion information of at least one user in a live webcast room when watching live content;
determining an emotional expression effect corresponding to the at least one user according to the emotional information;
and sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
2. The live webcast display method of claim 1, wherein the emotion information includes emotion data of each of the at least one user;
the obtaining of the emotion information of at least one user watching the live content in the live webcast room specifically includes:
and taking each of the at least one user in the live webcast room as a target user, and acquiring emotion data of the target user collected by the user terminal corresponding to the target user, wherein the emotion data is face change data of the target user captured while watching the live content.
3. The live webcast display method according to claim 2, wherein the acquiring of the emotion data of the target user, acquired by the user terminal corresponding to the target user, specifically includes:
sending an emotion acquisition request to the user terminal corresponding to the target user, so that the user terminal, in response to the emotion acquisition request, starts a shooting function to capture the face of the target user and obtain the emotion data of the target user;
and acquiring emotion data of the target user uploaded by the user terminal.
4. The live webcasting display method of claim 2, wherein the emotional expression effect comprises an emotional numerical effect;
the determining, according to the emotional information, an emotional performance effect corresponding to the at least one user specifically includes:
determining an emotion score value of the target user according to the emotion data of the target user;
summing the emotion score values of the at least one user to obtain a total emotion score value;
and determining the emotional numerical effect corresponding to the total emotion score value.
5. The live webcast display method of claim 4, wherein the emotional expression effect further comprises an overall emotional expression package of the at least one user and a background effect of the emotional numerical effect;
the determining the emotional performance effect corresponding to the at least one user according to the emotional information further includes:
determining a target score interval in which the total emotion score value is located according to a plurality of preset score intervals;
acquiring the overall emotional expression package corresponding to the target score interval from a preset expression package database;
and acquiring a background effect corresponding to the target score interval from a preset background effect database.
6. The live webcasting display method of claim 2, wherein the emotional expression effect comprises an emotional expression package of the target user;
the determining, according to the emotional information, an emotional performance effect corresponding to the at least one user specifically includes:
identifying the emotion level of the target user according to the emotion data of the target user;
and acquiring the emotional expression package corresponding to the emotion level from a preset emotional expression package database.
7. The live webcasting display method of claim 6, wherein the emotional performance effect further comprises a user avatar of the target user;
the sending of the emotional expression effect to a anchor terminal corresponding to an anchor of the live webcast room enables the anchor terminal to display the emotional expression effect, and the sending specifically includes:
and sending the emotional expression package and the user head portrait of the target user to a main broadcast terminal corresponding to a main broadcast of the network live broadcast room, so that the main broadcast terminal correspondingly displays the emotional expression package and the user head portrait of the target user, and at least part of the emotional expression package is overlapped and displayed on the corresponding user head portrait.
8. The live webcasting display method of claim 2, further comprising:
acquiring an emotion display request sent by the anchor terminal;
and sending the emotional expression effect to a user terminal corresponding to the target user according to the emotional display request, so that the user terminal displays the emotional expression effect.
9. The live webcasting display method of claim 2, further comprising:
acquiring live broadcast content adjusted by the anchor according to the emotional expression effect;
and sending the adjusted live broadcast content to a user terminal corresponding to the target user, so that the user terminal displays the adjusted live broadcast content.
10. A live network display method is characterized by comprising the following steps:
sending an emotion acquisition instruction to a server, wherein the emotion acquisition instruction is used for instructing the server to acquire emotion information of at least one user in a live webcast room when the user watches live content, and determining an emotional expression effect corresponding to the at least one user according to the emotion information;
acquiring the emotional expression effect fed back by the server;
and displaying the emotional expression effect.
11. A live webcast display apparatus, comprising:
the information acquisition module is used for acquiring emotion information of at least one user in a live webcast room when watching live content;
the determining module is used for determining the emotional expression effect corresponding to the at least one user according to the emotion information;
and the sending module is used for sending the emotional expression effect to an anchor terminal corresponding to an anchor of the live webcast room, so that the anchor terminal displays the emotional expression effect.
12. A live webcast display apparatus, comprising:
the system comprises an instruction sending module, an emotion acquisition module and a display module, wherein the instruction sending module is used for sending an emotion acquisition instruction to a server, the emotion acquisition instruction is used for instructing the server to acquire emotion information of at least one user in a live webcast room when the user watches live content, and the emotion expression effect corresponding to the at least one user is determined according to the emotion information;
the acquisition module is used for acquiring the emotional expression effect fed back by the server;
and the display module is used for displaying the emotional expression effect.
CN201910844058.7A 2019-09-06 2019-09-06 Network live broadcast display method and device Active CN110677685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910844058.7A CN110677685B (en) 2019-09-06 2019-09-06 Network live broadcast display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910844058.7A CN110677685B (en) 2019-09-06 2019-09-06 Network live broadcast display method and device

Publications (2)

Publication Number Publication Date
CN110677685A true CN110677685A (en) 2020-01-10
CN110677685B CN110677685B (en) 2021-08-31

Family

ID=69076622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910844058.7A Active CN110677685B (en) 2019-09-06 2019-09-06 Network live broadcast display method and device

Country Status (1)

Country Link
CN (1) CN110677685B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112417297A (en) * 2020-12-04 2021-02-26 网易(杭州)网络有限公司 Data processing method and device, live broadcast server and terminal equipment
CN112752159A (en) * 2020-08-25 2021-05-04 腾讯科技(深圳)有限公司 Interaction method and related device
CN112887746A (en) * 2021-01-22 2021-06-01 维沃移动通信(深圳)有限公司 Live broadcast interaction method and device
CN113778301A (en) * 2021-08-16 2021-12-10 盒马(中国)有限公司 Emotion interaction method based on content service and electronic equipment
CN114598896A (en) * 2022-02-17 2022-06-07 北京达佳互联信息技术有限公司 Network live broadcast method and device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012039902A1 (en) * 2010-09-22 2012-03-29 General Instrument Corporation System and method for measuring audience reaction to media content
CN106791893A (en) * 2016-11-14 2017-05-31 北京小米移动软件有限公司 Net cast method and device
CN107368495A (en) * 2016-05-12 2017-11-21 阿里巴巴集团控股有限公司 Determine method and device of the user to the mood of internet object
CN107784114A (en) * 2017-11-09 2018-03-09 广东欧珀移动通信有限公司 Recommendation method, apparatus, terminal and the storage medium of facial expression image
CN107786894A (en) * 2017-09-29 2018-03-09 维沃移动通信有限公司 A kind of recognition methods of user feedback data, mobile terminal and storage medium
CN108702523A (en) * 2017-12-29 2018-10-23 深圳和而泰数据资源与云技术有限公司 A kind of user emotion display methods, system and user emotion show equipment
CN109635616A (en) * 2017-10-09 2019-04-16 阿里巴巴集团控股有限公司 Interactive approach and equipment


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112752159A (en) * 2020-08-25 2021-05-04 腾讯科技(深圳)有限公司 Interaction method and related device
CN112752159B (en) * 2020-08-25 2024-01-30 腾讯科技(深圳)有限公司 Interaction method and related device
CN112417297A (en) * 2020-12-04 2021-02-26 网易(杭州)网络有限公司 Data processing method and device, live broadcast server and terminal equipment
CN112887746A (en) * 2021-01-22 2021-06-01 维沃移动通信(深圳)有限公司 Live broadcast interaction method and device
CN113778301A (en) * 2021-08-16 2021-12-10 盒马(中国)有限公司 Emotion interaction method based on content service and electronic equipment
CN114598896A (en) * 2022-02-17 2022-06-07 北京达佳互联信息技术有限公司 Network live broadcast method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN110677685B (en) 2021-08-31

Similar Documents

Publication Publication Date Title
CN110677685B (en) Network live broadcast display method and device
CN110570698B (en) Online teaching control method and device, storage medium and terminal
CN107316520B (en) Video teaching interaction method, device, equipment and storage medium
US10834479B2 (en) Interaction method based on multimedia programs and terminal device
CN106227335B (en) Interactive learning method for preview lecture and video course and application learning client
CN104967902B (en) Video sharing method, apparatus and system
US8806518B2 (en) Performance analysis for combining remote audience responses
US9898850B2 (en) Support and complement device, support and complement method, and recording medium for specifying character motion or animation
CN104021441B (en) A kind of system and method for making the electronics resume with video and audio
CN110609970B (en) User identity identification method and device, storage medium and electronic equipment
CN112423143B (en) Live broadcast message interaction method, device and storage medium
CN112188267B (en) Video playing method, device and equipment and computer storage medium
WO2022022485A1 (en) Content provision method and apparatus, content display method and apparatus, and electronic device and storage medium
CN112887746B (en) Live broadcast interaction method and device
CN112672219B (en) Comment information interaction method and device and electronic equipment
CN111629222B (en) Video processing method, device and storage medium
CN113824983B (en) Data matching method, device, equipment and computer readable storage medium
CN112188223B (en) Live video playing method, device, equipment and medium
CN114449301B (en) Item sending method, item sending device, electronic equipment and computer-readable storage medium
US20160082356A1 (en) Game system control method and game system
KR102328287B1 (en) Server device, and computer programs used therein
CN113497946A (en) Video processing method and device, electronic equipment and storage medium
CN112533009A (en) User interaction method, system, storage medium and terminal equipment
CN111659114B (en) Interactive game generation method and device, interactive game processing method and device and electronic equipment
CN112637640B (en) Video interaction method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40018342

Country of ref document: HK

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant