CN112667081A - Bullet screen display method and device, storage medium and terminal - Google Patents


Info

Publication number
CN112667081A
CN112667081A
Authority
CN
China
Prior art keywords
gesture
terminal
target
bullet screen
barrage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011588812.4A
Other languages
Chinese (zh)
Inventor
骆曦
张特
王贺
张峰石
高柏青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dami Technology Co Ltd
Original Assignee
Beijing Dami Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dami Technology Co Ltd filed Critical Beijing Dami Technology Co Ltd
Priority to CN202011588812.4A
Publication of CN112667081A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a bullet screen display method and apparatus, a storage medium, and a terminal. The bullet screen display method comprises the following steps: receiving a trigger instruction for a gesture bullet screen sent by a second terminal, and collecting an input gesture image; identifying a target gesture corresponding to the gesture image, and determining gesture information corresponding to the target gesture; sending the gesture information to the second terminal; and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying the target bullet screen. By adopting the embodiments of the present application, the user's concentration can be maintained.

Description

Bullet screen display method and device, storage medium and terminal
Technical Field
The present application relates to the field of communications technologies, and in particular, to a bullet screen display method and apparatus, a storage medium, and a terminal.
Background
With the rapid development of the barrage technology, more and more users can communicate and interact through the barrage when watching videos or live broadcasts.
At present, bullet screens are mainly sent by users manually typing text, and interaction takes place through the displayed text bullet screens. This makes the interaction uninteresting and thus reduces the user's concentration. In a remote-classroom scenario in particular, typing bullet screens can draw students' attention away from the lesson. Therefore, sending bullet screens by manually typing text may reduce the user's concentration.
Disclosure of Invention
The embodiment of the application provides a bullet screen display method, a bullet screen display device, a storage medium and a terminal, which can maintain the concentration of a user. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a bullet screen display method, where the method includes:
receiving a trigger instruction of a gesture barrage sent by a second terminal, and collecting an input gesture image;
identifying a target gesture corresponding to the gesture image, and determining gesture information corresponding to the target gesture;
sending the gesture information to the second terminal;
and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying the target bullet screen.
In a second aspect, an embodiment of the present application provides a bullet screen display method, where the method includes:
acquiring a trigger instruction of an input gesture bullet screen, and sending the trigger instruction to a first terminal so as to enable the first terminal to acquire a gesture image and determine gesture information corresponding to a target gesture in the gesture image;
receiving the gesture information sent by the first terminal;
and generating a target bullet screen corresponding to the gesture information, and sending the target bullet screen to the first terminal so that the first terminal displays the target bullet screen.
In a third aspect, an embodiment of the present application provides a bullet screen display device, where the device includes:
the gesture image acquisition module is used for receiving a trigger instruction of the gesture barrage sent by the second terminal and acquiring the input gesture image;
the gesture information determining module is used for identifying a target gesture corresponding to the gesture image and determining gesture information corresponding to the target gesture;
the gesture information sending module is used for sending the gesture information to the second terminal;
and the target bullet screen display module is used for receiving the target bullet screen corresponding to the gesture information sent by the second terminal and displaying the target bullet screen.
In a fourth aspect, an embodiment of the present application provides a bullet screen display device, where the device includes:
the trigger instruction sending module is used for acquiring a trigger instruction of the input gesture barrage and sending the trigger instruction to the first terminal so as to enable the first terminal to collect a gesture image and determine gesture information corresponding to a target gesture in the gesture image;
the gesture information receiving module is used for receiving the gesture information sent by the first terminal;
and the target bullet screen sending module is used for generating a target bullet screen corresponding to the gesture information and sending the target bullet screen to the first terminal so that the first terminal displays the target bullet screen.
In a fifth aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the method steps of the first aspect.
In a sixth aspect, an embodiment of the present application provides a terminal, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of the first aspect described above.
In a seventh aspect, an embodiment of the present application provides a computer storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to perform the method steps of the second aspect.
In an eighth aspect, an embodiment of the present application provides a terminal, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of the second aspect described above.
The beneficial effects brought by the technical scheme provided by some embodiments of the application at least comprise:
In one or more embodiments of the present application, a first terminal receives a trigger instruction for a gesture bullet screen sent by a second terminal, collects an input gesture image, recognizes a target gesture corresponding to the gesture image, determines gesture information corresponding to the target gesture, sends the gesture information to the second terminal, receives the target bullet screen corresponding to the gesture information sent by the second terminal, and displays the target bullet screen. The user of the first terminal can input a gesture to make the first terminal display the bullet screen corresponding to that gesture, without typing text to display a text bullet screen. This avoids the distraction caused by manual text input, and displaying a bullet screen corresponding to an input gesture makes bullet screens more engaging, thereby maintaining the user's concentration.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a bullet screen display method according to an embodiment of the present application;
fig. 2a is an exemplary schematic diagram of a display interface of a program upgrade instruction provided in an embodiment of the present application;
FIG. 2b is a schematic diagram illustrating an example of a display interface of a gesture image according to an embodiment of the present disclosure;
FIG. 2c is a schematic diagram illustrating an example of gesture information provided by an embodiment of the present application;
fig. 2d is an exemplary schematic diagram of a bullet screen display interface provided in an embodiment of the present application;
fig. 2e is an exemplary schematic diagram of a bullet screen display of an online classroom provided by an embodiment of the present application;
fig. 3 is a flowchart illustrating another bullet screen display method according to an embodiment of the present disclosure;
fig. 4a is an exemplary schematic diagram of a display interface of timing information provided in an embodiment of the present application;
fig. 4b is an exemplary diagram of a corresponding relationship provided in the embodiment of the present application;
fig. 5 is a flowchart illustrating another bullet screen display method according to an embodiment of the present disclosure;
fig. 6a is an exemplary schematic diagram of a gesture guidance information display interface according to an embodiment of the present application;
FIG. 6b is a schematic diagram illustrating an example of a display interface with successful gesture recognition according to an embodiment of the present disclosure;
fig. 6c is an exemplary schematic diagram of a target bullet screen provided in the embodiment of the present application;
fig. 7 is a schematic flowchart of another bullet screen display method provided in the embodiment of the present application;
FIG. 8 is a schematic diagram illustrating an example of a display interface at the teacher end according to an embodiment of the present disclosure;
fig. 9 is a schematic flowchart of another bullet screen display method provided in the embodiment of the present application;
fig. 10 is a schematic structural diagram of a bullet screen display device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of another bullet screen display device provided in the embodiment of the present application;
fig. 12 is a schematic structural diagram of a gesture information determination module according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of another bullet screen display device provided in the embodiment of the present application;
fig. 14 is a schematic structural diagram of another bullet screen display device provided in the embodiment of the present application;
fig. 15 is a schematic structural diagram of a target bullet screen sending module according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of another bullet screen display device provided in the embodiment of the present application;
fig. 17 is a schematic structural diagram of another bullet screen display device provided in the embodiment of the present application;
fig. 18 is a schematic structural diagram of a first terminal according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of a second terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present application, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In the description of the present application, it is noted that, unless explicitly stated or limited otherwise, "including" and "having" and any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The present application will be described in detail with reference to specific examples.
In one embodiment, as shown in fig. 1, a bullet screen display method is proposed, which can be implemented by means of a computer program and can be run on a bullet screen display device based on von neumann architecture. The computer program may be integrated into the application or may run as a separate tool-like application. The bullet screen display device can be a terminal device, including but not limited to: personal computers, tablet computers, handheld devices, in-vehicle devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and the like. The terminal devices in different networks may be called different names, for example: user equipment, access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, user agent or user equipment, cellular telephone, cordless telephone, terminal equipment in a 5G network or future evolution network, and the like.
Specifically, the bullet screen display method comprises the following steps:
s101: and receiving a trigger instruction of the gesture bullet screen sent by the second terminal, and collecting the input gesture image.
This embodiment is executed by the first terminal. The first terminal may include a bullet screen display application program, which can implement information interaction between the first terminal and other terminals based on an information interaction rule. Bullet screen display application programs include, but are not limited to, video software, live broadcast software, online classroom software, and the like.
The trigger instruction is an instruction for triggering the first terminal to enter a bullet screen display link, and can also trigger the first terminal to start to acquire gesture information. In this embodiment, the process of collecting gesture information is as follows: the first terminal collects the gesture image, then recognizes gesture information corresponding to the gesture image through a gesture recognition technology, and sends the gesture information to the second terminal.
It is easy to understand that the second terminal sending the trigger instruction is different for different bullet screen display applications.
When the bullet screen display application program can realize multi-terminal interaction, the second terminal can be another client of the bullet screen display application program. For example, the client of the online classroom software comprises a teacher end and a student end, the second terminal is the teacher end, the first terminal is the student end, and the teacher end sends a trigger instruction to the student end, so that the student end receives the trigger instruction.
When the bullet screen display application program can only realize program interaction, the second terminal can be a server corresponding to the bullet screen display application program, and the first terminal is a client. For example, the server of the video software sets the trigger instruction on a specific plot or time point in the played video, and when the client browses to the specific plot or time point of the video, the server sends the trigger instruction to the client, wherein the second terminal is the server of the video software.
After the trigger instruction is received, the input gesture image is collected. The gesture image may be acquired by capturing an image with a camera, or by recording a video with the camera and, based on an image feature extraction technique, selecting from the video the frame with the most gesture features as the gesture image.
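The frame-selection step above can be sketched as follows. This is a minimal, hypothetical illustration, not the patent's actual implementation: frames are plain 2-D lists of grayscale values, and "gesture features" are approximated by counting strong intensity jumps along each row as a crude edge measure.

```python
def edge_score(frame):
    """Score a grayscale frame by counting strong horizontal intensity
    changes -- a crude stand-in for gesture-edge feature density."""
    score = 0
    for row in frame:
        for a, b in zip(row, row[1:]):
            if abs(a - b) > 30:  # threshold on intensity jump (assumed value)
                score += 1
    return score

def pick_gesture_frame(frames):
    """Return the frame with the highest edge score, mirroring the
    'frame with the most gesture features' selection described above."""
    return max(frames, key=edge_score)
```

A real system would use a proper feature extractor (contours, hand landmarks) instead of this toy edge count, but the selection logic is the same: score every frame, keep the maximum.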
Optionally, after receiving the trigger instruction, determining whether the first terminal supports gesture recognition, and if so, acquiring the gesture image; if not, a program upgrading instruction is sent out on a display interface, a determination instruction sent by the user aiming at the program upgrading instruction is received, and program upgrading is carried out, so that the first terminal supports gesture recognition, and the gesture image starts to be collected. Wherein whether the first terminal supports gesture recognition may be determined by determining whether the first terminal includes a gesture recognition engine. The gesture recognition engine is a core component of a gesture recognition program or system, and when the first terminal is detected not to comprise the gesture engine, the gesture recognition engine can be acquired in an online downloading mode, so that program upgrading is carried out.
As shown in fig. 2a, an exemplary schematic diagram of the display interface of a program upgrade instruction includes the instruction content and options. The instruction content is "Please upgrade the program to collect gesture images!", and the options are "Confirm" and "Cancel". The user can input a confirmation instruction by clicking the "Confirm" option, so that the first terminal performs the program upgrade, or input a cancel instruction by clicking the "Cancel" option to prohibit the upgrade. After the program is upgraded, a prompt such as "Program upgraded; starting to collect gesture images" may be displayed, so that the user can input a gesture for gesture image collection.
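The engine-check-and-upgrade flow can be sketched as a small state machine. All names here (`Terminal`, `download_gesture_engine`, the prompt strings) are hypothetical stand-ins for the behavior described above, under the assumption that "supports gesture recognition" reduces to "has a gesture recognition engine installed".

```python
class Terminal:
    def __init__(self, has_gesture_engine):
        self.has_gesture_engine = has_gesture_engine
        self.prompts = []  # messages shown on the display interface

    def download_gesture_engine(self):
        # Stand-in for the "online download" upgrade path described above.
        self.has_gesture_engine = True

    def on_trigger(self, user_confirms_upgrade=True):
        """Handle a gesture-barrage trigger instruction: check engine
        support, prompt for an upgrade if needed, then start collecting."""
        if not self.has_gesture_engine:
            self.prompts.append("Please upgrade the program to collect gesture images!")
            if not user_confirms_upgrade:
                return "upgrade_cancelled"
            self.download_gesture_engine()
            self.prompts.append("Program upgraded; starting to collect gesture images")
        return "collecting"
```

A terminal that already has the engine skips both prompts and proceeds straight to collection, which matches the shortcut described for terminals that have previously sent a gesture bullet screen.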
It is easy to understand that if the first terminal has already sent a gesture bullet screen in an earlier bullet screen link before receiving the trigger instruction, the first terminal evidently supports gesture recognition. In that case there is no need to check gesture recognition support every time a trigger instruction is received, and the gesture image can be collected directly.
Specifically, in order to improve the success rate of gesture recognition, a user is prompted on a display interface to input a gesture in an imaging area. As shown in fig. 2b, an exemplary schematic view of a display interface of a gesture image includes an imaging region and a gesture image, and a user places a gesture in the imaging region defined by the display interface, so as to reduce an interfering object in the acquired gesture image, and further improve a success rate of gesture recognition.
S102: and identifying a target gesture corresponding to the gesture image, and determining gesture information corresponding to the target gesture.
The target gesture refers to a gesture of a hand used to convey information. In the embodiment of the application, the target gesture refers to a gesture which is recognized in the gesture image and used by a user for conveying bullet screen information.
Specifically, the method for identifying the target gesture corresponding to the gesture image may be: and performing color extraction processing on the gesture image to enable the first terminal to accurately extract gesture edge features from the processed image, performing data analysis on the extracted gesture edge features to obtain target gesture data, and matching the target gesture data with the universally trained gesture model to obtain the target gesture.
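The recognition pipeline above (color extraction, edge-feature analysis, model matching) can be sketched in miniature. This is an assumed simplification, not the patented method: the "feature vector" is just pixel count and mean intensity, and "matching against trained gesture models" is nearest-neighbor search over stored model features.

```python
def extract_features(image):
    # Hypothetical stand-in for the color-extraction and edge-analysis
    # steps: the feature vector is (pixel count, mean intensity).
    flat = [p for row in image for p in row]
    return (len(flat), sum(flat) / len(flat))

def match_gesture(features, gesture_models):
    """Match extracted features against gesture models by nearest
    squared Euclidean distance (a simplification of model matching)."""
    def dist(model):
        return sum((a - b) ** 2 for a, b in zip(features, model["features"]))
    return min(gesture_models, key=dist)["name"]
```

In practice each `gesture_models` entry would hold a trained model rather than a raw feature tuple, but the structure (extract, then match, then return the best-scoring gesture) follows the steps described above.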
Optionally, the target gesture data is stored in a database of a gesture model corresponding to the target gesture, so that the database of the gesture model can be further improved, and accuracy of gesture recognition is improved.
Gesture information refers to the information conveyed by a gesture, so once the target gesture is recognized, the gesture information corresponding to it can be determined. For example, as shown in fig. 2c, an exemplary diagram of gesture information includes a target gesture and its corresponding gesture information. The target gesture is formed by crossing the thumb and the index finger while closing the middle finger, ring finger, and little finger, so the gesture information corresponding to the target gesture can be determined to be a "finger heart".
S103: and sending the gesture information to the second terminal.
And sending the gesture information in different modes aiming at different second terminals.
When the second terminal is another client of the bullet screen display application program, the manner of sending the gesture information may be: and the first terminal sends the gesture information to a server of a bullet screen display application program, and then the server sends the gesture information to the second terminal according to an information transmission protocol. Taking online classroom software as an example, when a student terminal sends the gesture information to a teacher terminal, the student terminal sends the gesture information to a server terminal of the online classroom software, and then the server terminal sends the gesture information to the teacher terminal.
When the second terminal is a server corresponding to a bullet screen display application program, the second terminal can directly receive the gesture information sent by the first terminal.
S104: and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying the target bullet screen.
The target barrage refers to a barrage which pops up on a display interface when a video or a live broadcast is watched and takes the gesture information as content.
It should be noted that the first terminal may receive bullet screens sent by the second terminal, including but not limited to the target bullet screen. For example, in video software, the server receives the gesture information sent by a client, sets or automatically generates other gesture information, generates bullet screens from both, and sends them to the client; these include the target bullet screen corresponding to the client's gesture information and other bullet screens corresponding to the other gesture information. As shown in fig. 2d, an exemplary diagram of a bullet screen display interface includes the target bullet screens corresponding to gesture information sent by clients and other bullet screens corresponding to other gesture information set by the server. Here, "Great!", "Yes!", "666", and "OK" are target bullet screens, while "A wave of gesture bullet screens is coming!" is another bullet screen.
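The merging of client-originated target bullet screens with server-generated ones can be sketched as follows. The function name and the record layout are hypothetical; the point is only that the feed delivered to the client mixes both kinds, tagged by origin.

```python
def build_barrage_feed(client_gesture_infos,
                       server_extra=("A wave of gesture bullet screens is coming!",)):
    """Combine target bullet screens (from clients' gesture information)
    with other bullet screens set or generated on the server side."""
    feed = [{"type": "target", "text": info} for info in client_gesture_infos]
    feed += [{"type": "other", "text": text} for text in server_extra]
    return feed
```

The client could then render `target` entries with their gesture graphics and `other` entries as plain announcements.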
In summary, when the embodiment of the application is applied to the scene of online classroom interaction, the first terminal is a student terminal, and the second terminal is a teacher terminal.
An example of a possible on-line classroom barrage display is shown in fig. 2 e. And the student end receives a trigger instruction of the gesture barrage sent by the teacher end, starts the camera and collects gesture images, identifies a target gesture corresponding to the gesture images, determines gesture information corresponding to the target gesture, and sends the gesture information to the teacher end. After receiving the gesture information of the student end, the teacher end converts the gesture information into a target bullet screen, and sends the target bullet screen to the student end, so that the student end receives and displays the target bullet screen.
In this embodiment of the application, the user of the first terminal can input a gesture to make the first terminal display the target bullet screen corresponding to that gesture, without typing text to display a text bullet screen. This avoids the distraction caused by manual text input, and displaying a bullet screen corresponding to an input gesture makes bullet screens more engaging, thereby maintaining the user's concentration.
Please refer to fig. 3, which is a flowchart illustrating a bullet screen display method according to an embodiment of the present disclosure. As shown in fig. 3, in the embodiment of the present application, a specific flow of a bullet screen display method is described from a first terminal side and a second terminal side, where the method may include the following steps:
s201: the second terminal sends a trigger instruction to the first terminal.
When the second terminal is another client of the bullet screen display application program, the method for sending the trigger instruction to the first terminal comprises the following steps: and the second terminal sends the trigger instruction to a server of a bullet screen display application program, and the server sends the trigger instruction to the first terminal.
Taking online classroom software as an example, a teacher end sends the trigger instruction to a server end of the online classroom software, and the server end sends the trigger instruction to a student end.
When the second terminal is a server side of the bullet screen display application program, the second terminal can directly send the trigger instruction to the first terminal.
S202: the first terminal displays timing information of preset duration, and collects the gesture image in the preset duration.
The preset duration is the time window for collecting the gesture image. It may be set by the first terminal according to the performance of the hardware that collects the gesture image. Alternatively, the user of the second terminal may specify the bullet screen display time, in which case the display time is determined when the trigger instruction is received, and the duration needed to collect the gesture image is derived from that display time.
The timing information refers to the duration from the current moment to the gesture collection ending moment. Specifically, the timing information is displayed and used for prompting a user corresponding to the first terminal to make a gesture within a specified time, so that the first terminal collects the gesture image within a preset time.
For example, in the online classroom software, as shown in fig. 4a, an exemplary schematic diagram of a display interface of timing information includes timing information of "gesture capture will be completed within 16 s", and a user corresponding to a client (a first terminal) can determine the remaining time for capturing a gesture image by using the timing information for 16 s.
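The countdown prompt above can be sketched as a function that produces the sequence of timing messages shown during the collection window. The message wording and the tick interval are assumptions for illustration; a real UI would refresh a label on a timer rather than precompute a list.

```python
def countdown_messages(preset_seconds, tick=4):
    """Return the timing prompts shown while the gesture image is being
    collected, counting down from the preset duration in `tick` steps."""
    msgs = []
    remaining = preset_seconds
    while remaining > 0:
        msgs.append(f"Gesture capture will be completed within {remaining}s")
        remaining -= tick
    return msgs
```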
S203: and the first terminal identifies a target gesture corresponding to the gesture image and determines gesture information corresponding to the target gesture.
The method for identifying the target gesture corresponding to the gesture image may specifically refer to S102, which is not described herein again.
Optionally, based on the gesture use probability of the user of the first terminal, a target gesture set is selected from a preset gesture set, the target gesture matched with the gesture image is searched in the target gesture set, and based on the corresponding relationship between different gestures and different gesture information, gesture information corresponding to the target gesture is determined.
For example, on student terminal A of the online classroom software, the most commonly used gestures are the "finger heart", "666", and "OK" gestures, so a gesture set containing these three gestures is generated. When student terminal A collects a gesture image, the data corresponding to the gesture image are compared with the gesture data corresponding to the "finger heart" gesture; if they match, the target gesture corresponding to the gesture image is determined to be the "finger heart".
It is easy to understand that the gesture in the target gesture set is the most frequently used gesture by the user of the first terminal, and the target gesture is searched in the target gesture set, so that the efficiency of gesture recognition can be improved.
And determining gesture information corresponding to the gesture based on the corresponding relation between the different gestures and the different gesture information. The corresponding relationship may be set in the first terminal in advance, and when the target gesture is determined, gesture information corresponding to the target gesture may be determined. As shown in fig. 4b, an exemplary diagram of a corresponding relationship includes different gestures, different gesture information, and a corresponding relationship between different gestures and different gesture information.
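The two mechanisms above, selecting a target gesture set by usage frequency and looking up gesture information through a gesture-to-information correspondence, can be sketched together. The table contents and function names are hypothetical examples, not the patent's actual correspondence.

```python
# Assumed correspondence between gestures and gesture information (fig. 4b).
GESTURE_INFO = {
    "thumb_up": "Great!",
    "finger_heart": "heart",
    "six": "666",
    "ring": "OK",
}

def target_gesture_set(usage_counts, top_n=3):
    """Select the user's most frequently used gestures; searching this
    smaller set first speeds up recognition."""
    return sorted(usage_counts, key=usage_counts.get, reverse=True)[:top_n]

def gesture_info(gesture):
    """Look up the gesture information for a recognized target gesture."""
    return GESTURE_INFO.get(gesture)
```

If a gesture image matches nothing in the target gesture set, recognition would fall back to the full preset gesture set; that fallback is omitted here for brevity.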
S204: and the first terminal sends the gesture information to the second terminal.
See S103 specifically, and the details are not repeated here.
S205: and the second terminal generates a target bullet screen corresponding to the gesture information.
The second terminal can generate target bullet screens in sequence, in the order in which the gesture information is received. A target bullet screen may include the gesture information and may also include a gesture graphic. Taking the bullet screen "Great!" shown in fig. 2d as an example, the generated target bullet screen includes both the gesture information and a thumbs-up gesture graphic, which makes the target bullet screen more engaging.
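The generation step can be sketched as sorting received gesture information by arrival time and attaching a gesture graphic to each entry. The graphics table and record fields are illustrative assumptions.

```python
# Assumed mapping from gesture information to a display graphic.
GESTURE_GRAPHICS = {"Great!": "👍", "heart": "🫰", "666": "🤙", "OK": "👌"}

def generate_target_barrages(received):
    """Generate target bullet screens in the order the gesture
    information was received; each carries its text plus a graphic."""
    ordered = sorted(received, key=lambda r: r["recv_time"])
    return [{"text": r["info"],
             "graphic": GESTURE_GRAPHICS.get(r["info"], "")}
            for r in ordered]
```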
S206: and the second terminal sends the target bullet screen to the first terminal.
When the second terminal is another client of the bullet screen display application, the target bullet screen may be sent to the first terminal through the server of that application. When the second terminal is the server of the bullet screen display application, the target bullet screen may be sent to the first terminal directly.
S207: and the first terminal displays the target bullet screen.
The first terminal displays the target bullet screen on a display interface in a preset mode. The preset mode refers to the display animation of the target bullet screen, including but not limited to suspension, floating, scrolling, and the like.
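A minimal sketch of selecting one of the preset display effects named above; the effect identifiers and the string rendering are assumptions for illustration, since the disclosure only names the effects, not an API:

```python
# Preset display effects, mirroring those listed in the text.
PRESET_EFFECTS = {"suspension", "floating", "scrolling"}

def display_barrage(barrage_text, effect="scrolling"):
    """Validate the requested effect and return a tagged rendering
    of the barrage (a stand-in for the actual UI animation call)."""
    if effect not in PRESET_EFFECTS:
        raise ValueError(f"unsupported display effect: {effect}")
    return f"[{effect}] {barrage_text}"
```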
S208: and the second terminal sends a courseware page turning instruction to the first terminal.
The courseware page turning instruction refers to an instruction for ending the current barrage session and moving on to the next one. In video software, the courseware page turning instruction may be sent when the current video finishes playing and the next video starts. In live broadcast software, the courseware page turning instruction may be sent by the anchor upon completing the current game session and entering the next one. The sending method of the courseware page turning instruction is the same as that of the trigger instruction; see S202 specifically, which is not described again here.
S209: and the first terminal hides the target bullet screen.
Specifically, after receiving the courseware page turning instruction, the first terminal directly hides the target barrage. It should be noted that if the first terminal receives the courseware page turning instruction while the target bullet screen has not been completely displayed, the undisplayed target bullet screens are no longer shown.
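The page-turn behavior in S208/S209 can be sketched as follows; the class and attribute names are illustrative assumptions:

```python
class BarrageView:
    """Sketch of S209: on a courseware page-turn instruction the first
    terminal hides the barrage layer immediately, and barrages that
    have not finished displaying are simply dropped."""
    def __init__(self, pending=None):
        self.pending = list(pending or [])  # barrages not yet fully shown
        self.visible = True

    def on_page_turn(self):
        self.pending.clear()  # undisplayed barrages are discarded
        self.visible = False
```

Note the contrast with the gesture-barrage end instruction later in this document (S308-S310), which lets pending barrages finish before hiding.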
For example, taking online classroom software as an example, the second terminal is a teacher terminal, and the first terminal is a student terminal. The teacher end sends a trigger instruction to the student end on the current page of the courseware, so that the student end collects gesture images and conducts gesture recognition to obtain corresponding gesture information; the student terminal sends the gesture information to the teacher terminal so that the teacher terminal generates a target barrage corresponding to the gesture information, and the teacher terminal sends the target barrage to the student terminal so that the student terminal displays the target barrage; and the teacher end sends a courseware page turning instruction to the student end after page turning of the courseware, so that the student end hides the target barrage.
In the embodiment of the present application, the user corresponding to the first terminal can input a gesture so that the first terminal displays the target barrage corresponding to that gesture, without typing text to display a text barrage; this avoids the distraction caused by manual text input, while inputting a gesture and displaying the corresponding barrage makes the barrage more engaging and helps maintain the user's concentration. Further, during gesture recognition, a target gesture set is selected from a preset gesture set based on the gesture use probability of the user of the first terminal, and the target gesture matching the gesture image is searched for within that set, which improves gesture recognition efficiency. In addition, when the gesture image is collected, countdown information of a preset duration is displayed, effectively prompting the user to input a gesture within the specified time; and on receiving a courseware page turning instruction from the second terminal, the first terminal hides the target bullet screen, making the bullet-screen display flow more complete.
Please refer to fig. 5, which is a flowchart illustrating a bullet screen display method according to an embodiment of the present disclosure. As shown in fig. 5, in the embodiment of the present application, a specific flow of the bullet screen display method is described from the first terminal side and the second terminal side. The method may include the following steps:
s301: the second terminal sends a trigger instruction to the first terminal.
See S201 specifically, and the details are not repeated here.
S302: and the first terminal displays the gesture guide information corresponding to the trigger instruction and collects the input gesture image.
When the second terminal needs to receive specific gesture information, the gesture guidance information is carried in the trigger instruction when it is sent. The gesture guidance information is information for guiding the user corresponding to the first terminal to make the corresponding gesture, and may be displayed as a picture, a text description, or a gesture box. As shown in FIG. 6a, an exemplary gesture guidance display interface includes an imaging area and a gesture box; the gesture box is a piece of gesture guidance information that guides the user to make a "V" ("Yeah!") gesture.
For a process of acquiring the input gesture image, reference may be specifically made to S101, which is not described in detail herein.
S303: and the first terminal identifies a target gesture corresponding to the gesture image and determines gesture information corresponding to the target gesture.
The process of identifying the target gesture and determining the gesture information may specifically refer to S203, which is not described herein again.
Optionally, in order to make the gesture bullet screen more engaging, the target gesture may be displayed with an animation after gesture recognition succeeds, prompting the user corresponding to the first terminal that recognition was successful. As shown in fig. 6b, an exemplary display interface for a successfully recognized gesture includes the acquired gesture image, the gesture information corresponding to the target gesture, and prompt information; here the recognized gesture information is "OK", and the prompt reads "gesture recognition successful", notifying the user of the recognition result.
S304: and the first terminal sends the gesture information to the second terminal.
See S103 specifically, and the details are not repeated here.
S305: and the second terminal generates a target bullet screen corresponding to the gesture information.
See S205 specifically, and the details are not repeated here.
S306: and the second terminal sends the target bullet screen to the first terminal.
See S206 for details, which are not described herein.
S307: and the first terminal displays and marks the target bullet screen.
It should be noted that, in a bullet screen display application, the first terminal includes at least one terminal. The second terminal may receive the gesture information corresponding to each first terminal, aggregate it, and generate bullet screens. Consequently, the bullet screens received by each first terminal include both the target bullet screen corresponding to its own gesture information and other bullet screens corresponding to the gesture information of other first terminals, and when the target bullet screen is displayed, the other bullet screens are displayed as well. In addition, the first terminal marks the target bullet screen when displaying it, so that the user corresponding to the first terminal can quickly identify the target bullet screen corresponding to the gesture they made, which makes the gesture bullet screen more engaging. The marking may take the form of magnified display, display in a special color, or framed display.
Optionally, the target bullet screen may include the user identifier of the first terminal and the gesture information, so that the user corresponding to the first terminal may determine the target bullet screen corresponding to the gesture made by the user more quickly. The user identifier refers to a name capable of identifying the identity of the user, and may be a name of an account of the user in an application program.
For example, in live broadcast software one anchor terminal corresponds to multiple audience terminals. As shown in fig. 6c, an exemplary diagram of a marked target bullet screen includes an anchor terminal and audience terminals A, B, and C. The anchor terminal receives gesture information a sent by audience terminal A, gesture information b sent by audience terminal B, and gesture information c sent by audience terminal C, and generates bullet screen A corresponding to gesture information a, bullet screen B corresponding to gesture information b, and bullet screen C corresponding to gesture information c. The anchor terminal then sends bullet screens A, B, and C to each audience terminal. Among the bullet screens received by audience terminal A, bullet screen A is the target bullet screen, so when the three bullet screens are displayed, bullet screen A is shown with a thick frame; likewise, audience terminal B displays bullet screen B with a thick frame, and audience terminal C displays bullet screen C with a thick frame.
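The marking step described above reduces to comparing each barrage's user identifier with the local terminal's own identifier; the dictionary field names below are illustrative assumptions:

```python
def mark_own_barrages(barrages, own_user_id):
    """Sketch of S307: each barrage carries the sending terminal's user
    identifier; the local terminal marks (e.g. thick-borders) the ones
    whose identifier matches its own, and displays the rest unmarked."""
    return [
        {**barrage, "marked": barrage["user_id"] == own_user_id}
        for barrage in barrages
    ]
```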
S308: and the second terminal sends the ending instruction of the gesture bullet screen to the first terminal.
The ending instruction of the gesture barrage is an instruction for the second terminal to stop receiving gesture information and stop sending bullet screens to the first terminal. It may be sent when the current bullet screen session ends. Unlike the courseware page turning instruction, however, after the ending instruction is sent to the first terminal, target bullet screens that have been received but not yet displayed can continue to be displayed.
S309: and the first terminal judges whether the target bullet screen is completely displayed.
Specifically, the method for the first terminal to determine whether the target barrage is completely displayed may be: and judging whether the target bullet screen which is not displayed exists or not.
S310: and if not, the first terminal continues to display the target bullet screen until the display is complete, and hides the target bullet screen.
And if the target bullet screen which is not displayed exists, continuing to display the target bullet screen until the target bullet screen is completely displayed, and hiding the target bullet screen.
Optionally, if yes, the target bullet screen is hidden. Specifically, if the target bullet screen has been completely displayed, it is hidden. Hiding the target bullet screen refers to hiding its display interface, which may include gesture guidance information, a function key for gesture recognition, and the like.
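Steps S309 and S310 can be sketched together: on the end instruction, any pending barrages are drained before the layer is hidden. The function signature is an illustrative assumption:

```python
def handle_end_instruction(pending_barrages, show):
    """Sketch of S309/S310: unlike a courseware page-turn, the end
    instruction lets barrages that were received but not yet displayed
    finish before the barrage layer is hidden. `show` stands in for
    the terminal's display routine; returns the final layer state."""
    for barrage in pending_barrages:  # drain remaining barrages in order
        show(barrage)
    return "hidden"
```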
Taking online classroom software as an example, the teacher end sends the ending instruction to the student end and stops sending the target barrage to the student end; the student end receives the ending instruction, judges whether an undisplayed target bullet screen exists or not, if so, completely displays the undisplayed target bullet screen, and hides the target bullet screen; if not, the target bullet screen is directly hidden.
In the embodiment of the present application, the user corresponding to the first terminal can input a gesture so that the first terminal displays the corresponding target barrage, without typing text to display a text barrage; this avoids the distraction caused by manual text input, makes the barrage more engaging, and helps maintain the user's concentration. In addition, displaying gesture guidance information while collecting the gesture image effectively prompts the user to input the corresponding gesture, further improving both the appeal of the bullet-screen display and the accuracy of gesture recognition. On receiving the ending instruction of the gesture bullet screen from the second terminal, the first terminal judges whether the target bullet screen has been completely displayed; if not, it continues displaying until complete and then hides the target bullet screen, and if so, it hides the target bullet screen immediately, concluding the display and making the bullet-screen display flow more complete.
Referring to fig. 7, fig. 7 is a schematic flowchart illustrating another embodiment of a bullet screen display method provided in the present application. Specifically, the method comprises the following steps:
s401: and acquiring a trigger instruction of the input gesture barrage, and sending the trigger instruction to the first terminal.
The execution subject of this embodiment is the second terminal. The second terminal may be another client of the bullet screen display application. A user corresponding to the second terminal inputs a trigger instruction into the second terminal, so that the second terminal obtains it; the user may input the trigger instruction by clicking a function key on the display interface or by voice.
Taking online classroom software as an example, the second terminal is a teacher terminal. As shown in fig. 8, an exemplary schematic diagram of a display interface of a teacher end includes a "barrage link" function key, and a teacher inputs a trigger instruction by clicking an "on" key, so that the teacher end obtains the trigger instruction.
S402: and receiving the gesture information sent by the first terminal.
And after receiving the trigger instruction, the first terminal starts to acquire a gesture image, determines the gesture information and sends the gesture information to the second terminal. See S103 specifically, and the details are not repeated here.
S403: and generating a target bullet screen corresponding to the gesture information, and sending the target bullet screen to the first terminal.
The process of generating the target bullet screen corresponding to the gesture information may specifically refer to S205, which is not described herein again. The process of sending the target barrage to the first terminal may specifically refer to S206, which is not described herein again.
In the embodiment of the present application, the second terminal obtains the input trigger instruction of the gesture barrage and sends it to the first terminal, so that the first terminal collects a gesture image and determines the gesture information corresponding to the target gesture in that image; the second terminal then receives the gesture information sent by the first terminal, generates the corresponding target barrage, and sends it to the first terminal for display. Because the second terminal can generate and deliver a target barrage matching the gesture information, the bullet-screen display becomes more engaging and the concentration of the user corresponding to the first terminal is maintained.
Referring to fig. 9, fig. 9 is a schematic flowchart illustrating another embodiment of a bullet screen display method according to the present application. Specifically, the method comprises the following steps:
s501: and acquiring a trigger instruction of the input gesture barrage, and sending the trigger instruction to the first terminal.
See S401 for details, which are not described herein.
S502: and receiving the gesture information sent by the first terminal.
See S402 for details, which are not described herein.
S503: and acquiring the user identification of the first terminal and generating the target bullet screen.
It should be noted that the first terminal includes at least one terminal. Before the target barrage is generated, the user identifier of the first terminal is obtained, so that the target barrage includes both the user identifier and the gesture information; in this way, the target barrages corresponding to different first terminals can be distinguished. See S307 specifically; the details are not repeated here.
S504: and counting the interactive feedback information corresponding to the target bullet screen, and displaying the target bullet screen and the interactive feedback information.
The interactive feedback information refers to the number of first terminals and the amount of gesture information corresponding to each first terminal. It should be noted that, among the first terminals, some fail to send gesture information in time, while others send multiple pieces of gesture information. Therefore, when the second terminal receives gesture information, it tallies the results, thereby obtaining the interaction feedback information and allowing the user corresponding to the second terminal to clearly understand the interaction during the bullet screen session.
Furthermore, in order to enable a user corresponding to the second terminal to clearly know the interaction situation and the bullet screen display situation of the bullet screen link, the second terminal displays the target bullet screen and the interaction feedback information.
Taking online classroom software as an example, a teacher end corresponds to five student ends, namely student end A, student end B, student end C, student end D and student end E, and the interactive feedback information table is shown as follows:

Student end    Gesture information and number of times sent
A              1 × "Great!", 3 × "OK"
B              2 × "666"
C              0
D              4 × "Finger heart"
E              0
As can be seen from the table, student end A sent "Great!" once and "OK" three times; student end B sent "666" twice; student end C sent no gesture information; student end D sent "Finger heart" four times; and student end E sent no gesture information.
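The tallying that produces a table like the one above can be sketched with a per-terminal counter; the input format, a list of (terminal, gesture information) pairs, is an illustrative assumption:

```python
from collections import Counter

def interaction_feedback(received):
    """Sketch of S504: tally, per first terminal, how many times each
    piece of gesture information was received. `received` is a list of
    (terminal_id, gesture_info) pairs in arrival order; terminals that
    sent nothing simply do not appear in the result."""
    stats = {}
    for terminal_id, info in received:
        stats.setdefault(terminal_id, Counter())[info] += 1
    return stats
```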
S505: and sending the target bullet screen to the first terminal.
See S403 specifically, and will not be described herein again.
S506: and acquiring an input ending instruction of the gesture bullet screen, and sending the ending instruction to the first terminal.
See S308 specifically, and will not be described herein again.
Optionally, the input courseware page turning instruction is obtained and sent to the first terminal, so that the first terminal hides the target barrage. See S208 specifically; the details are not repeated here.
In the embodiment of the application, the second terminal can generate the target barrage corresponding to the gesture information and send the target barrage to the first terminal, so that the interest of barrage display is increased, and the concentration of a user corresponding to the first terminal is maintained. Furthermore, the second terminal counts the interactive feedback information corresponding to the target bullet screen and displays the target bullet screen and the interactive feedback information, so that a user corresponding to the second terminal can clearly know the participation rate of the bullet screen display link and the bullet screen display condition.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Please refer to fig. 10, which shows a schematic structural diagram of a bullet screen display device according to an exemplary embodiment of the present application. The bullet screen display device can be implemented as all or part of the device through software, hardware or a combination of the two. The device 1 comprises a gesture image acquisition module 11, a gesture information determination module 12, a gesture information sending module 13 and a target barrage display module 14.
The gesture image acquisition module 11 is configured to receive a trigger instruction of a gesture barrage sent by the second terminal, and acquire an input gesture image;
the gesture information determining module 12 is configured to identify a target gesture corresponding to the gesture image, and determine gesture information corresponding to the target gesture;
a gesture information sending module 13, configured to send the gesture information to the second terminal;
and the target bullet screen display module 14 is configured to receive the target bullet screen corresponding to the gesture information sent by the second terminal, and display the target bullet screen.
Optionally, as shown in fig. 11, the apparatus 1 further includes:
and the guiding information display module 15 is configured to display gesture guiding information corresponding to the trigger instruction.
Optionally, the gesture image capturing module 11 is further configured to:
and displaying timing information of preset time length, and collecting the gesture image in the preset time length.
Optionally, as shown in fig. 12, the gesture information determining module 12 includes:
a set selecting unit 121, configured to select a target gesture set from a preset gesture set based on a gesture use probability of a user of the first terminal;
a target gesture searching unit 122, configured to search the target gesture matching the gesture image in the target gesture set;
the gesture information determining unit 123 is configured to determine gesture information corresponding to the target gesture based on a corresponding relationship between different gestures and different gesture information.
Optionally, the target bullet screen display module 14 is specifically configured to:
and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying and marking the target bullet screen, wherein the target bullet screen comprises the user identification of the first terminal and the gesture information.
Optionally, the apparatus 1 further comprises:
an ending instruction receiving module 16, configured to receive an ending instruction of the gesture bullet screen sent by the second terminal;
the judging module 17 is configured to judge whether the target bullet screen is completely displayed;
the bullet screen hiding module 18 is configured to, if the target bullet screen is not completely displayed, continue displaying it until the display is complete, and then hide the target bullet screen;
and the bullet screen hiding module 18 is further configured to, if the target bullet screen is completely displayed, hide the target bullet screen.
Optionally, as shown in fig. 13, the apparatus 1 further includes:
a page turning instruction receiving module 19, configured to receive a courseware page turning instruction sent by the second terminal;
and a bullet screen hiding module 18, which is also used for hiding the target bullet screen.
It should be noted that, when the bullet screen display device provided in the foregoing embodiment executes the bullet screen display method, only the division of the above functional modules is taken as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the above described functions. In addition, the bullet screen display device and the bullet screen display method provided by the above embodiments belong to the same concept, and the detailed implementation process is shown in the method embodiments, which is not described herein again.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the embodiment of the present application, the user corresponding to the first terminal can input a gesture so that the first terminal displays the target barrage corresponding to that gesture, without typing text to display a text barrage. This avoids the distraction caused by manual text input, while inputting a gesture and displaying the corresponding barrage makes the barrage more engaging and maintains the user's concentration.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Please refer to fig. 14, which shows a schematic structural diagram of a bullet screen display device according to an exemplary embodiment of the present application. The bullet screen display device can be implemented as all or part of the device through software, hardware or a combination of the two. The device 2 comprises a trigger instruction sending module 21, a gesture information receiving module 22 and a target barrage sending module 23.
The trigger instruction sending module 21 is configured to obtain a trigger instruction of the input gesture barrage, and send the trigger instruction to the first terminal, so that the first terminal collects a gesture image and determines gesture information corresponding to a target gesture in the gesture image;
a gesture information receiving module 22, configured to receive the gesture information sent by the first terminal;
and the target bullet screen sending module 23 is configured to generate a target bullet screen corresponding to the gesture information, and send the target bullet screen to the first terminal, so that the first terminal displays the target bullet screen.
Optionally, as shown in fig. 15, the target bullet screen sending module 23 includes:
an identifier obtaining unit 231, configured to obtain a user identifier of the first terminal;
the bullet screen generating unit 232 is configured to generate the target bullet screen, where the target bullet screen includes the user identifier and the gesture information.
Optionally, as shown in fig. 16, the apparatus 2 further includes:
and the feedback information display module 24 is configured to count the interaction feedback information corresponding to the target bullet screen, and display the target bullet screen and the interaction feedback information.
Optionally, the apparatus 2 further comprises:
an ending instruction obtaining module 25, configured to obtain an ending instruction of the input gesture bullet screen;
and an ending instruction sending module 26, configured to send the ending instruction to the first terminal, so that the target bullet screen is hidden by the first terminal after the target bullet screen is completely displayed.
Optionally, as shown in fig. 17, the apparatus 2 further includes:
a page turning instruction obtaining module 27, configured to obtain an input courseware page turning instruction;
a page turning instruction sending module 28, configured to send the courseware page turning instruction to the first terminal, so that the target barrage is hidden by the first terminal.
It should be noted that, when the bullet screen display device provided in the foregoing embodiment executes the bullet screen display method, only the division of the above functional modules is taken as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the above described functions. In addition, the bullet screen display device and the bullet screen display method provided by the above embodiments belong to the same concept, and the detailed implementation process is shown in the method embodiments, which is not described herein again.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the embodiment of the present application, the second terminal obtains the input trigger instruction of the gesture barrage and sends it to the first terminal, so that the first terminal collects a gesture image and determines the gesture information corresponding to the target gesture in that image; the second terminal then receives the gesture information sent by the first terminal, generates the corresponding target barrage, and sends it to the first terminal for display. Because the second terminal can generate and deliver a target barrage matching the gesture information, the bullet-screen display becomes more engaging and the concentration of the user corresponding to the first terminal is maintained.
An embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the bullet screen display method according to the embodiment shown in fig. 1 to 9, and a specific execution process may refer to specific descriptions of the embodiment shown in fig. 1 to 9, which is not described herein again.
The present application further provides a computer program product, where at least one instruction is stored in the computer program product, and the at least one instruction is loaded by the processor and executes the bullet screen display method according to the embodiment shown in fig. 1 to 9, where a specific execution process may refer to specific descriptions of the embodiment shown in fig. 1 to 9, and is not described herein again.
Please refer to fig. 18, which provides a schematic structural diagram of a first terminal according to an embodiment of the present application. As shown in fig. 18, the first terminal 1000 can include: at least one processor 1001, at least one network interface 1004, a user interface 1003, memory 1005, at least one communication bus 1002.
Wherein a communication bus 1002 is used to enable connective communication between these components.
The user interface 1003 may include a Display screen (Display) and a Camera (Camera); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface.
Optionally, the network interface 1004 may include a standard wired interface or a wireless interface (e.g., a Wi-Fi interface).
The processor 1001 may include one or more processing cores. Using various interfaces and lines, the processor 1001 connects the various parts of the entire first terminal 1000, and performs the various functions of the first terminal 1000 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 1005 and by calling the data stored in the memory 1005. Optionally, the processor 1001 may be implemented in at least one hardware form among Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 1001 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and so on; the GPU renders and draws the content to be displayed on the display screen; and the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 1001 but be implemented on a separate chip.
The memory 1005 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 1005 includes a non-transitory computer-readable storage medium. The memory 1005 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 1005 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like; the data storage area may store the data involved in the foregoing method embodiments. Optionally, the memory 1005 may also be at least one storage device located remotely from the processor 1001. As shown in fig. 18, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a bullet screen display application program.
In the first terminal 1000 shown in fig. 18, the user interface 1003 is mainly used to provide an input interface for the user and to acquire the data input by the user, and the processor 1001 may be configured to call the bullet screen display application program stored in the memory 1005 and specifically perform the following operations:
receiving a trigger instruction of a gesture barrage sent by a second terminal, and collecting an input gesture image;
identifying a target gesture corresponding to the gesture image, and determining gesture information corresponding to the target gesture;
sending the gesture information to the second terminal;
and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying the target bullet screen.
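Taken together, the four operations above amount to one request/response round on the first terminal. The following Python sketch is illustrative only: the patent specifies no API, so the function names, the trigger value, and the callback-based transport are all assumptions.

```python
# Hypothetical sketch of the first terminal's gesture-barrage round.
# Network I/O, camera capture, and gesture recognition are passed in as
# callables, since the patent leaves all of them unspecified.

def handle_gesture_barrage(trigger, capture_image, recognize,
                           send_info, receive_barrage, display):
    """Run one gesture-barrage round on the first terminal."""
    if trigger != "gesture_barrage":   # react only to the gesture-barrage trigger
        return None
    image = capture_image()            # collect the input gesture image
    info = recognize(image)            # identify target gesture -> gesture info
    send_info(info)                    # send gesture info to the second terminal
    barrage = receive_barrage()        # second terminal returns the target barrage
    display(barrage)                   # display the target barrage
    return barrage
```

In practice `capture_image`, `recognize`, and the two transport callbacks would wrap the camera, the gesture-recognition model, and the connection to the second terminal.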
In one embodiment, after receiving the trigger instruction of the gesture barrage sent by the second terminal, the processor 1001 further performs the following operation:
and displaying gesture guiding information corresponding to the triggering instruction.
In one embodiment, when collecting the input gesture image, the processor 1001 specifically performs the following operation:
and displaying timing information of preset time length, and collecting the gesture image in the preset time length.
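The timed-capture embodiment can be sketched as a countdown loop that samples frames until the preset duration elapses. This is a hypothetical Python sketch: `camera_read`, the sampling interval, and the countdown format are assumptions, since the patent only states that timing information of a preset duration is displayed and the gesture image is collected within that time.

```python
import time

def capture_with_countdown(camera_read, duration_s=3.0, show=print):
    """Display a countdown while sampling gesture frames for a preset duration."""
    frames = []
    deadline = time.monotonic() + duration_s
    while (remaining := deadline - time.monotonic()) > 0:
        show(f"{remaining:.1f}s remaining")  # the displayed timing information
        frames.append(camera_read())         # collect a gesture frame
        time.sleep(0.1)                      # sampling interval (an assumption)
    return frames
```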
In an embodiment, when identifying the target gesture corresponding to the gesture image and determining the gesture information corresponding to the target gesture, the processor 1001 specifically performs the following operations:
selecting a target gesture set from a preset gesture set based on the gesture use probability of the user of the first terminal;
searching the target gesture matched with the gesture image in the target gesture set;
and determining gesture information corresponding to the target gesture based on the corresponding relation between different gestures and different gesture information.
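The three recognition steps above suggest a two-stage lookup: first narrow the preset gesture set to the gestures this user most probably makes, then match the image only against those candidates and map the hit to gesture information. A minimal Python sketch, with the matcher, the probability table, and the candidate-set size as assumed inputs:

```python
def select_target_set(usage_prob, k=3):
    """Pick the k gestures the user produces most often.

    Assumption: the 'gesture use probability' in the text ranks candidates."""
    return sorted(usage_prob, key=usage_prob.get, reverse=True)[:k]

def recognize(gesture_image, usage_prob, matcher, info_table):
    """Match the image against the user's likely gestures, then map to gesture info."""
    candidates = select_target_set(usage_prob)
    target = next((g for g in candidates if matcher(gesture_image, g)), None)
    return info_table.get(target) if target else None
```

Restricting matching to a probability-ranked subset keeps the search small; gestures outside the candidate set simply yield no match.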
In an embodiment, when receiving the target barrage corresponding to the gesture information sent by the second terminal and displaying the target barrage, the processor 1001 specifically performs the following operation:
and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying and marking the target bullet screen, wherein the target bullet screen comprises the user identification of the first terminal and the gesture information.
In one embodiment, after the processor 1001 displays the target barrage, the following operations are further performed:
receiving an ending instruction of the gesture bullet screen sent by the second terminal;
judging whether the target bullet screen is completely displayed;
if not, continuing to display the target bullet screen until the display is complete, and hiding the target bullet screen;
and if so, hiding the target bullet screen.
In one embodiment, after the processor 1001 displays the target barrage, the following operations are further performed:
receiving a courseware page turning instruction sent by the second terminal;
and hiding the target bullet screen.
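The two hide paths described above can be summarized in one dispatch function: an end instruction defers hiding until the barrage has been displayed in full, while a courseware page-turn instruction hides it immediately. A hypothetical Python sketch; the instruction strings are illustrative:

```python
def handle_instruction(instruction, display_complete):
    """Decide how the first terminal hides the target barrage.

    `display_complete` is a hypothetical flag; the patent does not say
    how display progress is tracked."""
    if instruction == "courseware_page_turn":
        return "hide_now"                 # page turn hides the barrage immediately
    if instruction == "end_gesture_barrage":
        # finish displaying first if the barrage is not yet shown in full
        return "hide_now" if display_complete else "hide_after_complete"
    return "ignore"
```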
In this embodiment of the application, the user of the first terminal can input a gesture so that the first terminal displays the target barrage corresponding to that gesture, instead of typing text so that the first terminal displays a text barrage. This avoids the distraction caused by manually entering text; at the same time, inputting a gesture and displaying the corresponding barrage makes the barrage more interesting, which helps maintain the user's concentration.
Please refer to fig. 19, which provides a schematic structural diagram of a second terminal according to an embodiment of the present application. As shown in fig. 19, the second terminal 2000 may include: at least one processor 2001, at least one network interface 2004, a user interface 2003, memory 2005, at least one communication bus 2002.
The communication bus 2002 is used to implement connection communication between these components.
The user interface 2003 may include a display screen (Display) and a camera (Camera); optionally, the user interface 2003 may further include a standard wired interface and a wireless interface.
Optionally, the network interface 2004 may include a standard wired interface or a wireless interface (e.g., a Wi-Fi interface).
The processor 2001 may include one or more processing cores. Using various interfaces and lines, the processor 2001 connects the various parts of the entire second terminal 2000, and performs the various functions of the second terminal 2000 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 2005 and by calling the data stored in the memory 2005. Optionally, the processor 2001 may be implemented in at least one hardware form among Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 2001 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and so on; the GPU renders and draws the content to be displayed on the display screen; and the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 2001 but be implemented on a separate chip.
The memory 2005 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 2005 includes a non-transitory computer-readable storage medium. The memory 2005 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 2005 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like; the data storage area may store the data involved in the foregoing method embodiments. Optionally, the memory 2005 may also be at least one storage device located remotely from the processor 2001. As shown in fig. 19, the memory 2005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a bullet screen display application program.
In the second terminal 2000 shown in fig. 19, the user interface 2003 is mainly used to provide an input interface for the user and to acquire the data input by the user, and the processor 2001 may be configured to call the bullet screen display application program stored in the memory 2005 and specifically perform the following operations:
acquiring a trigger instruction of an input gesture bullet screen, and sending the trigger instruction to a first terminal so as to enable the first terminal to acquire a gesture image and determine gesture information corresponding to a target gesture in the gesture image;
receiving the gesture information sent by the first terminal;
and generating a target bullet screen corresponding to the gesture information, and sending the target bullet screen to the first terminal so that the first terminal displays the target bullet screen.
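The second terminal's side of the exchange mirrors the first terminal's: send the trigger, wait for the gesture information, build the target barrage, and send it back. A minimal Python sketch with the transport stubbed out as callbacks (an assumption, since the patent does not specify the channel):

```python
def run_gesture_barrage_round(send_trigger, receive_info,
                              make_barrage, send_barrage):
    """Second-terminal side of one gesture-barrage round."""
    send_trigger("gesture_barrage")   # trigger instruction to the first terminal
    info = receive_info()             # gesture information from the first terminal
    barrage = make_barrage(info)      # generate the target barrage
    send_barrage(barrage)             # send it back for display
    return barrage
```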
In one embodiment, when generating the target barrage corresponding to the gesture information, the processor 2001 specifically performs the following operations:
acquiring a user identifier of the first terminal;
and generating the target bullet screen, wherein the target bullet screen comprises the user identification and the gesture information.
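A target barrage that carries both the user identifier and the gesture information could be as simple as a small record. The dict layout and the rendered text format below are assumptions for illustration only:

```python
def make_barrage(user_id, gesture_info):
    """Build a target barrage carrying the first terminal's user
    identifier together with the gesture information."""
    return {
        "user_id": user_id,
        "gesture": gesture_info,
        "text": f"{user_id}: {gesture_info}",  # hypothetical display string
    }
```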
In one embodiment, after generating the target barrage corresponding to the gesture information, the processor 2001 further performs the following operation:
and counting the interactive feedback information corresponding to the target bullet screen, and displaying the target bullet screen and the interactive feedback information.
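Counting interactive feedback per target barrage is a straightforward tally. The event shape below is an assumption; the patent only states that feedback information is counted and displayed together with the barrage:

```python
from collections import Counter

def count_feedback(events):
    """Tally interactive feedback events (e.g. likes or replies)
    per target barrage; the 'barrage_id' key is hypothetical."""
    return Counter(e["barrage_id"] for e in events)
```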
In one embodiment, after generating the target barrage corresponding to the gesture information and sending it to the first terminal so that the first terminal displays the target barrage, the processor 2001 further performs the following operations:
acquiring an input ending instruction of the gesture bullet screen;
and sending the ending instruction to the first terminal so that the first terminal hides the target bullet screen after the target bullet screen is completely displayed.
In one embodiment, after generating the target barrage corresponding to the gesture information and sending it to the first terminal so that the first terminal displays the target barrage, the processor 2001 further performs the following operations:
acquiring an input courseware page turning instruction;
and sending the courseware page turning instruction to the first terminal so that the target barrage is hidden by the first terminal.
In this embodiment of the application, the second terminal obtains an input trigger instruction for the gesture barrage and sends the trigger instruction to the first terminal, so that the first terminal collects a gesture image and determines the gesture information corresponding to the target gesture in the gesture image. The second terminal then receives the gesture information sent by the first terminal, generates the target barrage corresponding to the gesture information, and sends the target barrage to the first terminal, so that the first terminal displays the target barrage. Because the second terminal can generate a target barrage corresponding to the gesture information and send it to the first terminal, the barrage display becomes more engaging, which helps maintain the concentration of the user of the first terminal.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory or a random access memory.
The above disclosure is only a preferred embodiment of the present application and is not intended to limit the scope of the present application; therefore, equivalent variations and modifications made in accordance with the present application still fall within its scope.

Claims (16)

1. A bullet screen display method, applied to a first terminal, characterized by comprising the following steps:
receiving a trigger instruction of a gesture barrage sent by a second terminal, and collecting an input gesture image;
identifying a target gesture corresponding to the gesture image, and determining gesture information corresponding to the target gesture;
sending the gesture information to the second terminal;
and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying the target bullet screen.
2. The method according to claim 1, wherein after receiving the triggering instruction of the gesture barrage sent by the second terminal, the method further comprises:
and displaying gesture guiding information corresponding to the triggering instruction.
3. The method of claim 1, wherein capturing the input gesture image comprises:
and displaying timing information of preset time length, and collecting the gesture image in the preset time length.
4. The method according to claim 1, wherein the recognizing a target gesture corresponding to the gesture image and determining gesture information corresponding to the target gesture comprise:
selecting a target gesture set from a preset gesture set based on the gesture use probability of the user of the first terminal;
searching the target gesture matched with the gesture image in the target gesture set;
and determining gesture information corresponding to the target gesture based on the corresponding relation between different gestures and different gesture information.
5. The method according to claim 1, wherein the receiving a target barrage corresponding to the gesture information sent by the second terminal and displaying the target barrage comprises:
and receiving a target bullet screen corresponding to the gesture information sent by the second terminal, and displaying and marking the target bullet screen, wherein the target bullet screen comprises the user identification of the first terminal and the gesture information.
6. The method of claim 1, wherein after displaying the target barrage, further comprising:
receiving an ending instruction of the gesture bullet screen sent by the second terminal;
judging whether the target bullet screen is completely displayed;
if not, continuing to display the target bullet screen until the display is complete, and hiding the target bullet screen;
and if so, hiding the target bullet screen.
7. The method of claim 1, wherein after displaying the target barrage, further comprising:
receiving a courseware page turning instruction sent by the second terminal;
and hiding the target bullet screen.
8. A bullet screen display method, applied to a second terminal, characterized by comprising the following steps:
acquiring a trigger instruction of an input gesture bullet screen, and sending the trigger instruction to a first terminal so as to enable the first terminal to acquire a gesture image and determine gesture information corresponding to a target gesture in the gesture image;
receiving the gesture information sent by the first terminal;
and generating a target bullet screen corresponding to the gesture information, and sending the target bullet screen to the first terminal so that the first terminal displays the target bullet screen.
9. The method of claim 8, wherein the generating of the target barrage corresponding to the gesture information comprises:
acquiring a user identifier of the first terminal;
and generating the target bullet screen, wherein the target bullet screen comprises the user identification and the gesture information.
10. The method according to claim 8, wherein after the generating of the target barrage corresponding to the gesture information, further comprising:
and counting the interactive feedback information corresponding to the target bullet screen, and displaying the target bullet screen and the interactive feedback information.
11. The method according to claim 8, wherein after generating a target barrage corresponding to the gesture information and sending the target barrage to the first terminal so that the first terminal displays the target barrage, the method further comprises:
acquiring an input ending instruction of the gesture bullet screen;
and sending the ending instruction to the first terminal so that the first terminal hides the target bullet screen after the target bullet screen is completely displayed.
12. The method according to claim 8, wherein after generating a target barrage corresponding to the gesture information and sending the target barrage to the first terminal so that the first terminal displays the target barrage, the method further comprises:
acquiring an input courseware page turning instruction;
and sending the courseware page turning instruction to the first terminal so that the target barrage is hidden by the first terminal.
13. A bullet screen display device, characterized in that the device comprises:
the gesture image acquisition module is used for receiving a trigger instruction of the gesture barrage sent by the second terminal and acquiring the input gesture image;
the gesture information determining module is used for identifying a target gesture corresponding to the gesture image and determining gesture information corresponding to the target gesture;
the gesture information sending module is used for sending the gesture information to the second terminal;
and the target bullet screen display module is used for receiving the target bullet screen corresponding to the gesture information sent by the second terminal and displaying the target bullet screen.
14. A bullet screen display device, characterized in that the device comprises:
the trigger instruction sending module is used for acquiring a trigger instruction of the input gesture barrage and sending the trigger instruction to the first terminal so as to enable the first terminal to collect a gesture image and determine gesture information corresponding to a target gesture in the gesture image;
the gesture information receiving module is used for receiving the gesture information sent by the first terminal;
and the target bullet screen sending module is used for generating a target bullet screen corresponding to the gesture information and sending the target bullet screen to the first terminal so that the first terminal displays the target bullet screen.
15. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to perform the method steps according to any of claims 1 to 7 or 8 to 12.
16. A terminal, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps according to any of claims 1-7 or 8-12.
CN202011588812.4A 2020-12-28 2020-12-28 Bullet screen display method and device, storage medium and terminal Pending CN112667081A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011588812.4A CN112667081A (en) 2020-12-28 2020-12-28 Bullet screen display method and device, storage medium and terminal

Publications (1)

Publication Number Publication Date
CN112667081A (en) 2021-04-16

Family

ID=75411711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011588812.4A Pending CN112667081A (en) 2020-12-28 2020-12-28 Bullet screen display method and device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN112667081A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019007281A1 (en) * 2017-07-04 2019-01-10 上海全土豆文化传播有限公司 Method for displaying on-screen comment, and client
CN109309878A (en) * 2017-07-28 2019-02-05 Tcl集团股份有限公司 The generation method and device of barrage
WO2019101038A1 (en) * 2017-11-22 2019-05-31 腾讯科技(深圳)有限公司 Bullet screen content control method, computer device and storage medium
CN110187764A (en) * 2019-05-29 2019-08-30 努比亚技术有限公司 A kind of barrage display methods, wearable device and storage medium
CN112004113A (en) * 2020-07-27 2020-11-27 北京大米科技有限公司 Teaching interaction method, device, server and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113315871A (en) * 2021-05-25 2021-08-27 广州三星通信技术研究有限公司 Mobile terminal and operating method thereof
CN113315871B (en) * 2021-05-25 2022-11-22 广州三星通信技术研究有限公司 Mobile terminal and operating method thereof
CN113823109A (en) * 2021-08-02 2021-12-21 阿波罗智联(北京)科技有限公司 Live broadcast method and device, electronic equipment and storage medium
CN113823109B (en) * 2021-08-02 2023-01-17 阿波罗智联(北京)科技有限公司 Live broadcast method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US10834479B2 (en) Interaction method based on multimedia programs and terminal device
CN105556594B (en) Voice recognition processing unit, voice recognition processing method and display device
JP6168544B2 (en) INTERACTION METHOD BASED ON MULTIMEDIA PROGRAM, TERMINAL DEVICE, AND SERVER
CN106101747B (en) A kind of barrage content processing method and application server, user terminal
CN109803152B (en) Violation auditing method and device, electronic equipment and storage medium
CN106658199A (en) Video content display method and apparatus
CN105435453A (en) Bullet screen information processing method, device and system
CN108304762B (en) Human body posture matching method and device, storage medium and terminal
CN112653902B (en) Speaker recognition method and device and electronic equipment
CN109086276B (en) Data translation method, device, terminal and storage medium
EP4047490A1 (en) Video-based interaction realization method and apparatus, device and medium
CN108777806B (en) User identity recognition method, device and storage medium
CN111010598B (en) Screen capture application method and smart television
CN112667081A (en) Bullet screen display method and device, storage medium and terminal
EP4191513A1 (en) Image processing method and apparatus, device and storage medium
CN107657469A (en) A kind of method for pushing of advertising message, device and set top box
CN113573090A (en) Content display method, device and system in game live broadcast and storage medium
CN111666820A (en) Speaking state recognition method and device, storage medium and terminal
CN111522524B (en) Presentation control method and device based on conference robot, storage medium and terminal
CN105808231B (en) System and method for recording and playing script
CN111741321A (en) Live broadcast control method, device, equipment and computer storage medium
CN112866577B (en) Image processing method and device, computer readable medium and electronic equipment
US20170134327A1 (en) Method and device for notifying mobile terminal of unread information
CN111523343B (en) Reading interaction method, device, equipment, server and storage medium
US20170139933A1 (en) Electronic Device, And Computer-Readable Storage Medium For Quickly Searching Video Segments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination