CN113873266A - Barrage display method, user terminal, server, device and storage medium - Google Patents

Barrage display method, user terminal, server, device and storage medium

Info

Publication number
CN113873266A
CN113873266A (application CN202010616903.8A)
Authority
CN
China
Prior art keywords
information
bullet screen
user
scene
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010616903.8A
Other languages
Chinese (zh)
Inventor
范涛
曾琦娟
唐健明
彭丹
张瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Chengdu ICT Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Chengdu ICT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Chengdu ICT Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN202010616903.8A
Publication of CN113873266A
Legal status: Pending (current)

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 - Server components or server architectures
    • H04N21/218 - Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 - Live feed
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 - Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258 - Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25866 - Management of end-user data
    • H04N21/25891 - Management of end-user data being end-user preferences
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/441 - Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card
    • H04N21/4415 - Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card using biometric characteristics of the user, e.g. by voice recognition or fingerprint scanning
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/478 - Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 - Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/488 - Data services, e.g. news ticker
    • H04N21/4884 - Data services, e.g. news ticker for displaying subtitles
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 - Monomedia components thereof
    • H04N21/8146 - Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics

Abstract

The embodiment of the invention provides a bullet screen display method, a user terminal, a server, a device and a storage medium. The method is applied to the user terminal and comprises the following steps: acquiring augmented reality (AR) scene information and trigger information input by a user; in response to the trigger information, sending a bullet screen acquisition request to the server, wherein the bullet screen acquisition request comprises the AR scene information and is used by the server to determine target bullet screen information corresponding to the AR scene information; receiving the target bullet screen information sent by the server; and displaying the target bullet screen information. With the embodiment of the invention, a user can view the bullet screen information posted by other users for the AR content, so that real-time information sharing is performed based on the AR content and the user experience is improved.

Description

Barrage display method, user terminal, server, device and storage medium
Technical Field
The present invention relates to the field of augmented reality technologies, and in particular, to a bullet screen display method, a user terminal, a server, a device, and a storage medium.
Background
Augmented reality (AR) projects virtual models into the real world through computation, so that the real environment and virtual objects are superimposed on the same picture or in the same space in real time, and people can interact with the virtual information.
However, in current AR systems a user cannot view other users' messages or comments on the AR content; for example, in AR teaching, messages or comments on the teaching content cannot be viewed between students, between teachers and students, or between teachers. As a result, it is difficult to share information based on the AR content, resulting in a poor user experience.
Disclosure of Invention
The embodiment of the invention provides a barrage display method, a user terminal, a server, a device and a storage medium, which enable a user to view the barrage information posted by other users for AR content and thus to share information in real time based on the AR content.
In a first aspect, an embodiment of the present invention provides a bullet screen display method, which is applied to a user terminal, and includes:
acquiring AR scene information and trigger information input by a user;
responding to the trigger information, sending a bullet screen acquisition request to the server, wherein the bullet screen acquisition request comprises AR scene information and is used for the server to determine target bullet screen information corresponding to the AR scene information;
receiving target bullet screen information sent by a server;
and displaying the target bullet screen information.
In some implementations of the first aspect, obtaining the trigger information input by the user includes:
acquiring trigger information input by a user in a voice mode; or,
acquiring trigger information input by a user through a key arranged on a user terminal; or,
acquiring trigger information input by a user through a touch control element arranged on a user terminal.
In some implementations of the first aspect, the method further comprises:
acquiring at least one item of terminal information of a user terminal and user information of a user;
the bullet screen acquiring request further comprises at least one item of terminal information and user information, and the bullet screen acquiring request is used for the server to determine target bullet screen information corresponding to the AR scene information and the at least one item of the terminal information and the user information.
In some implementations of the first aspect, the AR scene information includes at least one of a physical image, a virtual image, scene keyword information, and visual identification information.
In some implementations of the first aspect, the scene keyword information includes first scene keyword information input by a user through a voice manner and/or second scene keyword information acquired based on an AR scene.
In some implementations of the first aspect, the number of target barrage information is at least one;
displaying target bullet screen information, comprising:
acquiring target category information input by a user in a voice mode;
and determining target bullet screen information corresponding to the target category information in the at least one piece of target bullet screen information, and displaying the target bullet screen information corresponding to the target category information.
In a second aspect, an embodiment of the present invention provides a bullet screen display method, which is applied to a server, and includes:
receiving a barrage acquisition request sent by a user terminal, wherein the barrage acquisition request comprises AR scene information;
determining target bullet screen information corresponding to the AR scene information;
and sending the target bullet screen information to the user terminal so that the user terminal can display the target bullet screen information.
In some realizations of the second aspect, the bullet screen acquiring request further includes at least one of terminal information of the user terminal and user information of the user;
determining target barrage information corresponding to the AR scene information, including:
and determining target bullet screen information corresponding to at least one of the AR scene information, the terminal information and the user information.
In some implementations of the second aspect, the AR scene information includes at least one of a physical image, a virtual image, scene keyword information, and visual identification information.
In some realizations of the second aspect, the scene keyword information includes first scene keyword information input by a user through a voice manner and/or second scene keyword information acquired based on an AR scene.
In some implementations of the second aspect, determining the target barrage information corresponding to the AR scene information includes:
and determining target bullet screen information corresponding to the AR scene information in a bullet screen database, wherein the bullet screen database comprises bullet screen information input by a user and/or bullet screen information obtained from a resource database.
In a third aspect, an embodiment of the present invention provides a user terminal, where the user terminal includes:
the acquisition module is used for acquiring AR scene information and trigger information input by a user;
the sending module is used for responding to the trigger information and sending a bullet screen obtaining request to the server, wherein the bullet screen obtaining request comprises AR scene information and is used for the server to determine target bullet screen information corresponding to the AR scene information;
the receiving module is used for receiving the target bullet screen information sent by the server;
and the display module is used for displaying the target bullet screen information.
In some implementations of the third aspect, the obtaining module is specifically configured to: acquiring trigger information input by a user in a voice mode; or acquiring trigger information input by a user through a key arranged on the user terminal; or acquiring the trigger information input by the user through a touch control element arranged on the user terminal.
In some implementations of the third aspect, the obtaining module is further configured to obtain at least one of terminal information of the user terminal and user information of the user; the bullet screen acquiring request further comprises at least one item of terminal information and user information, and the bullet screen acquiring request is used for the server to determine target bullet screen information corresponding to the AR scene information and the at least one item of the terminal information and the user information.
In some implementations of the third aspect, the AR scene information includes at least one of a physical image, a virtual image, scene keyword information, and visual identification information.
In some implementations of the third aspect, the scene keyword information includes first scene keyword information input by a user through a voice manner and/or second scene keyword information acquired based on the AR scene.
In some implementations of the third aspect, the number of target barrage information is at least one;
the acquisition module is also used for acquiring target category information input by a user in a voice mode;
the display module is specifically configured to: and determining target bullet screen information corresponding to the target category information in the at least one piece of target bullet screen information, and displaying the target bullet screen information corresponding to the target category information.
In a fourth aspect, an embodiment of the present invention provides a server, where the server includes:
the receiving module is used for receiving a barrage acquiring request sent by a user terminal, wherein the barrage acquiring request comprises AR scene information;
the determining module is used for determining target barrage information corresponding to the AR scene information;
and the sending module is used for sending the target bullet screen information to the user terminal so that the user terminal can display the target bullet screen information.
In some implementations of the fourth aspect, the barrage acquisition request further includes at least one of terminal information of the user terminal and user information of the user;
the determination module is specifically configured to: and determining target bullet screen information corresponding to at least one of the AR scene information, the terminal information and the user information.
In some implementations of the fourth aspect, the AR scene information includes at least one of a physical image, a virtual image, scene keyword information, and visual identification information.
In some implementations of the fourth aspect, the scene keyword information includes first scene keyword information input by a user through a voice manner and/or second scene keyword information acquired based on the AR scene.
In some implementations of the fourth aspect, the determining module is specifically configured to: and determining target bullet screen information corresponding to the AR scene information in a bullet screen database, wherein the bullet screen database comprises bullet screen information input by a user and/or bullet screen information obtained from a resource database.
In a fifth aspect, an embodiment of the present invention provides a bullet screen display device, including: a processor and a memory storing computer program instructions; the bullet screen display method described in the first aspect or any of the realizable manners of the first aspect is implemented when the processor executes the computer program instructions, or the bullet screen display method described in the second aspect or any of the realizable manners of the second aspect is implemented when the processor executes the computer program instructions.
In a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium, where computer program instructions are stored on the computer-readable storage medium, and when executed by a processor, the computer program instructions implement the bullet screen display method in the first aspect or any of the realizable manners of the first aspect, or when executed by a processor, the computer program instructions implement the bullet screen display method in the second aspect or any of the realizable manners of the second aspect.
According to the bullet screen display method, the user terminal, the server, the device, and the storage medium of the embodiments of the present invention, the user terminal acquires AR scene information and trigger information input by a user and, in response to the trigger information, sends a bullet screen acquisition request including the AR scene information to the server. The server determines, according to the bullet screen acquisition request, the target bullet screen information corresponding to the AR scene information, so that target bullet screen information associated with the AR content can be determined accurately, and sends the target bullet screen information to the user terminal, which displays it to the user. In this way, the user can view the bullet screen information posted by other users for the AR content, real-time information can be shared based on the AR content, and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments of the present invention will be briefly described below, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic structural diagram of a bullet screen display system according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a bullet screen display method according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of another bullet screen display method according to an embodiment of the present invention;
fig. 4 is a schematic diagram of constructing a bullet screen database according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of another bullet screen display method according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a bullet screen display effect according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a user terminal according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a server according to an embodiment of the present invention;
fig. 9 is a schematic diagram of a hardware structure of a bullet screen display device according to an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present invention will be described in detail below, and in order to make objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It will be apparent to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present invention by illustrating examples of the present invention.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
At present, AR technology can be applied to scenarios such as teaching, meetings, and concerts. For example, in AR teaching, AR technology can present related virtual scenes in combination with courseware content and show complex, abstract concepts in a vivid manner, helping students strengthen their grasp of the knowledge, mobilizing their enthusiasm, and thereby improving learning efficiency. However, in AR teaching, messages or comments on the teaching content cannot be viewed between students, between teachers and students, or between teachers. More generally, it is difficult for current AR technology to share information based on AR content, resulting in a poor user experience.
As a mode of information interaction, the barrage (bullet screen) allows different viewers to exchange viewpoints while watching the same content and to obtain information from more viewing angles. A traditional barrage scheme is usually applied to online video playback and uses a timeline parameter as the matching parameter, so that the instant barrage content matched with a given time is played at that time. However, AR content is rendered in real time and the AR content displayed each time is not repeated, so a timeline parameter used as the matching parameter cannot adapt to AR content rendered in real time.
In view of the above, embodiments of the present invention provide a bullet screen display method, a user terminal, a server, a device, and a storage medium. The user terminal may obtain AR scene information and trigger information input by a user and, in response to the trigger information, send a bullet screen acquisition request including the AR scene information to the server. The server determines, according to the bullet screen acquisition request, the target bullet screen information corresponding to the AR scene information, so that target bullet screen information associated with the AR content can be determined accurately, and sends the target bullet screen information to the user terminal, which displays it to the user. In this way, the user can view the bullet screen information, that is, the messages or comments, posted by other users for the AR content, and real-time information sharing can then be performed based on the AR content, improving the user experience.
As shown in fig. 1, the bullet screen display method according to the embodiment of the present invention is applied to a bullet screen display system that includes a user terminal and a server. The user terminal is communicatively connected to the server and can send a bullet screen acquisition request to the server; the server responds to the bullet screen acquisition request by returning the target bullet screen information matched with the request, and the user terminal renders the target bullet screen information and displays it to the user. The user terminal may be an AR device such as AR glasses.
The bullet screen display method provided by the embodiment of the invention is described below with reference to the accompanying drawings:
fig. 2 is a schematic flowchart of a bullet screen display method according to an embodiment of the present invention, and as shown in fig. 2, the bullet screen display method may include S210 to S240.
S210, the user terminal acquires AR scene information and trigger information input by the user.
The AR scene information can be obtained based on AR content rendered by the user terminal and can comprise at least one of a real object image, a virtual image, scene keyword information and visual identification information.
The real object image may comprise a screenshot of a real object in the AR content; the virtual image may comprise one or more frames of the virtual imagery; the scene keyword information may include first scene keyword information input by the user by voice and/or second scene keyword information acquired based on the AR scene; the visual identification information may be information obtained by the user terminal based on a preset visual identifier having an identification function, and optionally, the visual identifier may be set in courseware in AR teaching.
As an example, the first scene keyword information may include keywords uttered by the user after viewing the AR content. For example, the user sees a map of city A and says "specialties of city A"; the user terminal may then capture the user's voice through a microphone, process it with a preset keyword acoustic model, and recognize "city A" and "specialties" as keywords. The second scene keyword information may include keywords captured by the user terminal based on the AR scene; for example, in AR teaching, keywords such as subject, chapter, page, and knowledge point may be obtained from the content of the courseware displayed in the AR teaching.
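To make the data flow concrete, the following is a minimal sketch, in Python, of how the AR scene information described above might be packaged and how the two kinds of scene keywords might be merged; the class name, field names, and courseware metadata keys are illustrative assumptions, not part of the embodiment.

```python
# Minimal sketch (hypothetical names) of the AR scene information described above:
# real object image, virtual image frame, scene keywords, and a visual identifier.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ARSceneInfo:
    physical_image: Optional[bytes] = None       # screenshot of the real object in the AR content
    virtual_image: Optional[bytes] = None        # one or more frames of the virtual imagery
    scene_keywords: List[str] = field(default_factory=list)
    visual_marker_id: Optional[str] = None       # from a preset visual identifier, e.g. in courseware

def collect_scene_keywords(speech_keywords: List[str], courseware_meta: dict) -> List[str]:
    """Merge first scene keywords (recognized from the user's speech) with second
    scene keywords captured from the AR scene, e.g. subject/chapter/page/knowledge point."""
    second = [str(courseware_meta[k]) for k in ("subject", "chapter", "page", "knowledge_point")
              if courseware_meta.get(k) is not None]
    return list(dict.fromkeys(speech_keywords + second))  # de-duplicate while keeping order
```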
In some embodiments, the user terminal may obtain the trigger information input by the user by voice. For example, the user says "open barrage"; the user terminal may then capture the user's voice through a microphone, process the voice content with a preset keyword acoustic model, recognize the trigger keyword "open barrage", and use the trigger keyword as the trigger information. In this way the trigger operation is simplified through voice recognition, and the immersive experience of the user is improved.
Optionally, in other embodiments, the user terminal may obtain the trigger information input by the user through a key provided on the user terminal, for example, a bullet screen opening/closing key is provided on the user terminal, and the user may manually input the trigger information through the bullet screen opening/closing key when the user wants to view the bullet screen.
Optionally, in other embodiments, the user terminal may obtain the trigger information input by the user through a touch element disposed on the user terminal.
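A simplified sketch of the voice-triggered case follows. The embodiment uses a preset keyword acoustic model on the captured audio; here the audio is assumed to be already transcribed, and the trigger phrases and return values are illustrative assumptions.

```python
# Simplified trigger detection over an already-recognized transcript
# (the keyword acoustic model itself is outside the scope of this sketch).
from typing import Optional

TRIGGER_KEYWORDS = {"open barrage": "OPEN", "close barrage": "CLOSE"}  # hypothetical phrases

def detect_trigger(transcript: str) -> Optional[str]:
    text = transcript.lower()
    for phrase, action in TRIGGER_KEYWORDS.items():
        if phrase in text:
            return action
    return None

# Example: detect_trigger("please open barrage") returns "OPEN".
```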
And S220, responding to the trigger information, and sending a bullet screen acquisition request to the server by the user terminal.
In response to the trigger information, the user terminal may construct a bullet screen acquisition request, where the bullet screen acquisition request includes AR scene information.
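As an illustration, the request construction and transmission could look like the sketch below; the endpoint path, JSON field names, and use of HTTP are assumptions made for the example and are not specified by the embodiment.

```python
# Sketch of building the bullet screen acquisition request from ARSceneInfo
# and sending it to the server (hypothetical endpoint and field names).
import base64
import json
from urllib import request as urlrequest

def send_barrage_request(server_url: str, scene: "ARSceneInfo") -> dict:
    payload = {
        "scene_keywords": scene.scene_keywords,
        "visual_marker_id": scene.visual_marker_id,
        "physical_image": base64.b64encode(scene.physical_image).decode() if scene.physical_image else None,
        "virtual_image": base64.b64encode(scene.virtual_image).decode() if scene.virtual_image else None,
    }
    req = urlrequest.Request(
        server_url + "/barrage/acquire",                 # hypothetical endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urlrequest.urlopen(req) as resp:
        return json.loads(resp.read())                   # the target bullet screen information
```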
And S230, the server determines target bullet screen information corresponding to the AR scene information.
And the server receives the bullet screen acquisition request, and determines target bullet screen information corresponding to the AR scene information according to the bullet screen acquisition request.
In some embodiments, the server may determine target bullet screen information corresponding to the AR scene information in a bullet screen database, where the bullet screen database includes bullet screen information input by the user and/or bullet screen information obtained from a resource database, and high-quality target bullet screen information may be obtained from the bullet screen database.
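One way the server-side matching could be realized is sketched below: each barrage record in the barrage database is scored by keyword overlap with the AR scene information, with a bonus for a matching visual identifier. The scoring rule and record fields are assumptions for illustration only.

```python
# Illustrative matching of barrage records against the AR scene information.
from typing import Dict, List, Optional

def match_barrages(scene_keywords: List[str], marker_id: Optional[str],
                   barrage_db: List[Dict]) -> List[Dict]:
    def score(record: Dict) -> int:
        s = len(set(scene_keywords) & set(record.get("keywords", [])))
        if marker_id and record.get("marker_id") == marker_id:
            s += 5  # treat a visual identifier match as a strong signal
        return s
    ranked = sorted(barrage_db, key=score, reverse=True)
    return [r for r in ranked if score(r) > 0]
```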
S240, the server sends the target bullet screen information to the user terminal.
And S250, displaying the target bullet screen information by the user terminal.
And the user terminal receives the target bullet screen information, renders the target bullet screen information and displays the target bullet screen information to the user.
In some embodiments, the user terminal may obtain target category information input by the user by voice; for example, the target category information may indicate the source of the bullet screen information. The user terminal then determines, from the at least one piece of target bullet screen information, the target bullet screen information corresponding to the target category information and displays it. That is, the target bullet screen information can be filtered according to the target category information, which avoids the poor display effect caused by too much target bullet screen information and improves the display effect of the target bullet screen information.
Obtaining the target category information input by voice is similar to obtaining the voice-input trigger information in S210; for brevity, the details are not repeated here.
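The category filtering just described can be sketched as follows, assuming each piece of target bullet screen information carries a source field; the field name and category values are hypothetical.

```python
# Filter the received target bullet screen information by the spoken target category,
# e.g. "teacher", "classmates", "mine" (hypothetical category values).
def filter_by_category(barrages: list, target_category: str) -> list:
    return [b for b in barrages if b.get("source") == target_category]

# Example: filter_by_category(target_barrages, "teacher") keeps only the teacher's remarks.
```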
Optionally, in other embodiments, since the AR content is displayed three-dimensionally, the user terminal may obtain its own direction information, specifically from its direction sensors, which include but are not limited to a gyroscope and a level meter. The target bullet screen information is then displayed in the area corresponding to the current direction information, so that it can be displayed in the direction in which the user is looking, improving the viewing experience of the user.
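A rough sketch of direction-based placement is given below, mapping a gyroscope yaw reading to a display region; the sector boundaries are arbitrary illustration values rather than values from the embodiment.

```python
# Map the terminal's current yaw (degrees) to a coarse display region
# so that barrages are rendered where the user is looking.
def display_region_for_yaw(yaw_degrees: float) -> str:
    yaw = yaw_degrees % 360
    if yaw < 45 or yaw >= 315:
        return "front"
    if yaw < 135:
        return "right"
    if yaw < 225:
        return "back"
    return "left"
```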
In the embodiment of the present invention, the user terminal can acquire AR scene information and trigger information input by the user and, in response to the trigger information, send a bullet screen acquisition request including the AR scene information to the server. The server determines, according to the bullet screen acquisition request, the target bullet screen information corresponding to the AR scene information, so that target bullet screen information associated with the AR content can be determined accurately, and sends it to the user terminal, which displays it to the user. In this way, the user can view the bullet screen information, that is, the messages or comments, posted by other users for the AR content, and real-time information sharing can be performed based on the AR content, improving the user experience.
In some embodiments, the bullet screen display method may further include the steps of:
the user terminal acquires at least one item of terminal information of the user terminal and user information of the user. The terminal information may include a terminal type, network information, a terminal power level, terminal display area information, and the like, where the network information may refer to a network type of the terminal, for example, the 4th Generation mobile communication technology (4G), the 5th-Generation mobile communication technology (5G), or wireless internet access (WiFi); the user information may include user identification, user preferences, user gender, etc., and may also include user grade, user class, user achievements, etc., provided the method is applied in AR teaching.
Optionally, when the user terminal acquires the terminal information, the bullet screen acquisition request further includes the terminal information, and the server may determine target bullet screen information corresponding to the AR scene information and the terminal information.
When the user terminal acquires the user information, the bullet screen acquisition request further includes the user information, and the server can determine target bullet screen information corresponding to the AR scene information and the user information.
When the user terminal acquires the terminal information and the user information, the bullet screen acquisition request further comprises the terminal information and the user information, and the server can determine target bullet screen information corresponding to the AR scene information, the terminal information and the user information.
Thus, the target bullet screen information can be determined more accurately according to more kinds of information.
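As a sketch of how the optional terminal information and user information could refine the result, the function below applies a grade filter and a display-capacity cut; both rules and all field names are assumptions for illustration.

```python
# Refine matched barrages using optional terminal information and user information.
from typing import Dict, List, Optional

def refine_barrages(barrages: List[Dict],
                    terminal_info: Optional[Dict] = None,
                    user_info: Optional[Dict] = None) -> List[Dict]:
    result = barrages
    if user_info and user_info.get("grade") is not None:
        # keep barrages with no grade tag or the same grade as the user
        result = [b for b in result if b.get("grade") in (None, user_info["grade"])]
    if terminal_info and terminal_info.get("max_display_items"):
        # respect the terminal's display area by truncating the list
        result = result[: terminal_info["max_display_items"]]
    return result
```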
In some embodiments, an operation option may be provided in the display area of the target bullet screen information, and the user may edit or annotate the target bullet screen information, for example, to express agreement or disagreement with it. Taking AR teaching as an example, when a student agrees with target barrage information A, which raises a question about knowledge point A, the student can directly express agreement through the operation option, and the teacher can give a supplementary explanation after seeing this information.
In some embodiments, the bullet screen display method may further include the steps of:
the user terminal acquires closing information input by a user, and stops bullet screen display in response to the closing information. The obtaining of the closing information input by the user is similar to the obtaining of the triggering information input by the user in S210, and for brevity, the details are not repeated here.
In the following, the bullet screen display method according to an embodiment of the present invention is described by taking its application to AR teaching as an example. As shown in fig. 3, the bullet screen display method may include the following steps:
Step 1: when the user terminal is started, it sends a system initialization request to the server.
Step 2: the server responds to the system initialization request and sends initialization parameters to the user terminal, so that the user terminal completes its own initialization according to the initialization parameters.
Step 3: the user terminal acquires a trigger keyword input by the user by voice.
Step 4: in response to the trigger keyword, the user terminal prompts the user that the bullet screen information is being acquired, for example by sending out a voice prompt of "bullet screen acquisition in progress"; the voice content can be adjusted flexibly and is not limited here.
Step 5: the user terminal acquires the AR scene information, the terminal information, and the user information, and constructs a bullet screen acquisition request from this information.
Step 6: the user terminal sends the bullet screen acquisition request to the server.
Step 7: the server parses the bullet screen acquisition request to obtain the AR scene information, the terminal information, and the user information, queries the bullet screen database, and determines, in the bullet screen database, the target bullet screen information corresponding to the AR scene information, the terminal information, and the user information.
As an example, the bullet screen database may be constructed as shown in fig. 4, through the following steps:
First, the server acquires bullet screen information. On the one hand, the teacher can input bullet screen information directly through an editor or through a pre-entry form. On the other hand, artificial intelligence (AI) mining may be performed on a resource database, which includes but is not limited to a homework system, an examination system, and a learning forum, to generate barrage information; the mined content may include knowledge points, wrongly answered questions, key solution methods, and the like, and mining may optionally be performed by keyword matching. In addition, students can input bullet screen information online in real time while the AR content is being displayed. A bullet screen database is then constructed from the acquired bullet screen information.
Step 8: the server sends the target bullet screen information to the user terminal.
Step 9: the user terminal renders the target bullet screen information in batches and displays it to the user.
Through steps 1-9, the teacher and the students can exchange information with one another about the same displayed content in AR teaching, improving the teaching effect.
During the display of the target barrage information, when the user wants to stop displaying it, the user may say a closing keyword for stopping the display. On the basis of fig. 4, as shown in fig. 5, the barrage display method may further include the following steps:
and step 10, acquiring a closing keyword input by a user in a voice mode.
And step 11, responding to the closing keywords, and stopping displaying the target bullet screen information.
Fig. 6 is a schematic diagram of a bullet screen display effect provided by an embodiment of the present invention, applied to AR teaching. As shown in fig. 6, a student wears AR glasses to view a chemical molecular structure, and 5 pieces of bullet screen information are displayed. The first piece of bullet screen information, "number of examinations in the last half year: 5 times", comes from the resource database; the second and third pieces, "count one carbon wherever there is an inflection point (corner)" and "subtract from four to get the hydrogens: the number of hydrogen atoms on a carbon is calculated simply by subtracting the number of lines on that carbon from 4", come from the chemistry teacher; and the fourth and fifth pieces, "select the main chain and determine the alkane; number the carbon positions and determine the branches and substituents; write them in front, note the positions, and connect with short lines", come from scholar A and scholar B.
Further, in AR teaching, when a student says "see my notes", the AR glasses display the student's own notes (i.e., bullet screen information); when the student says "see classmates' notes", the AR glasses display the classmates' notes (i.e., bullet screen information); and when the student says "see teacher remarks", the AR glasses display the remarks (i.e., bullet screen information) of the teacher giving the lesson. In this way, bullet screen information of a specified category can be displayed, improving the display effect.
Based on the bullet screen display method of the embodiment of the present invention, an embodiment of the present invention further provides a user terminal, as shown in fig. 7, a user terminal 700 may include: an obtaining module 710, a sending module 720, a receiving module 730, and a displaying module 740.
The obtaining module 710 is configured to obtain AR scene information and trigger information input by a user.
The sending module 720 is configured to send a bullet screen obtaining request to the server in response to the trigger information, where the bullet screen obtaining request includes AR scene information, so that the server determines target bullet screen information corresponding to the AR scene information.
The receiving module 730 is configured to receive the target barrage information sent by the server.
And the display module 740 is configured to display the target bullet screen information.
In some embodiments, the obtaining module 710 is specifically configured to: acquiring trigger information input by a user in a voice mode, or acquiring trigger information input by the user through a key arranged on a user terminal, or acquiring trigger information input by the user through a touch control element arranged on the user terminal.
In some embodiments, the obtaining module 710 is further configured to obtain at least one of terminal information of the user terminal and user information of the user. The bullet screen obtaining request further comprises at least one item of terminal information and user information, and the bullet screen obtaining request is used for the server to determine target bullet screen information corresponding to the AR scene information and the at least one item of the terminal information and the user information.
In some embodiments, the AR scene information includes at least one of a physical image, a virtual image, scene keyword information, visual identification information.
In some embodiments, the scene keyword information includes first scene keyword information input by a user through a voice manner and/or second scene keyword information acquired based on an AR scene.
In some embodiments, the number of target bullet screen information is at least one.
The obtaining module 710 is further configured to obtain target category information input by the user in a voice manner.
The display module 740 is specifically configured to: and determining target bullet screen information corresponding to the target category information in the at least one piece of target bullet screen information, and displaying the target bullet screen information corresponding to the target category information.
It can be understood that each module/unit in the user terminal 700 shown in fig. 7 has a function of implementing each step executed by the user terminal in fig. 2, and can achieve the corresponding technical effect, and for brevity, no further description is provided herein.
Fig. 8 is a schematic structural diagram of a server according to an embodiment of the present invention, and as shown in fig. 8, the server 800 may include: a receiving module 810, a determining module 820, and a sending module 830.
The receiving module 810 is configured to receive a bullet screen obtaining request sent by a user terminal, where the bullet screen obtaining request includes AR scene information.
And a determining module 820, configured to determine target barrage information corresponding to the AR scene information.
The sending module 830 is configured to send the target bullet screen information to the user terminal, so that the user terminal displays the target bullet screen information.
In some embodiments, the bullet screen acquiring request further includes at least one of terminal information of the user terminal and user information of the user.
The determining module 820 is specifically configured to: and determining target bullet screen information corresponding to at least one of the AR scene information, the terminal information and the user information.
In some embodiments, the AR scene information includes at least one of a physical image, a virtual image, scene keyword information, visual identification information.
In some embodiments, the scene keyword information includes first scene keyword information input by a user through a voice manner and/or second scene keyword information acquired based on an AR scene.
In some embodiments, the determining module 820 is specifically configured to: and determining target bullet screen information corresponding to the AR scene information in a bullet screen database, wherein the bullet screen database comprises bullet screen information input by a user and/or bullet screen information obtained from a resource database.
It can be understood that each module/unit in the server 800 shown in fig. 8 has a function of implementing each step executed by the server in fig. 2, and can achieve the corresponding technical effect, and for brevity, no further description is provided herein.
Fig. 9 is a schematic diagram of a hardware structure of a bullet screen display device according to an embodiment of the present invention.
As shown in fig. 9, the bullet screen display device 900 in the present embodiment includes an input device 901, an input interface 902, a central processing unit 903, a memory 904, an output interface 905, and an output device 906. The input interface 902, the central processing unit 903, the memory 904, and the output interface 905 are connected to each other through a bus 910, and the input device 901 and the output device 906 are connected to the bus 910 through the input interface 902 and the output interface 905, respectively, and further connected to other components of the bullet screen display device 900.
Specifically, the input device 901 receives input information from the outside, and transmits the input information to the central processor 903 through the input interface 902; central processor 903 processes input information based on computer-executable instructions stored in memory 904 to generate output information, stores the output information temporarily or permanently in memory 904, and then transmits the output information to output device 906 via output interface 905; the output device 906 outputs the output information to the outside of the bullet screen display device 900 for use by the user.
In some embodiments, the bullet screen display device 900 shown in fig. 9 includes: a memory 904 for storing programs; and a processor 903, configured to execute a program stored in the memory to execute the bullet screen display method provided in the embodiment shown in fig. 2.
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium has computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement the bullet screen display method provided by the embodiment shown in fig. 2.
It should be clear that each embodiment in this specification is described in a progressive manner, and the same or similar parts among the embodiments may be referred to each other, and for brevity, the description is omitted. The invention is not limited to the specific configurations and processes described above and shown in the figures. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present invention are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications and additions or change the order between the steps after comprehending the spirit of the present invention.
The functional blocks shown in the above-described structural block diagrams may be implemented as hardware, software, firmware, or a combination thereof. When implemented in hardware, it may be, for example, an electronic circuit, an application specific integrated circuit (ASIC), suitable firmware, a plug-in, a function card, or the like. When implemented in software, the elements of the invention are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link. A "machine-readable medium" may include any medium that can store or transfer information. Examples of machine-readable media include electronic circuits, semiconductor memory devices, read-only memories (ROMs), flash memories, erasable ROMs (EROMs), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, radio frequency (RF) links, and so forth. The code segments may be downloaded via computer networks such as the Internet or an intranet.
It should also be noted that the exemplary embodiments mentioned in this patent describe some methods or systems based on a series of steps or devices. However, the present invention is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
As described above, only the specific embodiments of the present invention are provided, and it can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the module and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It should be understood that the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the present invention, and these modifications or substitutions should be covered within the scope of the present invention.

Claims (15)

1. A bullet screen display method is applied to a user terminal, and comprises the following steps:
acquiring augmented reality AR scene information and trigger information input by a user;
responding to the trigger information, sending a bullet screen acquisition request to a server, wherein the bullet screen acquisition request comprises the AR scene information and is used for the server to determine target bullet screen information corresponding to the AR scene information;
receiving the target bullet screen information sent by the server;
and displaying the target bullet screen information.
2. The method of claim 1, wherein obtaining user-entered triggering information comprises:
acquiring the trigger information input by the user in a voice mode; or,
acquiring the trigger information input by the user through a key arranged on the user terminal; or,
and acquiring the trigger information input by the user through a touch control element arranged on the user terminal.
3. The method of claim 1, further comprising:
acquiring at least one item of terminal information of the user terminal and user information of the user;
the bullet screen acquiring request further comprises at least one item of the terminal information and the user information, so that the server determines the target bullet screen information corresponding to the AR scene information and the at least one item of the terminal information and the user information.
4. The method of any of claims 1-3, wherein the AR scene information comprises at least one of a physical image, a virtual image, scene keyword information, and visual identification information.
5. The method according to claim 4, wherein the scene keyword information comprises first scene keyword information input by the user through a voice manner and/or second scene keyword information obtained based on an AR scene.
6. The method according to any one of claims 1-3, 5, wherein the number of the target barrage information is at least one;
the displaying the target bullet screen information comprises:
acquiring target category information input by the user in a voice mode;
and determining the target bullet screen information corresponding to the target category information in at least one piece of target bullet screen information, and displaying the target bullet screen information corresponding to the target category information.
7. A bullet screen display method is applied to a server and comprises the following steps:
receiving a barrage acquisition request sent by a user terminal, wherein the barrage acquisition request comprises AR scene information;
determining target barrage information corresponding to the AR scene information;
and sending the target bullet screen information to the user terminal so that the user terminal can display the target bullet screen information.
8. The method according to claim 7, wherein the barrage acquisition request further comprises at least one of terminal information of the user terminal and user information of a user;
the determining of the target barrage information corresponding to the AR scene information includes:
and determining the target bullet screen information corresponding to the AR scene information and at least one of the terminal information and the user information.
9. The method of claim 7 or 8, wherein the AR scene information comprises at least one of a physical image, a virtual image, scene keyword information, and visual identification information.
10. The method according to claim 9, wherein the scene keyword information comprises first scene keyword information input by a user through a voice manner and/or second scene keyword information obtained based on an AR scene.
11. The method according to claim 7 or 10, wherein the determining target barrage information corresponding to the AR scene information comprises:
and determining the target bullet screen information corresponding to the AR scene information in a bullet screen database, wherein the bullet screen database comprises bullet screen information input by a user and/or bullet screen information obtained from a resource database.
12. A user terminal, characterized in that the user terminal comprises:
the acquisition module is used for acquiring AR scene information and trigger information input by a user;
a sending module, configured to send, in response to the trigger information, a barrage acquisition request to a server, where the barrage acquisition request includes the AR scene information, so that the server determines target barrage information corresponding to the AR scene information;
the receiving module is used for receiving the target bullet screen information sent by the server;
and the display module is used for displaying the target bullet screen information.
13. A server, characterized in that the server comprises:
the system comprises a receiving module, a judging module and a sending module, wherein the receiving module is used for receiving a barrage obtaining request sent by a user terminal, and the barrage obtaining request comprises AR scene information;
the determining module is used for determining target barrage information corresponding to the AR scene information;
and the sending module is used for sending the target bullet screen information to the user terminal so that the user terminal can display the target bullet screen information.
14. A bullet screen display device, characterized in that said device comprises: a processor and a memory storing computer program instructions; the processor, when executing the computer program instructions, implements the bullet screen display method of any one of claims 1-11.
15. A computer-readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the bullet screen display method of any one of claims 1 to 11.
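
The method and apparatus claims above lend themselves to short illustrative sketches. The first is a minimal Python sketch, under purely assumed names (BulletScreenServer, visual_id, language and the rest appear nowhere in the application), of the server-side flow of claims 7 and 8, which also corresponds to the receiving, determining and sending modules of claim 13: receive a bullet screen acquisition request carrying AR scene information, look up the matching bullet screen information, optionally narrow it using the user information carried by the request, and return it for the terminal to display.

    # Hypothetical sketch of the server-side flow of claims 7-8 / claim 13.
    from typing import Dict, List, Optional

    class BulletScreenServer:
        def __init__(self, bullet_screen_db: Dict[str, List[dict]]):
            # Maps an AR scene key (here: a visual identification string,
            # cf. claim 9) to the bullet screens stored for that scene.
            self.bullet_screen_db = bullet_screen_db

        def handle_request(self, request: dict) -> List[str]:
            # Receiving step: the request carries AR scene information.
            ar_scene = request["ar_scene"]
            # Determining step: find bullet screens for this AR scene.
            candidates = self.bullet_screen_db.get(ar_scene.get("visual_id", ""), [])
            # Claim 8: optionally narrow the result by user information;
            # terminal information could be matched against the stored
            # entries in exactly the same way.
            user_info: Optional[dict] = request.get("user_info")
            selected = []
            for item in candidates:
                wanted_language = item.get("language")
                if user_info and wanted_language and wanted_language != user_info.get("language"):
                    continue
                selected.append(item["text"])
            # Sending step: return the target bullet screen information
            # so the user terminal can display it.
            return selected

    server = BulletScreenServer({
        "louvre_pyramid": [{"text": "Hello from 2020!", "language": "en"},
                           {"text": "夜景更好看", "language": "zh"}],
    })
    print(server.handle_request({"ar_scene": {"visual_id": "louvre_pyramid"},
                                 "user_info": {"language": "en"}}))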
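
The second sketch corresponds to claim 11: a bullet screen database populated from two sources, bullet screen information typed in by users and bullet screen information imported from a separate resource database, queried by an AR scene key. The class, method and field names are again invented for illustration.

    # Hypothetical bullet screen database for claim 11.
    from collections import defaultdict
    from typing import Dict, List

    class BulletScreenDatabase:
        def __init__(self) -> None:
            self._by_scene_key: Dict[str, List[str]] = defaultdict(list)

        def add_user_bullet_screen(self, scene_key: str, text: str) -> None:
            # Source 1: bullet screen information input by a user.
            self._by_scene_key[scene_key].append(text)

        def import_from_resource_db(self, rows: Dict[str, List[str]]) -> None:
            # Source 2: bullet screen information obtained from a resource database.
            for scene_key, texts in rows.items():
                self._by_scene_key[scene_key].extend(texts)

        def find(self, scene_key: str) -> List[str]:
            # Determine the target bullet screen information for an AR scene key.
            return list(self._by_scene_key.get(scene_key, []))

    db = BulletScreenDatabase()
    db.add_user_bullet_screen("eiffel_tower", "The queue is shorter after 8 pm")
    db.import_from_resource_db({"eiffel_tower": ["Completed in 1889"]})
    print(db.find("eiffel_tower"))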
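
The third sketch mirrors the four modules of the user terminal in claim 12: an acquisition module for the AR scene information and the user's trigger, a sending module for the bullet screen acquisition request, a receiving module for the returned bullet screen information, and a display module. The network and rendering layers are stubbed out, and every name is hypothetical.

    # Hypothetical user-terminal sketch for claim 12; a plain callable
    # stands in for the server and printing stands in for AR rendering.
    from typing import Callable, Dict, List

    class BulletScreenTerminal:
        def __init__(self, send_to_server: Callable[[dict], List[str]]):
            self._send_to_server = send_to_server
            self._ar_scene: Dict[str, str] = {}
            self._trigger = ""

        def acquire(self, ar_scene: Dict[str, str], trigger: str) -> None:
            # Acquisition module: AR scene information and the user's trigger input.
            self._ar_scene, self._trigger = ar_scene, trigger

        def request_bullet_screens(self) -> List[str]:
            # Sending module: in response to the trigger, send the acquisition
            # request carrying the AR scene information.
            if self._trigger != "show_bullet_screens":
                return []
            # Receiving module: accept the target bullet screen information.
            return self._send_to_server({"ar_scene": self._ar_scene})

        def display(self, bullet_screens: List[str]) -> None:
            # Display module: overlay the bullet screens on the AR view.
            for text in bullet_screens:
                print("[AR overlay]", text)

    fake_server = lambda request: ["First!", "That statue is 300 years old"]
    terminal = BulletScreenTerminal(fake_server)
    terminal.acquire({"visual_id": "town_square_statue"}, "show_bullet_screens")
    terminal.display(terminal.request_bullet_screens())
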
CN202010616903.8A 2020-06-30 2020-06-30 Barrage display method, user terminal, server, device and storage medium Pending CN113873266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010616903.8A CN113873266A (en) 2020-06-30 2020-06-30 Barrage display method, user terminal, server, device and storage medium

Publications (1)

Publication Number Publication Date
CN113873266A (en) 2021-12-31

Family

ID=78981548

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010616903.8A Pending CN113873266A (en) 2020-06-30 2020-06-30 Barrage display method, user terminal, server, device and storage medium

Country Status (1)

Country Link
CN (1) CN113873266A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105468142A (en) * 2015-11-16 2016-04-06 上海璟世数字科技有限公司 Interaction method and system based on augmented reality technique, and terminal
CN106982387A (en) * 2016-12-12 2017-07-25 阿里巴巴集团控股有限公司 It has been shown that, method for pushing and the device and barrage application system of barrage
CN108428375A (en) * 2017-02-14 2018-08-21 深圳梦境视觉智能科技有限公司 A kind of teaching auxiliary and equipment based on augmented reality
CN108696740A (en) * 2017-02-14 2018-10-23 深圳梦境视觉智能科技有限公司 A kind of live broadcasting method and equipment based on augmented reality
CN107493228A (en) * 2017-08-29 2017-12-19 北京易讯理想科技有限公司 A kind of social interaction method and system based on augmented reality
CN107743262A (en) * 2017-09-14 2018-02-27 阿里巴巴集团控股有限公司 A kind of barrage display methods and device
US20190095712A1 (en) * 2017-09-22 2019-03-28 Samsung Electronics Co., Ltd. Method and device for providing augmented reality service
CN109298781A (en) * 2018-08-28 2019-02-01 百度在线网络技术(北京)有限公司 Message processing method, device, equipment and computer storage medium based on AR
CN109451333A (en) * 2018-11-29 2019-03-08 北京奇艺世纪科技有限公司 A kind of barrage display methods, device, terminal and system
CN110865708A (en) * 2019-11-14 2020-03-06 杭州网易云音乐科技有限公司 Interaction method, medium, device and computing equipment of virtual content carrier
CN111225287A (en) * 2019-11-27 2020-06-02 网易(杭州)网络有限公司 Bullet screen processing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111476871B (en) Method and device for generating video
CN106254848A (en) A kind of learning method based on augmented reality and terminal
CN110378820B (en) Teaching system and method, electronic device, and storage medium
CN109271153B (en) Method for acquiring programming language based on programming education system and electronic equipment
CN110516152A (en) Point of interest method for pushing, device, electronic equipment and storage medium
CN104035995A (en) Method and device for generating group tags
CN112364144B (en) Interaction method, device, equipment and computer readable medium
CN111414506A (en) Emotion processing method and device based on artificial intelligence, electronic equipment and storage medium
CN112232066A (en) Teaching outline generation method and device, storage medium and electronic equipment
CN110969159A (en) Image recognition method and device and electronic equipment
CN111325160B (en) Method and device for generating information
CN113873266A (en) Barrage display method, user terminal, server, device and storage medium
KR20180056728A (en) Method for controlling an image processing apparatus
CN110209267A (en) Terminal, server and virtual scene method of adjustment, medium
CN115994266A (en) Resource recommendation method, device, electronic equipment and storage medium
CN114398135A (en) Interaction method, interaction device, electronic device, storage medium, and program product
CN113837010A (en) Education assessment system and method
CN114726818B (en) Network social method, device, equipment and computer readable storage medium
CN110462659A (en) Sharing experience
CN114283638B (en) Online teaching method and device and online teaching cloud platform
KR102375736B1 (en) A Method and Apparatus for Artificial Intelligence Avatar Matching by 5G Communication-based Communication Pattern Analyzing
Resurreccion et al. Lab 2–International Assistant Product Specification
CN112712798A (en) Privatization data acquisition method and device
Owens et al. Lab 2–International Assistant Product Specification
KR20220168534A (en) Method and system for training artificial intelligence character's dialogue engine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211231