CN111131910A - Bullet screen implementation method and device, electronic equipment and readable storage medium - Google Patents

Info

Publication number: CN111131910A (application CN202010000484.5A; granted as CN111131910B)
Authority: CN (China)
Prior art keywords: bullet screen, information, texture, texture information, rendering
Legal status: Granted; Active
Inventor: 樊健荣
Assignee (current and original): Guangzhou Huya Technology Co Ltd
Other languages: Chinese (zh)
Application filed by Guangzhou Huya Technology Co Ltd

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/488: Data services, e.g. news ticker
    • H04N21/4884: Data services, e.g. news ticker, for displaying subtitles
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21: Server components or server architectures
    • H04N21/218: Source of audio or video content, e.g. local disk arrays
    • H04N21/2187: Live feed

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

In the bullet screen implementation method and apparatus, electronic device, and readable storage medium of this application, a first thread and a second thread running in parallel are created. The first thread processes and computes bullet screen information to obtain its texture information, which is stored in a texture array. The second thread extracts texture information from the texture array and renders and displays it. Further, multiple sub-threads are created within the second thread, and each sub-thread can independently extract texture information from the texture array for rendering and display. With the first and second threads running in parallel, when there are multiple pieces of bullet screen information to be processed, the computation of each piece and the rendering and display proceed concurrently, which speeds up overall processing; and because multiple sub-threads handle the rendering and display, the complex rendering process is accelerated, avoiding choppy bullet screen display when bullet screens are numerous.

Description

Bullet screen implementation method and device, electronic equipment and readable storage medium
Technical Field
The application relates to the technical field of live broadcast, in particular to a bullet screen implementation method and device, electronic equipment and a readable storage medium.
Background
With the rapid development of the live broadcast industry, more and more users enjoy watching live broadcasts. Viewers can watch live video anytime and anywhere over the Internet, on a variety of clients such as PCs and mobile phones. At present, most live broadcast websites support bullet screens (danmaku), which greatly enhance interaction between viewers and anchors and among viewers themselves.
As the number of users watching live broadcasts grows, the number of bullet screens generated during a broadcast also increases sharply, and so does the workload of processing and rendering them. For devices with limited processing performance, such as user terminals, handling a large number of bullet screens consumes significant resources, leading to problems such as stuttering and jerky bullet screen motion.
Disclosure of Invention
The aim of this application is to provide a bullet screen implementation method and apparatus, an electronic device, and a readable storage medium that can increase the rendering and display speed of bullet screen information and alleviate choppy display.
The embodiment of the application can be realized as follows:
in a first aspect, an embodiment provides a bullet screen implementation method, which is applied to a user terminal, and the method includes:
acquiring a plurality of bullet screen information to be processed;
processing the bullet screen information by utilizing a first thread aiming at each bullet screen information to obtain texture information corresponding to the bullet screen information, and storing the texture information into a texture array;
and respectively extracting texture information from the texture array by utilizing each sub-thread of the second thread, rendering the extracted texture information and displaying the rendered texture information on the target layer.
In an optional embodiment, the step of processing the bullet screen information by using a first thread to obtain texture information corresponding to the bullet screen information and storing the texture information in a texture array includes:
calculating the size of an area required to be occupied by the bullet screen information in the target layer by using a first thread;
typesetting and setting the bullet screen information, and determining a coordinate value of the bullet screen information when the bullet screen information is on the screen;
generating a bullet screen bitmap corresponding to the bullet screen information according to the size of the area occupied by the bullet screen information and the coordinate value;
and processing the bullet screen bitmap to generate corresponding texture information, and storing the texture information into a texture array.
In an optional implementation manner, the step of calculating, by using a first thread, a size of an area that the bullet screen information needs to occupy in the target map layer includes:
and calculating the size of the area occupied by the bullet screen information in the target layer by using a first thread according to the length and the width of the character string contained in the bullet screen information and the font size of the characters in the character string.
In an optional embodiment, the step of processing the bullet screen bitmap according to the size of the area occupied by the bullet screen information to generate corresponding texture information, and storing the texture information in a texture array includes:
creating a corresponding buffer pool object according to the preset bullet screen width and the preset bullet screen length;
and generating corresponding texture information by the bullet screen bitmap according to the size of the area occupied by the bullet screen information by the buffer pool object, and storing the corresponding texture information into a texture array.
In an optional implementation manner, the step of extracting, by using each sub-thread of the second thread, texture information from the texture array, rendering the extracted texture information, and displaying the rendered texture information on the target layer includes:
extracting texture information from the texture array in sequence, by using each sub-thread of a second thread respectively, and obtaining the coordinate value at which the bullet screen information contained in the texture information enters the screen;
rendering and displaying the extracted texture information in the target layer according to the coordinate values.
In an optional implementation manner, the coordinate values include vertex coordinate values of the bullet screen information, and the step of rendering and displaying the extracted texture information in the target layer according to the coordinate values includes:
passing the vertex coordinate values into a shader;
determining a rendering frame used for displaying the texture information in the target image layer according to the vertex coordinate value;
rendering and displaying the texture information in the target image layer by utilizing the shader based on the rendering frame.
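As an illustrative aside on determining the rendering frame from the vertex coordinate values, the frame can be taken as the axis-aligned bounding rectangle of the quad's vertices. This Python sketch is a hypothetical analogue of what the shader pipeline would compute; the function name and tuple layout are assumptions, not the patent's API:

```python
def render_frame(vertices):
    # Axis-aligned bounding rectangle (x, y, width, height) of the
    # quad's vertex coordinates, used as the on-layer drawing frame.
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```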
In an optional embodiment, the step of rendering and displaying the texture information in the target layer by using the shader based on the rendering frame includes:
when the texture information is displayed on a screen, rendering the texture information by using the shader and displaying the texture information in the determined rendering frame in the target layer;
and performing displacement updating on the rendering frame, and rendering the texture information by using the shader and displaying the texture information in the rendering frame after displacement updating so as to enable the texture information to be displayed in the target layer in a moving manner.
In an optional embodiment, the step of performing displacement update on the rendering frame, rendering the texture information by using the shader, and displaying the texture information in the displacement updated rendering frame includes:
performing displacement updating on the rendering frame according to the created model matrix, and rendering and displaying the texture information in the rendering frame after displacement updating by using the shader;
and updating the model matrix at a preset rendering frame rate, each time updating the most recently updated rendering frame according to the updated model matrix, and rendering the texture information with the shader and displaying it in the re-updated rendering frame, until the texture information disappears from the target layer.
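The per-frame displacement update can be pictured as repeatedly translating the rendering frame until the bullet screen leaves the layer. In this simplified Python sketch the model matrix reduces to a one-dimensional translation per frame; the function name and parameters are hypothetical stand-ins for the real matrix math:

```python
def scroll_track(start_x, width, speed_px_per_frame, left_edge=0):
    # Successive x positions of a right-to-left bullet screen; each
    # step is one per-frame update of the (1-D) model matrix, and the
    # loop ends once the frame has fully left the layer.
    positions = []
    x = start_x
    while x + width > left_edge:
        positions.append(x)
        x -= speed_px_per_frame
    return positions
```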
In an alternative embodiment, the method further comprises:
when it is detected that a piece of texture information has completed the rendering and display process on the target layer, setting the state of the model matrix used for the moving display of that texture information to an idle state, in which the model matrix can be used for the moving display of other texture information.
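The idle-state reuse of model matrices is essentially object pooling. A minimal hypothetical sketch, with dictionaries standing in for real matrices (the class and method names are assumptions):

```python
class MatrixPool:
    """Reuse model matrices: when a bullet screen finishes its moving
    display, its matrix is marked idle and handed to the next texture."""

    def __init__(self):
        self._idle = []
        self._created = 0

    def acquire(self):
        # Prefer an idle matrix; only allocate when none is available.
        if self._idle:
            return self._idle.pop()
        self._created += 1
        return {"id": self._created}  # stand-in for a real 4x4 matrix

    def release(self, matrix):
        # Called when rendering/display of the texture is complete.
        self._idle.append(matrix)
```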
In a second aspect, an embodiment provides a bullet screen implementation apparatus, applied to a user terminal, the apparatus including:
the information acquisition module is used for acquiring a plurality of bullet screen information to be processed;
the processing module is used for processing the bullet screen information by utilizing a first thread aiming at each bullet screen information to obtain texture information corresponding to the bullet screen information and storing the texture information into a texture array;
and the rendering module is used for extracting texture information from the texture array by utilizing each sub-thread of the second thread, rendering the extracted texture information and displaying the rendered texture information on the target layer.
In a third aspect, an embodiment provides an electronic device, where the electronic device includes a machine-readable storage medium and a processor, where the machine-readable storage medium stores machine-executable instructions, and when the processor executes the machine-executable instructions, the electronic device implements the bullet screen implementing method in any one of the foregoing embodiments.
In a fourth aspect, an embodiment provides a readable storage medium, in which machine-executable instructions are stored, and when executed, the machine-executable instructions implement the bullet screen implementation method described in any one of the foregoing embodiments.
The beneficial effects of the embodiment of the application include, for example:
according to the bullet screen implementation method, the bullet screen implementation device, the electronic equipment and the readable storage medium, the first thread and the second thread which run in parallel are created, the first thread is used for processing and calculating bullet screen information, texture information of the bullet screen information is obtained, and the texture information is stored in the texture array. And extracting texture information from the texture array by using a second thread, and realizing rendering display of the texture information. And further, a plurality of sub-threads are created in the second thread, and each sub-thread can respectively extract texture information from the texture array for rendering and displaying. Therefore, through the parallel first thread and the parallel second thread, when a plurality of bullet screen information to be processed exist, the steps of calculating and processing each bullet screen information and rendering display can be synchronously carried out, the overall processing speed is accelerated, and a plurality of sub-threads are used for rendering display, so that the complex rendering display process is accelerated, and the problem of unsmooth bullet screen display under the condition of a large number of bullet screens is avoided.
Drawings
To illustrate the technical solutions of the embodiments of this application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of this application and should therefore not be regarded as limiting its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic system architecture diagram of a live broadcast system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a bullet screen implementation method provided in the embodiment of the present application;
fig. 3 is a schematic flowchart of a texture information obtaining method according to an embodiment of the present application;
fig. 4 is a schematic diagram of setting of bullet screen information typesetting according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a method for rendering and displaying texture information according to an embodiment of the present disclosure;
FIG. 6 is a flowchart illustrating the sub-steps of step S520 in FIG. 5;
fig. 7 is a schematic diagram of moving display of bullet screen information according to an embodiment of the present application;
fig. 8 is another schematic diagram of a bullet screen information moving display provided in the embodiment of the present application;
fig. 9 is another schematic diagram of a bullet screen information moving display provided in the embodiment of the present application;
fig. 10 is another schematic diagram of a bullet screen information moving display provided in the embodiment of the present application;
fig. 11 is a functional block diagram of a bullet screen implementation apparatus provided in the embodiment of the present application;
fig. 12 is a block diagram of an electronic device according to an embodiment of the present application.
Reference numerals: 100-live broadcast providing terminal; 200-live broadcast receiving terminal; 300-live broadcast server; 400-electronic device; 410-bullet screen implementation apparatus; 411-information acquisition module; 412-processing module; 413-rendering module; 420-processor; 430-machine-readable storage medium.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. It should be noted that the features of the embodiments of the present application may be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic view of an interaction scenario of a live broadcast system provided in an embodiment of the present application. The live broadcast system may serve a platform such as an Internet live broadcast service. It may include a live broadcast server 300, a live broadcast receiving terminal 200, and a live broadcast providing terminal 100, where the live broadcast server 300 is communicatively connected to the live broadcast receiving terminal 200 and the live broadcast providing terminal 100, respectively, and provides live broadcast services to both. For example, an anchor may provide a live stream online in real time to viewers through the live broadcast providing terminal 100 and transmit it to the live broadcast server 300, and the live broadcast receiving terminal 200 may pull the live stream from the live broadcast server 300 for online viewing or playback. As another example, the live broadcast server 300 may receive bullet screen data sent by the live broadcast receiving terminals 200 and the live broadcast providing terminals 100 and distribute the bullet screen information to each live broadcast providing terminal 100 and live broadcast receiving terminal 200, so that the live broadcast providing terminals 100 and the live broadcast receiving terminals 200 can render and display it.
In some implementation scenarios, the live receiving terminal 200 and the live providing terminal 100 may be used interchangeably. For example, the anchor of the live broadcast providing terminal 100 may provide a live video service to the viewer using the live broadcast providing terminal 100, or view live video provided by other anchors as the viewer. For another example, the viewer of the live broadcast receiving terminal 200 may also use the live broadcast receiving terminal 200 to view live video provided by a main broadcast concerned, or provide live video service as a main broadcast for other viewers.
In this embodiment, the live broadcast receiving terminal 200 and the live broadcast providing terminal 100 may include, but are not limited to, mobile devices, tablet computers, laptop computers, or any combination of two or more of these. In a specific implementation, zero, one, or more live broadcast receiving terminals 200 and live broadcast providing terminals 100 may access the live broadcast server 300; only one of each is shown in fig. 1. The live broadcast receiving terminal 200 and the live broadcast providing terminal 100 may have Internet products installed for providing Internet live broadcast services, for example, applications (apps), Web pages, or applets used on a computer or smartphone and related to Internet live broadcast services.
In this embodiment, the live server 300 may be a single physical server, or may be a server group including a plurality of physical servers for executing different data processing functions. The server groups may be centralized or distributed (e.g., the live server 300 may be a distributed system). In some possible implementations, such as where the live server 300 employs a single physical server, different logical server components may be assigned to the physical server based on different live service functions.
It will be appreciated that the live system shown in fig. 1 is only one possible example, and that in other possible embodiments the live system may comprise only some of the components shown in fig. 1 or may also comprise other components.
In order to accelerate processing and rendering display of bullet screen information and further achieve smooth display of bullet screen information, fig. 2 shows a flowchart of a bullet screen implementation method provided in this embodiment of the present application, where in this embodiment, the bullet screen implementation method may be applied to a user terminal, and the user terminal may be the live broadcast receiving terminal 200 shown in fig. 1 or the live broadcast providing terminal 100 shown in fig. 1.
It should be understood that, in other embodiments, the order of some steps of the bullet screen implementation method of this embodiment may be interchanged according to actual needs, or some steps may be omitted. The detailed steps of the bullet screen implementation method are described below.
Step S210, obtaining a plurality of bullet screen information to be processed.
Step S220, aiming at each bullet screen information, processing the bullet screen information by using a first thread to obtain texture information corresponding to the bullet screen information, and storing the texture information into a texture array.
Step S230, extracting texture information from the texture array by using each sub-thread of the second thread, rendering the extracted texture information, and displaying the rendered texture information on the target layer.
In this embodiment, while logged into a live broadcast room to watch the anchor's broadcast, a viewer can send bullet screen information to the room to post comments, or to interact with the anchor and other viewers through bullet screen information. Bullet screen information initiated by a viewer at a live broadcast receiving terminal 200 is processed by the live broadcast server 300 and then distributed to every live broadcast providing terminal 100 and live broadcast receiving terminal 200 that has entered the room; after receiving the bullet screen information, each terminal processes it and then renders and displays it.
Of course, it should be noted that during a live broadcast the anchor at the live broadcast providing terminal 100 may also initiate bullet screen information, which the live broadcast server 300 processes and distributes to each live broadcast receiving terminal 200 in the live broadcast room, so that the live broadcast receiving terminals 200 can render and display it.
After the live broadcast receiving terminal 200 or the live broadcast providing terminal 100 receives bullet screen information to be displayed, it starts the first thread and the second thread synchronously to process the bullet screen information in parallel. The first thread implements the corresponding processing and computation on the terminal's Central Processing Unit (CPU), and the second thread implements the corresponding rendering and display on the terminal's Graphics Processing Unit (GPU).
In this embodiment, the first thread and the second thread may be processed in parallel, and when the first thread is used to perform calculation processing on the received bullet screen information to be processed to obtain corresponding texture information and store the texture information in the texture array, the second thread may synchronously extract the texture information from the texture array and render and display the texture information in the target layer.
Moreover, since rendering and display work is generally complex and time-consuming, in this embodiment the second thread includes multiple sub-threads that run concurrently, and each sub-thread can independently extract texture information from the texture array and render and display it in the target layer.
In this way, with the first and second threads running in parallel, when there are multiple pieces of bullet screen information to be processed, the computation of each piece and the rendering and display proceed concurrently, speeding up overall processing; and because multiple sub-threads perform the rendering and display, the complex rendering process is accelerated, avoiding choppy bullet screen display when bullet screens are numerous.
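The parallel design described above can be sketched as a producer/consumer pipeline. The following Python sketch is purely illustrative (the patent targets iOS with the work split between CPU and GPU; here both sides are simulated in software, and all names such as `run_pipeline` and `make_texture` are hypothetical): one producer thread plays the role of the first thread, and several worker threads play the role of the second thread's sub-threads, draining a shared texture array.

```python
import queue
import threading

def make_texture(danmaku):
    # Stand-in for the first thread's real work: layout calculation
    # and rasterization of one piece of bullet screen information.
    return {"text": danmaku, "coords": (0, 0)}

def run_pipeline(danmaku_list, num_render_workers=3):
    texture_array = queue.Queue()   # shared texture array
    rendered = []                   # what the sub-threads have "drawn"
    rendered_lock = threading.Lock()

    def first_thread():
        # Producer: compute texture info and push it to the array,
        # then signal each worker to stop with a sentinel.
        for info in danmaku_list:
            texture_array.put(make_texture(info))
        for _ in range(num_render_workers):
            texture_array.put(None)

    def render_worker():
        # One sub-thread of the second thread: extract and "render".
        while True:
            tex = texture_array.get()
            if tex is None:
                break
            with rendered_lock:
                rendered.append(tex["text"])

    threads = [threading.Thread(target=first_thread)]
    threads += [threading.Thread(target=render_worker)
                for _ in range(num_render_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return rendered
```

Because production and consumption overlap, processing of later bullet screens proceeds while earlier ones are still being rendered, which is the speed-up the embodiment claims.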
In this embodiment, referring to fig. 3, when the first thread processes the bullet screen information to obtain the texture information, the following steps may be performed:
step S310, calculating the size of the area that the bullet screen information needs to occupy in the target layer by using a first thread.
Step S320, performing typesetting setting on the bullet screen information, and determining a coordinate value of the bullet screen information when the bullet screen information is on the screen.
And step S330, generating a bullet screen bitmap corresponding to the bullet screen information according to the size of the area occupied by the bullet screen information and the coordinate value.
And step S340, processing the bullet screen bitmap to generate corresponding texture information, and storing the texture information into a texture array.
In this embodiment, the bullet screen implementation method may be applied to a live broadcast providing terminal 100 or a live broadcast receiving terminal 200 whose operating system is iOS; of course, it may also be applied to terminal devices running other operating systems. The following description uses a terminal device running iOS as an example.
To render and display the bullet screen information, the size of the area each piece of bullet screen information needs to occupy in the target layer of the final display must be known. Here the target layer may be a CAMetalLayer; that is, the CAMetalLayer serves as the rendering target object. Besides the area occupied by a single piece of bullet screen information, the bullet screen information also needs to be typeset to determine the coordinate value of each piece when it enters the screen, where "entering the screen" means the moment the bullet screen information first appears in the target layer.
Typesetting the bullet screen information may include determining from which direction it enters the screen on the layer, and its specific position when entering from that direction. For example, bullet screen information may enter from the left, right, top, or bottom of the screen; this embodiment places no limitation on the direction.
Through typesetting and setting each piece of bullet screen information, the problem that the overlapping phenomenon is serious when a plurality of pieces of bullet screen information are displayed can be avoided.
In this embodiment, within the first thread, the size of the area the bullet screen information needs to occupy and its coordinate value when entering the screen can be calculated on the CPU using CoreText.
The bullet screen information may include text, emoticons, pictures, and the like; in this embodiment it is exemplified as text containing a character string. Optionally, in the first thread, the CPU may calculate the size of the area the bullet screen information needs to occupy in the target layer from the length and width of the character string and the font size of its characters.
If the bullet screen information is an emoticon, a picture, or similar, the size of its occupied area can be calculated based on the size of the emoticon or picture.
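As an illustration of this sizing step, the following sketch estimates the occupied area from the string length and font size. The sizing rule and all constants (`char_aspect`, `padding`) are hypothetical stand-ins for what CoreText would actually measure:

```python
def danmaku_area(text, font_size, char_aspect=0.6, padding=4):
    # Hypothetical sizing rule: width grows with the number of
    # characters and the font size; height is one text line plus padding.
    width = int(len(text) * font_size * char_aspect) + 2 * padding
    height = font_size + 2 * padding
    return width, height
```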
In this embodiment, if the bullet screen information enters from the right side of the screen, it is generally displayed in the upper area of the target layer. The upper area of the target layer may be divided into multiple rows of sub-areas, each of which can display bullet screen information. Bullet screen information is generally processed sequentially, in the order received. During typesetting, adjacently processed bullet screen information can be laid out into different sub-areas, avoiding the severe overlap that would otherwise result from the short time interval between adjacent bullet screen messages.
For example, as shown in fig. 4, the upper area of the target layer is schematically divided into multiple rows of sub-areas. Assuming bullet screen information 1 and bullet screen information 2, corresponding to two pieces of texture information, are processed adjacently, bullet screen information 1 may be placed in the first row of sub-areas in the target layer and bullet screen information 2 in the third row. This prevents bullet screen information 1 and 2 from overlapping, as they would if typeset into the same row of sub-areas with their entry times so close together.
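The row-assignment scheme of fig. 4 can be illustrated as follows. The stride-based rule is a hypothetical example of "adjacent messages go to non-adjacent rows"; with a stride of 2, message 1 lands in the first row and message 2 in the third, matching the figure:

```python
def assign_rows(num_messages, num_rows, stride=2):
    # Adjacently processed messages are placed `stride` rows apart so
    # that bullet screens arriving close in time never share a row.
    return [(i * stride) % num_rows for i in range(num_messages)]
```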
Finally, to render and display the bullet screen information in the target layer, information usable for drawing needs to be provided to the GPU. A corresponding bullet screen bitmap may be generated in the CPU from the bullet screen information, and the bitmap processed according to the size of the area occupied by the bullet screen information to generate corresponding texture information. The texture information can then be provided to the GPU so that the GPU performs the bullet screen drawing. The texture information also includes the calculated coordinate values of the bullet screen information when it appears on the screen, so that the GPU can determine the initial drawing position of the texture information.
In this embodiment, a CGBitmapContext may be opened, and a bullet screen bitmap corresponding to the bullet screen information obtained according to the calculated size of the occupied area and the coordinate values of the bullet screen when it is displayed on the screen. The bullet screen bitmap is composed of a plurality of pixels, the color of each pixel being determined by the intensities of the three RGB primary colors, and can be used for subsequent direct display on the screen.
The GPU also needs corresponding texture information in order to render and display the bullet screen information; this texture information can be generated by processing the bullet screen bitmap. In this embodiment, a corresponding buffer pool object is created according to a preset bullet screen width and a preset bullet screen length, where these presets are the maximum permitted bullet screen width and length. Corresponding texture information is generated from the bullet screen bitmap through the created buffer pool object and stored in a texture array.
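A minimal sketch of the buffer-pool idea follows: textures are preallocated at the maximum permitted bullet screen size and reused, so per-message texture allocation is avoided. The dictionary-based `Texture` stand-in and all names are assumptions; on iOS this role would typically be played by a pixel-buffer pool feeding Metal textures.

```python
import queue

class TexturePool:
    """Preallocates texture slots at the preset maximum bullet screen size."""

    def __init__(self, max_width: int, max_height: int, capacity: int = 8):
        self.free = queue.Queue()
        for _ in range(capacity):
            self.free.put({"w": max_width, "h": max_height, "pixels": None})

    def texture_for(self, bitmap, coord):
        tex = self.free.get()   # block until a pooled slot is free
        tex["pixels"] = bitmap  # upload the bullet screen bitmap
        tex["coord"] = coord    # on-screen coordinate for the GPU to read
        return tex

    def release(self, tex):
        tex["pixels"] = None    # return the slot for reuse
        self.free.put(tex)
```

Each texture produced this way carries its on-screen coordinate, matching the description that the texture information includes the coordinate values for the GPU.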
In this embodiment, the multiple pieces of texture information obtained after CPU processing may be temporarily stored in the texture array, and the texture information in the texture array provided to the GPU for rendering and display.
Referring to fig. 5, in the present embodiment, the rendering and displaying of the texture information can be implemented by the following manners:
Step S510, using each sub-thread of the second thread, extracting texture information from the texture array in sequence and obtaining the coordinate values of the bullet screen information contained in the texture information when it appears on the screen.
Step S520, rendering and displaying the extracted texture information in the target layer according to the coordinate values.
Each sub-thread in the second thread can work in parallel; after a sub-thread finishes rendering and displaying one piece of bullet screen information, it extracts a new piece of texture information from the texture array for processing. Suppose 10 pieces of texture information are temporarily stored in the texture array: texture information a, texture information b, and so on. The second thread comprises three sub-threads: sub-thread A, sub-thread B, and sub-thread C. It should be understood that the texture information in the texture array is continuously updated: new texture information is added to the texture array while texture information is continuously extracted from it.
Sub-thread A, sub-thread B, and sub-thread C can each extract texture information from the texture array for rendering and display. Assume sub-thread A extracts texture information a for processing, sub-thread B extracts texture information b, and sub-thread C extracts texture information c. After sub-thread A finishes rendering and displaying texture information a, it can extract texture information d from the texture array for processing. If sub-thread C finishes rendering and displaying texture information c, it extracts texture information e from the texture array. In this way, multiple sub-threads process the same work in parallel, speeding up rendering and display.
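The sub-thread scheme above can be sketched with a shared queue standing in for the texture array. In the patent the actual rendering is done by the GPU; here it is simulated by appending to a list, and the worker count of 3 matches the A/B/C example.

```python
import queue
import threading

def run_workers(textures, num_workers: int = 3):
    """Each worker repeatedly takes the next texture from the shared array
    and 'renders' it, so workers that finish early pick up the next item."""
    q = queue.Queue()
    for t in textures:
        q.put(t)
    rendered, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()
            except queue.Empty:
                return              # texture array drained: worker exits
            with lock:
                rendered.append(t)  # stand-in for render-and-display

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return rendered
```

Regardless of which worker takes which texture, every piece of texture information is rendered exactly once — the property the embodiment relies on.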
In this embodiment, each sub-thread implements the rendering and display function based on the GPU. To trigger the GPU, the corresponding texture information needs to be put into a rendering pipeline (MTLRenderPipelineState) that the GPU can call, instructions that the GPU can recognize are encoded, and the GPU is triggered to execute the rendering action based on those instructions.
In this embodiment, a rendering pipeline and a command queue (MTLCommandQueue) may be created according to the resources available to the GPU, and instructions placed in the command queue. One or more instructions are buffered in the command queue and encoded into instructions the GPU can recognize. Based on the instructions in the command queue, each sub-thread extracts texture information from the texture array according to the rendering pipeline and performs rendering and display.
When each sub-thread renders and displays texture information, it renders according to the coordinate values, contained in the extracted texture information, of the corresponding bullet screen information when it is displayed on the screen. Optionally, referring to fig. 6, step S520 may include the following sub-steps:
in step S610, the vertex coordinate values are transmitted to a shader.
Step S620, determining a rendering frame for displaying the texture information in the target layer according to the vertex coordinate value.
Step S630, based on the rendering frame, rendering and displaying the texture information in the target layer by using the shader.
When rendering and displaying the texture information on the target layer, a specific rendering area, namely the rendering frame, needs to be determined; the rendering frame may be a rectangle, a square, a circle, an ellipse, or the like.
In this embodiment, the coordinate values calculated for when the bullet screen information is displayed on the screen include vertex coordinate values, which may be the coordinates of the four vertices of a rectangular frame that can enclose the bullet screen information. The vertex coordinate values can be passed into a shader, which then performs the rendering and display. Optionally, the shader may use the drawInstance draw method to draw the texture information into the rendering frame determined by the vertex coordinate values.
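Deriving the rendering frame from the four vertex coordinate values can be sketched as taking their axis-aligned bounding rectangle. The `(x, y, width, height)` return convention is an assumption for illustration.

```python
def rendering_frame(vertices):
    """Compute the rectangular rendering frame enclosing four vertex
    coordinates, returned as (x, y, width, height)."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    x, y = min(xs), min(ys)
    return (x, y, max(xs) - x, max(ys) - y)
```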
When bullet screen information is displayed, it generally appears on the screen, moves across it, and finally disappears from it. In this embodiment, this moving display of the bullet screen information is realized by continuously updating the rendering frame. Optionally, when the texture information first appears on screen, the shader renders and displays it in the determined rendering frame in the target layer. The rendering frame is then displacement-updated, and the shader renders and displays the texture information in the displacement-updated rendering frame, so that the texture information is displayed in the target layer as a moving element.
The rendering frame may be displacement-updated multiple times; that is, by updating the rendering frame repeatedly, the texture information moves repeatedly in the target layer and finally disappears from it.
In this embodiment, when performing a displacement update on the rendering frame, the update direction may be any direction. For example, as shown in fig. 7, when the texture information enters from the right side of the screen, the displacement direction may be lateral, from right to left. Alternatively, as shown in fig. 8, when the texture information enters from the left side of the screen, the displacement direction may be lateral, from left to right. When the texture information enters from the top of the screen, the displacement direction may be vertical, from top to bottom, as shown in fig. 9. When the texture information enters from the bottom of the screen, as shown in fig. 10, the displacement direction may be vertical, from bottom to top.
In addition, the texture information can enter from a corner of the screen, with the displacement direction running from that corner toward the opposite corner. The displacement may also follow a preset path other than the above, for example moving horizontally first, then vertically, and finally off the screen. The specific displacement update manner is not limited in this embodiment and may be set according to actual requirements.
In this embodiment, the displacement update of the rendering frame is performed not by changing the coordinate values of the determined rendering frame, but by using a created model matrix (ModelMatrix). The model matrix contains elements that can change the position information of the rendering frame; specifically, the values of those elements can be set according to the required displacement update.
After the initially determined rendering frame is used for displaying the texture information on the screen, the rendering frame can be subjected to displacement updating according to the created model matrix, and the texture information is rendered and displayed in the rendering frame after the displacement updating by using the shader.
Generally, the rendering frame does not disappear from the target layer after a single displacement update, i.e. it does not immediately move off screen. The model matrix may therefore be updated at a preset rendering frame rate, for example 10 times per second or 60 times per second. At each update, the most recently updated rendering frame is updated again according to the newly updated model matrix, and the shader renders and displays the texture information in the re-updated rendering frame, until the texture information disappears from the target layer.
In this way, by continuously updating the rendering frame, the texture information shows a continuously moving display effect on the screen and finally disappears from it.
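The model-matrix-driven displacement can be illustrated with a 2-D sketch: a 3x3 translation matrix is applied once per tick of the assumed frame rate, moving the frame leftward (the fig. 7 case) until it leaves the layer. The per-frame step size is an assumption for illustration.

```python
def animate_frame(frame, step: float = 5.0):
    """Repeatedly apply a model matrix translation to a rendering frame
    (x, y, w, h) until the frame has fully left the layer on the left."""
    x, y, w, h = frame
    # Model matrix carrying a per-frame translation of (-step, 0):
    model = [[1.0, 0.0, -step],
             [0.0, 1.0,  0.0],
             [0.0, 0.0,  1.0]]
    positions = []
    while x + w > 0:            # frame still at least partly visible
        x += model[0][2]        # apply the matrix's x-translation element
        y += model[1][2]        # apply the matrix's y-translation element
        positions.append((x, y))
    return positions
```

Only the translation elements of the model matrix change between updates, matching the description that the rendering frame's own coordinate values are never rewritten directly.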
In this embodiment, when the rendering and display of one piece of texture information is completed, or of a preset number of pieces, or at preset time intervals, newly processed texture information may be added to the texture array for the shader to draw.
Optionally, when it is detected that a piece of texture information has completed its rendering and display process on the target layer, the state of the model matrix used for its moving display is set to idle; in the idle state the model matrix can be used for the moving display of other texture information.
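The idle-state reuse described above amounts to an object pool for model matrices. The sketch below is a hypothetical illustration: a finished texture's matrix is marked idle and handed to the next texture instead of allocating a new one.

```python
class ModelMatrixPool:
    """Reuses model matrices: a matrix marked idle after one texture's
    moving display finishes is handed out for the next texture."""

    def __init__(self):
        self.entries = []

    def acquire(self):
        for entry in self.entries:
            if entry["idle"]:
                entry["idle"] = False      # reuse an idle model matrix
                return entry
        entry = {"matrix": [[1, 0, 0], [0, 1, 0], [0, 0, 1]], "idle": False}
        self.entries.append(entry)         # none idle: create a new one
        return entry

    def release(self, entry):
        entry["idle"] = True               # rendering finished: mark idle
```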
Based on the same application concept, please refer to fig. 11, which shows a functional module diagram of the bullet screen implementation device 410 provided in the embodiment of the present application, and the present embodiment may divide the functional modules of the bullet screen implementation device 410 according to the above method embodiment. For example, the functional blocks may be divided for the respective functions, or two or more functions may be integrated into one processing block. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation. For example, in the case of dividing each function module according to each function, the bullet screen implementation apparatus 410 shown in fig. 11 is only a schematic apparatus diagram. The bullet screen implementation apparatus 410 may include an information obtaining module 411, a processing module 412, and a rendering module 413, and the functions of the functional modules of the bullet screen implementation apparatus 410 are described in detail below.
The information obtaining module 411 is configured to obtain multiple bullet screen information to be processed. It is understood that the information obtaining module 411 may be configured to perform the step S210, and for detailed implementation of the information obtaining module 411, reference may be made to the content related to the step S210.
And the processing module 412 is configured to, for each piece of bullet screen information, process the bullet screen information by using a first thread to obtain texture information corresponding to the bullet screen information, and store the texture information in a texture array. It is understood that the processing module 412 can be used to execute the step S220, and for the detailed implementation of the processing module 412, reference can be made to the above-mentioned contents related to the step S220.
And the rendering module 413 is configured to extract texture information from the texture array by using each sub-thread of the second thread, render the extracted texture information, and display the rendered texture information on the target layer. It is understood that the rendering module 413 may be configured to perform the step S230, and for the detailed implementation of the rendering module 413, reference may be made to the content related to the step S230.
In a possible implementation, the processing module 412 may be configured to process the bullet screen information by using the first thread to obtain texture information corresponding to the bullet screen information, and store the texture information in the texture array, in the following manner:
calculating the size of an area required to be occupied by the bullet screen information in the target layer by using a first thread;
typesetting and setting the bullet screen information, and determining a coordinate value of the bullet screen information when the bullet screen information is on the screen;
generating a bullet screen bitmap corresponding to the bullet screen information according to the size of the area occupied by the bullet screen information and the coordinate value;
and processing the bullet screen bitmap to generate corresponding texture information, and storing the texture information into a texture array.
In one possible implementation, the processing module 412 may be configured to calculate, by using the first thread, a size of an area that the bullet screen information needs to occupy in the target layer by:
and calculating the size of the area occupied by the bullet screen information in the target layer by using a first thread according to the length and the width of the character string contained in the bullet screen information and the font size of the characters in the character string.
In a possible implementation manner, the processing module 412 may be configured to process the bullet screen bitmap according to the size of the area that the bullet screen information needs to occupy, to generate corresponding texture information:
creating a corresponding buffer pool object according to the preset bullet screen width and the preset bullet screen length;
and generating corresponding texture information by the bullet screen bitmap according to the size of the area occupied by the bullet screen information by the buffer pool object, and storing the corresponding texture information into a texture array.
In a possible implementation, the rendering module 413 may be configured to extract texture information from the texture array by using respective sub-threads of the second thread, and render and display the extracted texture information in the target layer by:
extracting texture information from the texture array in sequence by using each sub-thread of a second thread respectively to obtain a coordinate value of the bullet screen information contained in the texture information when the bullet screen information is on a screen;
rendering and displaying the extracted texture information in the target layer according to the coordinate values.
In a possible implementation manner, the coordinate values include vertex coordinate values of the bullet screen information, and the rendering module 413 may be configured to render and display the extracted texture information in the target layer according to the coordinate values by:
passing the vertex coordinate values into a shader;
determining a rendering frame used for displaying the texture information in the target image layer according to the vertex coordinate value;
rendering and displaying the texture information in the target image layer by utilizing the shader based on the rendering frame.
In one possible implementation, the rendering module 413 may be configured to render and display the texture information in the target layer by using the shader, based on the rendering frame, in the following manner:
when the texture information is displayed on a screen, rendering the texture information by using the shader and displaying the texture information in the determined rendering frame in the target layer;
and performing displacement updating on the rendering frame, and rendering the texture information by using the shader and displaying the texture information in the rendering frame after displacement updating so as to enable the texture information to be displayed in the target layer in a moving manner.
In one possible implementation, the rendering module 413 may be configured to perform displacement update on the rendering frame by using a shader to render and display texture information in the displacement updated rendering frame:
performing displacement updating on the rendering frame according to the created model matrix, and rendering and displaying the texture information in the rendering frame after displacement updating by using the shader;
and updating the model matrix according to a preset rendering frame rate, updating the rendering frame updated last time according to the updated model matrix each time, rendering the texture information by using a shader and displaying the texture information in the rendering frame updated again until the texture information disappears in the target layer.
In a possible implementation manner, the bullet screen implementing apparatus 410 further includes a setting module, where the setting module may be configured to set a state of a model matrix used for mobile display of texture information to an idle state when it is monitored that the texture information exists and a rendering and displaying process on the target layer is completed, where the model matrix may be used for mobile display of other texture information in the idle state.
Based on the same application concept, please refer to fig. 12, which is a block diagram illustrating a structure of an electronic device 400 for executing the bullet screen implementation method according to an embodiment of the present application, where the electronic device 400 may be the user terminal, specifically, the live broadcast receiving terminal 200 or the live broadcast providing terminal 100 shown in fig. 1. As shown in fig. 12, the electronic device 400 may include a bullet screen implementing device 410, a machine-readable storage medium 430, and a processor 420.
In this embodiment, the machine-readable storage medium 430 and the processor 420 are both located in the electronic device 400 and are separately located. However, it should be understood that the machine-readable storage medium 430 may also be separate from the electronic device 400 and accessible by the processor 420 through a bus interface. Alternatively, the machine-readable storage medium 430 may be integrated into the processor 420, such as may be a cache and/or general registers.
The processor 420 is a control center of the electronic device 400, connects various parts of the entire electronic device 400 using various interfaces and lines, performs various functions of the electronic device 400 and processes data by operating or executing software programs and/or modules stored in the machine-readable storage medium 430 and calling data stored in the machine-readable storage medium 430, thereby monitoring the electronic device 400 as a whole. Alternatively, processor 420 may include one or more processing cores, for example, processor 420 may include a central processing unit and a graphics processing unit, where the central processing unit is primarily used for data computation processing and the like, and the graphics processing unit is primarily used for rendering a display on a graphical interface and the like.
Among other things, processor 420 may include one or more processing cores (e.g., a single-core processor or a multi-core processor). Merely by way of example, processor 420 may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction Set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The machine-readable storage medium 430 may be mass storage, removable storage, volatile read-and-write memory, or Read-Only Memory (ROM), among others, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; removable memory may include flash drives, floppy disks, optical disks, memory cards, zip disks, tapes, and the like; volatile read-write memory may include Random Access Memory (RAM); the RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor-Based Random Access Memory (T-RAM), Zero-capacitor RAM (Z-RAM), and the like. By way of example, the ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disc ROM (CD-ROM), Digital Versatile Disc ROM (DVD-ROM), and the like.
The machine-readable storage medium 430 may be self-contained and coupled to the processor 420 via a communication bus. The machine-readable storage medium 430 may also be integrated with the processor 420. The machine-readable storage medium 430 is used for storing, among other things, machine-executable instructions for performing aspects of the present application. The processor 420 is configured to execute machine executable instructions stored in the machine readable storage medium 430 to implement the bullet screen implementation method provided by the foregoing method embodiments.
The bullet screen implementation device 410 may include, for example, the functional modules (for example, the information obtaining module 411, the processing module 412, and the rendering module 413) described in fig. 11, and may be stored in the machine-readable storage medium 430 in the form of software program codes, and the processor 420 may implement the bullet screen implementation method provided by the foregoing method embodiment by executing the functional modules of the bullet screen implementation device 410.
Since the electronic device 400 provided in the embodiment of the present application is another implementation form of the method embodiment executed by the user terminal, and the electronic device 400 can be used to execute the bullet screen implementation method provided in the above method embodiment, the technical effect obtained by the electronic device may refer to the above method embodiment, and is not described herein again.
Further, an embodiment of the present application also provides a readable storage medium containing computer-executable instructions, where the computer-executable instructions, when executed, may be used to implement the bullet screen implementation method provided in the foregoing method embodiment.
Of course, the storage medium provided in the embodiments of the present application and containing computer-executable instructions is not limited to the above method operations, and may also perform related operations in the bullet screen implementation method provided in any embodiment of the present application.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The above description is only for various embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and all such changes or substitutions are included in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A bullet screen implementation method is applied to a user terminal, and comprises the following steps:
acquiring a plurality of bullet screen information to be processed;
processing the bullet screen information by utilizing a first thread aiming at each bullet screen information to obtain texture information corresponding to the bullet screen information, and storing the texture information into a texture array;
and respectively extracting texture information from the texture array by utilizing each sub-thread of the second thread, rendering the extracted texture information and displaying the rendered texture information on the target layer.
2. The bullet screen implementing method according to claim 1, wherein the step of processing the bullet screen information by using the first thread to obtain texture information corresponding to the bullet screen information and storing the texture information in a texture array includes:
calculating the size of an area required to be occupied by the bullet screen information in the target layer by using a first thread;
typesetting and setting the bullet screen information, and determining a coordinate value of the bullet screen information when the bullet screen information is on the screen;
generating a bullet screen bitmap corresponding to the bullet screen information according to the size of the area occupied by the bullet screen information and the coordinate value;
and processing the bullet screen bitmap to generate corresponding texture information, and storing the texture information into a texture array.
3. The bullet screen implementation method according to claim 2, wherein the step of calculating, by using a first thread, a size of an area that the bullet screen information needs to occupy in the target layer includes:
and calculating the size of the area occupied by the bullet screen information in the target layer by using a first thread according to the length and the width of the character string contained in the bullet screen information and the font size of the characters in the character string.
4. The bullet screen implementing method according to claim 2, wherein the step of processing the bullet screen bitmap according to the size of the area occupied by the bullet screen information to generate corresponding texture information and storing the texture information into a texture array includes:
creating a corresponding buffer pool object according to the preset bullet screen width and the preset bullet screen length;
and generating corresponding texture information by the bullet screen bitmap according to the size of the area occupied by the bullet screen information by the buffer pool object, and storing the corresponding texture information into a texture array.
5. The bullet screen implementation method according to claim 2, wherein the step of extracting texture information from the texture array by using each sub-thread of the second thread, rendering the extracted texture information, and displaying the rendered texture information on the target layer comprises:
extracting texture information from the texture array in sequence by using each sub-thread of a second thread respectively to obtain a coordinate value of the bullet screen information contained in the texture information when the bullet screen information is on a screen;
rendering and displaying the extracted texture information in the target layer according to the coordinate values.
6. The bullet screen implementing method according to claim 5, wherein the coordinate values include vertex coordinate values of the bullet screen information, and the step of rendering and displaying the extracted texture information in the target layer according to the coordinate values includes:
passing the vertex coordinate values into a shader;
determining a rendering frame used for displaying the texture information in the target image layer according to the vertex coordinate value;
rendering and displaying the texture information in the target image layer by utilizing the shader based on the rendering frame.
7. The bullet screen implementation method according to claim 6, wherein the step of rendering and displaying the texture information in the target layer by using the shader based on the rendering frame comprises:
when the texture information is to be displayed on the screen, rendering the texture information by using the shader and displaying the rendered texture information in the determined rendering frame in the target layer; and
performing displacement updating on the rendering frame, and rendering the texture information by using the shader and displaying the rendered texture information in the displacement-updated rendering frame, so that the texture information moves across the target layer.
8. The bullet screen implementation method according to claim 7, wherein the step of performing displacement updating on the rendering frame, rendering the texture information by using the shader, and displaying the texture information in the displacement-updated rendering frame comprises:
performing displacement updating on the rendering frame according to a created model matrix, and rendering and displaying the texture information in the displacement-updated rendering frame by using the shader; and
updating the model matrix at a preset rendering frame rate, updating the most recently updated rendering frame according to the updated model matrix each time, and rendering the texture information by using the shader and displaying the texture information in the re-updated rendering frame, until the texture information disappears from the target layer.
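The model-matrix displacement of claims 7 and 8 can be sketched as a 4x4 translation matrix whose x-translation is advanced by one frame's worth of leftward motion at each tick of the preset frame rate, until the frame has left the layer. The per-frame step `speed / fps`, the leftward direction, and the function names are illustrative assumptions (a real renderer would pass the matrix to the shader as a uniform rather than loop in Python):

```python
def translation_matrix(tx: float, ty: float):
    """4x4 translation matrix as nested lists (row-major)."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]


def step_model_matrix(model, speed_px_per_s: float, fps: int):
    """Advance the x-translation by one frame of leftward motion."""
    model[0][3] -= speed_px_per_s / fps
    return model


def animate(frame_x: float, frame_w: float, speed: float = 120.0, fps: int = 60) -> int:
    """Update the model matrix each tick until the displaced frame has
    fully left the layer (right edge at or past x = 0); return the tick count."""
    model = translation_matrix(0.0, 0.0)
    ticks = 0
    while frame_x + model[0][3] + frame_w > 0:
        step_model_matrix(model, speed, fps)
        ticks += 1
    return ticks
```

At 120 px/s and 60 fps each tick moves the frame 2 px left, so a 10 px wide frame starting at x = 10 disappears after 10 ticks.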
9. The bullet screen implementation method according to claim 8, further comprising:
when it is detected that texture information has completed its rendering and displaying process on the target layer, setting the state of the model matrix used for the moving display of that texture information to an idle state, wherein a model matrix in the idle state is available for the moving display of other texture information.
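The idle-state reuse of claim 9 amounts to pooling model matrices behind a busy/idle flag; the class and method names below (`ModelMatrixPool`, `acquire`, `on_render_complete`) are assumed for illustration:

```python
class ModelMatrixPool:
    """Model matrices tagged busy/idle: when a texture finishes rendering on
    the layer, its matrix is marked idle and can serve another texture."""

    IDLE, BUSY = "idle", "busy"

    def __init__(self, size: int = 4):
        identity = lambda: [[float(i == j) for j in range(4)] for i in range(4)]
        self._entries = [{"state": self.IDLE, "matrix": identity()}
                         for _ in range(size)]

    def acquire(self):
        """Hand out an idle matrix for a texture's moving display (None if all busy)."""
        for entry in self._entries:
            if entry["state"] == self.IDLE:
                entry["state"] = self.BUSY
                entry["matrix"][0][3] = 0.0  # reset any leftover translation
                return entry
        return None

    def on_render_complete(self, entry) -> None:
        """Mark the matrix idle once its texture has left the target layer."""
        entry["state"] = self.IDLE
```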
10. A bullet screen implementation apparatus, applied to a user terminal, the apparatus comprising:
an information obtaining module, configured to obtain a plurality of pieces of bullet screen information to be processed;
a processing module, configured to process, for each piece of bullet screen information, the bullet screen information by using a first thread to obtain texture information corresponding to the bullet screen information, and store the texture information into a texture array; and
a rendering module, configured to extract texture information from the texture array by using each sub-thread of a second thread, render the extracted texture information, and display the rendered texture information on a target layer.
11. An electronic device, comprising a machine-readable storage medium and a processor, wherein the machine-readable storage medium stores machine-executable instructions, and the processor, when executing the machine-executable instructions, implements the bullet screen implementation method according to any one of claims 1 to 9.
12. A readable storage medium having machine-executable instructions stored therein, wherein the machine-executable instructions, when executed, implement the bullet screen implementation method according to any one of claims 1 to 9.
CN202010000484.5A 2020-01-02 2020-01-02 Bullet screen implementation method and device, electronic equipment and readable storage medium Active CN111131910B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010000484.5A CN111131910B (en) 2020-01-02 2020-01-02 Bullet screen implementation method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010000484.5A CN111131910B (en) 2020-01-02 2020-01-02 Bullet screen implementation method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111131910A true CN111131910A (en) 2020-05-08
CN111131910B CN111131910B (en) 2022-04-12

Family

ID=70507312

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010000484.5A Active CN111131910B (en) 2020-01-02 2020-01-02 Bullet screen implementation method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111131910B (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105597321A (en) * 2015-12-18 2016-05-25 武汉斗鱼网络科技有限公司 Barrage display method and system in full-screen game state
CN105684037A (en) * 2013-10-02 2016-06-15 微软技术许可有限责任公司 Graphics processing unit
CN105872679A (en) * 2015-12-31 2016-08-17 乐视网信息技术(北京)股份有限公司 Barrage display method and device
CN105939493A (en) * 2016-03-30 2016-09-14 广州华多网络科技有限公司 Video barrage display method and video barrage display device
CN106131643A (en) * 2016-07-13 2016-11-16 乐视控股(北京)有限公司 Barrage processing method, processing apparatus and electronic device
CN106534875A (en) * 2016-11-09 2017-03-22 广州华多网络科技有限公司 Barrage display control method and device and terminal
CN107092643A (en) * 2017-03-06 2017-08-25 武汉斗鱼网络科技有限公司 Barrage rendering method and device
CN107734373A (en) * 2017-10-12 2018-02-23 网易(杭州)网络有限公司 Barrage sending method and device, storage medium, electronic equipment
CN107770563A (en) * 2017-10-10 2018-03-06 武汉斗鱼网络科技有限公司 A kind of barrage message treatment method and device
CN108600852A (en) * 2018-04-28 2018-09-28 北京酷我科技有限公司 A kind of implementation method of barrage effect
CN109213607A (en) * 2017-06-30 2019-01-15 武汉斗鱼网络科技有限公司 A kind of method and apparatus of multithreading rendering
CN109600654A (en) * 2018-11-27 2019-04-09 Oppo广东移动通信有限公司 Barrage processing method, device and electronic equipment
US20190215557A1 (en) * 2016-06-29 2019-07-11 Nokia Technologies Oy Rendering of user-defined message having 3d motion information
CN110012306A (en) * 2019-04-02 2019-07-12 广州虎牙信息科技有限公司 Display methods, device, equipment and the storage medium of barrage
US20190392636A1 (en) * 2017-06-02 2019-12-26 Tencent Technology (Shenzhen) Company Limited Method and apparatus for displaying a bullet


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUAN Hao: "Research and Application of iPhone Rendering Technology Based on OpenGL ES", China Masters' Theses Full-text Database *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586489A (en) * 2020-06-22 2020-08-25 腾讯科技(深圳)有限公司 Barrage rendering method and device, computer equipment and storage medium
CN112149383A (en) * 2020-08-28 2020-12-29 杭州安恒信息技术股份有限公司 GPU-based text real-time layout method, electronic device and storage medium
CN112149383B (en) * 2020-08-28 2024-03-26 杭州安恒信息技术股份有限公司 Text real-time layout method based on GPU, electronic device and storage medium
CN113610699A (en) * 2021-07-19 2021-11-05 广州致远电子有限公司 Hardware layer rendering scheduling method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111131910B (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN111131910B (en) Bullet screen implementation method and device, electronic equipment and readable storage medium
CN109978972B (en) Method and device for editing characters in picture
US11972514B2 (en) Animation file processing method and apparatus, computer-readable storage medium, and computer device
US11620784B2 (en) Virtual scene display method and apparatus, and storage medium
CN107707965B (en) Bullet screen generation method and device
CN111277910B (en) Bullet screen display method and device, electronic equipment and storage medium
CN111078070B (en) PPT video barrage play control method, device, terminal and medium
CN111432262B (en) Page video rendering method and device
CN110784733B (en) Live broadcast data processing method and device, electronic equipment and readable storage medium
CN109903359B (en) Particle display method and device, mobile terminal and storage medium
CN107748688B (en) Information display method and device
WO2018166470A1 (en) Animation display method based on frame rate and terminal device
CN113411664B (en) Video processing method and device based on sub-application and computer equipment
CN109714627B (en) Comment information rendering method, device and equipment
WO2014036857A1 (en) Animation playing method, device and apparatus
CN111107427B (en) Image processing method and related product
CN113079408A (en) Video playing method, device and system
CN114282135A (en) Data carousel method and device, electronic equipment and storage medium
CN111477183B (en) Reader refresh method, computing device, and computer storage medium
CN110719493A (en) Barrage display method and device, electronic equipment and readable storage medium
CN112449230B (en) Character string display processing method, device, terminal and storage medium
CN109091866B (en) Display control method and device, computer readable medium and electronic equipment
CN115588064A (en) Video generation method and device, electronic equipment and storage medium
CN114449305A (en) Gift animation playing method and device in live broadcast room
CN113422914A (en) Video generation method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant