CN117611703A - Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product - Google Patents



Publication number
CN117611703A
Authority
CN
China
Prior art keywords
character
barrage
texture
distance field
vertex
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311638662.7A
Other languages
Chinese (zh)
Inventor
杨梓瀚
王文帅
陈建安
陈敏
周乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47: End-user applications
    • H04N 21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788: Supplemental services for communicating with other users, e.g. chatting
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47: End-user applications
    • H04N 21/488: Data services, e.g. news ticker
    • H04N 21/4884: Data services for displaying subtitles

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The application provides a bullet screen (barrage) character rendering method, device, equipment, storage medium and program product. The method comprises the following steps: determining whether a directed distance field texture of a barrage character to be rendered exists in a directed distance field texture set; when the directed distance field texture of the barrage character exists in the set, acquiring it from the set, and when it does not exist, generating the directed distance field texture of the barrage character; determining the texture coordinates of each vertex in the barrage character, and extracting vertex texture information for each vertex from the directed distance field texture based on those texture coordinates; and rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character. Through the method and the device, both the rendering efficiency and the rendering quality of barrage characters can be improved.

Description

Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a method, an apparatus, a device, a storage medium, and a program product for rendering barrage characters.
Background
With the rapid growth of the internet, users can express their personal views by publishing a barrage (an overlay of scrolling comments) while watching or listening to media information (e.g., video, audio), so barrage characters must be rendered and displayed. In the related art, barrage characters are rendered by using a canvas component provided by the operating system to call a system interface and draw a corresponding bitmap, yielding bitmap characters. Because each bullet screen character must have its bitmap drawn in real time, device resources are heavily occupied and wasted, playback may stutter, and the rendering quality is poor.
Disclosure of Invention
The embodiment of the application provides a barrage character rendering method, device, electronic equipment, computer readable storage medium and computer program product, which can improve barrage character rendering efficiency and rendering effect.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a bullet screen character rendering method, which comprises the following steps:
Determining whether a directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set;
acquiring the directed distance field texture from the directed distance field texture set when the directed distance field texture of the barrage character exists in the directed distance field texture set, and generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character does not exist in the directed distance field texture set;
determining texture coordinates of each vertex in the barrage character, and extracting vertex texture information of each vertex from the directed distance field texture based on the texture coordinates of each vertex;
and rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
The embodiment of the application also provides a device for rendering barrage characters, which comprises:
the determining module is used for determining whether the directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set;
the acquisition module is used for acquiring the directed distance field texture from the directed distance field texture set when the directed distance field texture of the barrage character exists in the directed distance field texture set, and generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character does not exist in the directed distance field texture set;
The extraction module is used for determining the texture coordinates of each vertex in the barrage character and extracting vertex texture information of each vertex from the directed distance field texture based on the texture coordinates of each vertex;
and the rendering module is used for rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
The embodiment of the application also provides electronic equipment, which comprises:
a memory for storing computer executable instructions;
and the processor is used for realizing the bullet screen character rendering method provided by the embodiment of the application when executing the computer executable instructions stored in the memory.
The embodiment of the application also provides a computer readable storage medium, which stores computer executable instructions, wherein the computer executable instructions realize the barrage character rendering method provided by the embodiment of the application when being executed by a processor.
The embodiment of the application also provides a computer program product, which comprises computer executable instructions, wherein the computer executable instructions realize the barrage character rendering method provided by the embodiment of the application when being executed by a processor.
The embodiment of the application has the following beneficial effects:
By applying the embodiment of the application, firstly, determining whether the directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set; when the directed distance field texture of the barrage character exists in the directed distance field texture set, the directed distance field texture is obtained from the directed distance field texture set, and when the directed distance field texture of the barrage character does not exist in the directed distance field texture set, the directed distance field texture of the barrage character is generated; then determining texture coordinates of each vertex in the barrage character, and extracting vertex texture information of each vertex from the directed distance field texture based on the texture coordinates of each vertex; and finally, rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
Here, the directed distance field textures of some barrage characters are stored in advance in the directed distance field texture set. When a barrage character is to be rendered, if its directed distance field texture exists in the set, the texture can be obtained directly from the set; if it does not exist, the texture is generated on the fly. Thus: 1) the device resources otherwise consumed by regenerating a character's directed distance field texture on every render are reduced, improving the rendering efficiency of barrage characters; 2) directed distance field textures missing from the set are compensated for by immediate generation. In this way, the rendering quality of barrage characters is improved.
Drawings
FIG. 1 is a schematic diagram of an architecture of a bullet character rendering system according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for rendering barrage characters according to an embodiment of the present disclosure;
FIG. 4 is a schematic display of rendered barrage characters provided in an embodiment of the present application;
FIG. 5 is a flowchart of a method for rendering barrage characters according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a character bitmap of a barrage character provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a process for determining directional distance field texture provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a process for determining directional distance field texture provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a process for determining directional distance field texture provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a process for determining directional distance field texture provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a process for determining directional distance field texture provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of a process for determining directional distance field texture provided by an embodiment of the present application;
FIG. 13 is a schematic illustration of a display of directed distance field textures provided by an embodiment of the present application;
Fig. 14 is a schematic diagram of vertices of barrage characters provided in an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the present application will be described in further detail with reference to the accompanying drawings, and the described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without making any inventive effort are within the scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the terms "first", "second", "third" and the like are merely used to distinguish similar objects and do not represent a specific ordering of the objects, it being understood that the "first", "second", "third" may be interchanged with a specific order or sequence, as permitted, to enable embodiments of the application described herein to be practiced otherwise than as illustrated or described herein.
In the present embodiment, the term "module" or "unit" refers to a computer program or a part of a computer program having a predetermined function, and works together with other relevant parts to achieve a predetermined object, and may be implemented in whole or in part by using software, hardware (such as a processing circuit or a memory), or a combination thereof. Also, a processor (or multiple processors or memories) may be used to implement one or more modules or units. Furthermore, each module or unit may be part of an overall module or unit that incorporates the functionality of the module or unit.
Unless defined otherwise, all technical and scientific terms used in the embodiments of the present application have the same meaning as commonly understood by one of ordinary skill in the art. The terminology used in the embodiments of the application is for the purpose of describing the embodiments of the application only and is not intended to be limiting of the application.
Before further describing embodiments of the present application in detail, the terms and expressions that are referred to in the embodiments of the present application are described, and are suitable for the following explanation.
1) Client side: applications running in the terminal for providing various services, such as a client supporting the playing of media information (e.g. audio, video).
2) In response to: indicates the condition or state on which a performed operation depends. When that condition or state is satisfied, the one or more operations performed may occur in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which multiple such operations are performed.
3) Directed distance field (Signed Distance Field, SDF): a bitmap that records, for each pixel, the distance to the nearest edge of a shape. The distance is positive if the pixel is inside the shape, negative if it is outside, and 0 if the pixel lies exactly on the edge.
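As a concrete illustration of this definition (a sketch for explanation, not the patent's claimed implementation), a brute-force SDF can be computed from a binary glyph bitmap by taking, for each pixel, the distance to the nearest pixel of the opposite state as a discrete approximation of the distance to the edge, signed positive inside and negative outside. The function name and the list-of-lists bitmap format are illustrative assumptions:

```python
import math

def signed_distance_field(bitmap):
    """Brute-force discrete SDF: for each pixel, the distance to the
    nearest pixel of the opposite state, positive inside the glyph
    (non-zero pixels), negative outside."""
    h, w = len(bitmap), len(bitmap[0])
    sdf = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            inside = bitmap[y][x]
            best = float("inf")
            for v in range(h):
                for u in range(w):
                    if bitmap[v][u] != inside:
                        best = min(best, math.hypot(u - x, v - y))
            if best == float("inf"):      # uniform bitmap: no edge at all
                best = float(max(w, h))
            sdf[y][x] = best if inside else -best
    return sdf
```

This O(n^2) scan is only meant to make the definition tangible; production generators use linear-time distance transforms instead.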
4) Static SDF atlas (i.e., the directed distance field texture set in this application): an atlas generated ahead of time with a bitmap-font tool, consisting of an SDF texture atlas and a character-information file. The SDF texture atlas stores the SDF bitmaps (SDF textures) of a number of characters. The bitmap format is RGBA, a color-space model with red, green, blue and Alpha channels; the distance information is typically stored in the Alpha channel. The character-information file records the position of each character appearing in the SDF bitmap, such as the character's width, height and coordinates within the bitmap. The advantage is that character SDF textures need not be generated at run time; the texture information of a character is simply read out of the SDF texture according to its character information.
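A minimal sketch of such a lookup, under the assumption that the character-information file has already been parsed into a dict keyed by Unicode code point and that the atlas is a 2D array; the field names (`x`, `y`, `width`, `height`) are illustrative, not taken from the patent:

```python
def lookup_char_region(char_info, atlas, ch):
    """Slice a character's SDF sub-bitmap out of the atlas using the
    position and size recorded in the character-information table."""
    info = char_info[ord(ch)]  # table keyed by Unicode code point
    x, y, w, h = info["x"], info["y"], info["width"], info["height"]
    return [row[x:x + w] for row in atlas[y:y + h]]
```

In a real renderer the same rectangle would more commonly be converted to normalized texture coordinates and sampled on the GPU rather than sliced on the CPU.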
5) Dynamic SDF atlas (i.e., the dynamic directed distance field texture set in this application): an atlas dynamically generated on the fly by the program from the characters entered. Like the static SDF atlas, it consists of an SDF texture atlas containing a number of characters plus a mapping table of their character information (the key is the character's Unicode code point; the value is that character's information). It is typically kept only in memory, to make dynamic addition and removal easy.
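One way such an in-memory mapping might be sketched in Python (the class, its method names, and the `generate_fn` callback are assumptions for illustration; real entries would hold the generated SDF texture and its packing position in the atlas):

```python
class DynamicSDFAtlas:
    """In-memory map: Unicode code point -> generated entry.
    Entries are created on first use and then reused; evict()
    supports the dynamic removal the term describes."""

    def __init__(self, generate_fn):
        self._entries = {}
        self._generate = generate_fn  # called once per new character

    def get(self, ch):
        key = ord(ch)
        if key not in self._entries:
            self._entries[key] = self._generate(ch)
        return self._entries[key]

    def evict(self, ch):
        self._entries.pop(ord(ch), None)
```

The get-or-generate shape mirrors the patent's step of fetching a texture from the set when present and generating it otherwise.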
6) Anti-aliasing: a technique that alleviates jagged graphic edges and smooths them by blending. Because of limited resolution, object edges always show some degree of stair-step jaggies; anti-aliasing softens the image edges so that they look smoother and closer to the real object.
7) Shader: a computer program originally used for shading images (computing illumination, brightness, color, etc.), but more recently also used for work in many other areas, such as CG special effects and film post-processing unrelated to shading, and even in some fields outside computer graphics. Unlike an ordinary program, a shader runs on the graphics processing unit (GPU). Shaders come in several types, such as vertex shaders and fragment shaders: a vertex shader transforms each vertex of a model into screen space (e.g., coordinate transformation), while a fragment shader computes the on-screen color of each covered pixel (e.g., illumination computation, texture sampling). Developers can modify a graphic's vertex information and pixel information through custom shaders to achieve different effects.
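To connect terms 3), 6) and 7): in an SDF approach, the fragment shader typically maps the sampled signed distance to an alpha value with a smoothstep function, which is what produces anti-aliased character edges. A Python sketch of that per-pixel computation (the smoothing half-width of 0.5 distance units is an illustrative assumption, not a value from the patent):

```python
def smoothstep(edge0, edge1, x):
    """GLSL-style smoothstep: 0 below edge0, 1 above edge1,
    smooth Hermite interpolation in between."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def sdf_coverage(distance, smoothing=0.5):
    """Map a signed distance (positive inside the glyph) to an alpha
    value, blending the edge over +/- `smoothing` units to avoid
    jaggies."""
    return smoothstep(-smoothing, smoothing, distance)
```

Pixels well inside the glyph get alpha 1, pixels well outside get 0, and pixels near the edge get a fractional alpha, which is exactly the softening described under anti-aliasing above.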
8) Bitmap font (or bitmap character): also known as a dot-matrix font, in which each glyph is represented by a set of two-dimensional pixel information. Being bitmaps, dot-matrix fonts are hard to scale: a given dot-matrix font displays clearly only at its corresponding font size; otherwise the characters are forcibly enlarged, which damages the glyphs and produces mosaic-like jagged edges.
9) Texture sampling: obtaining the color at the corresponding position in a texture map according to a fragment's texture coordinates. Texture sampling is similar to sampling an image: when an electronic device needs to render a texture onto a three-dimensional model surface, it obtains color values from the texture image based on the texture coordinates of each vertex of the model surface (i.e., the position of each vertex on the texture image). This process is called texture sampling. Texture coordinates lie on the x and y axes and range from 0 to 1. During texture sampling, the electronic device typically uses an interpolation algorithm to determine color values between texture coordinates.
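A common such interpolation algorithm is bilinear filtering. A minimal sketch, assuming the texture is a 2D array of scalar values (e.g., just the Alpha/distance channel, which is a simplifying assumption) and (u, v) are normalized coordinates in [0, 1]:

```python
def sample_bilinear(texture, u, v):
    """Sample a 2D texture at normalized (u, v), interpolating
    linearly between the four surrounding texels."""
    h, w = len(texture), len(texture[0])
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

On a GPU this interpolation is done by the texture unit; the Python version only illustrates what "determining color values between texture coordinates" means.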
Based on the above description of the terms and terminology involved in the embodiments of the present application, the embodiments of the present application will be described in detail below. The embodiment of the application provides a barrage character rendering method, device, electronic equipment, computer readable storage medium and computer program product, which can improve barrage character rendering efficiency and rendering effect.
It should be noted that, in the application of the present application, the relevant data collection process should strictly obtain the informed consent or the individual consent of the personal information body according to the requirements of the relevant laws and regulations, and develop the subsequent data use and processing actions within the authorized range of the laws and regulations and the personal information body.
The following describes a barrage character rendering system provided in an embodiment of the present application. Referring to fig. 1, fig. 1 is a schematic architecture diagram of a barrage character rendering system according to an embodiment of the present application. To enable support for one exemplary application, the bullet character rendering system 100 includes: server 200, network 300, and terminal 400. The terminal 400 is connected to the server 200 through the network 300, where the network 300 may be a wide area network or a local area network, or a combination of both, and the data transmission is implemented using a wireless or wired link.
Here, the terminal 400 (e.g., one running a client that supports playing media information such as audio and video) sends a rendering request for a barrage character to the server 200 in response to a publish instruction for that character. The server 200 receives the rendering request sent by the terminal 400 and, in response, determines whether a directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set; it acquires the texture from the set when it exists, and generates the directed distance field texture of the barrage character when it does not. The server then determines the texture coordinates of each vertex in the barrage character, extracts vertex texture information for each vertex from the directed distance field texture based on those texture coordinates, and returns the vertex texture information to the terminal 400. The terminal 400 receives the vertex texture information returned by the server 200, renders the barrage character based on it to obtain the rendered barrage character, and displays the rendered barrage character.
In some embodiments, the method for rendering barrage characters provided in the embodiments of the present application is implemented by an electronic device, for example, may be implemented by a terminal alone, may be implemented by a server alone, or may be implemented by a terminal and a server in cooperation. The embodiments of the present application may be applied to various scenarios including, but not limited to, cloud technology, artificial intelligence, intelligent transportation, assisted driving, audio-video, instant messaging, gaming, live broadcast, etc.
In some embodiments, the electronic device implementing the bullet screen character rendering method provided in the embodiments of the present application may be a terminal or a server of various types. The server (e.g., server 200) may be an independent physical server, or may be a server cluster or a distributed system formed by a plurality of physical servers. The terminal (e.g., terminal 400) may be, but is not limited to, a notebook computer, tablet computer, desktop computer, smart phone, smart voice interaction device (e.g., smart speaker), smart home appliance (e.g., smart television), smart watch, vehicle-mounted terminal, wearable device, virtual Reality (VR) device, aircraft, etc. The terminal and the server may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiment of the present application.
In some embodiments, the barrage character rendering method provided in the embodiments of the present application may be implemented by means of cloud technology. Cloud technology refers to a hosting technology that unifies a series of resources, such as hardware, software and networks, in a wide area network or a local area network to realize computation, storage, processing and sharing of data. It is a general term for the network, information, integration, management-platform and application technologies applied under the cloud computing business model; these resources can form a resource pool and be used flexibly on demand. Cloud computing will become an important supporting technology, since the background services of networked systems require large amounts of computing and storage resources. As an example, a server (e.g., server 200) may also be a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communications, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), and big-data and artificial-intelligence platforms.
In some embodiments, the terminal or the server may implement the barrage character rendering method provided in the embodiments of the present application by running various computer executable instructions or computer programs. For example, the computer-executable instructions may be commands at the micro-program level, machine instructions, or software instructions. The computer program may be a native program or a software module in an operating system; a local (Native) Application (APP), i.e. a program that needs to be installed in an operating system to run, such as a client supporting playing of media information; or an applet that can be embedded in any APP, i.e., a program that can be run only by being downloaded into the browser environment. In general, the computer-executable instructions may be any form of instructions and the computer program may be any form of application, module, or plug-in.
The electronic device for implementing the bullet screen character rendering method provided by the embodiment of the application is described below. Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 500 provided in the embodiment of the present application may be a terminal or a server. As shown in fig. 2, the electronic device 500 includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in electronic device 500 are coupled together by bus system 540. It is appreciated that the bus system 540 is used to enable connected communications between these components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to the data bus. The various buses are labeled as bus system 540 in fig. 2 for clarity of illustration.
The processor 510 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor (e.g., a microprocessor or any conventional processor), a digital signal processor (Digital Signal Processor, DSP), another programmable logic device, discrete gate or transistor logic, or a discrete hardware component.
The user interface 530 includes one or more output devices 531 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 530 also includes one or more input devices 532, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Memory 550 may include one or more storage devices physically located away from processor 510. Memory 550 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a random access Memory (Random Access Memory, RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
network communication module 552 is used to reach other electronic devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 include: bluetooth, wireless compatibility authentication (WiFi), and universal serial bus (Universal Serial Bus, USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating a peripheral device and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
the input processing module 554 is configured to detect one or more user inputs or interactions from one of the one or more input devices 532 and translate the detected inputs or interactions.
In some embodiments, the barrage character rendering device provided in the embodiments of the present application may be implemented in a software manner, and fig. 2 shows a barrage character rendering device 555 stored in a memory 550, which may be software in the form of a program and a plug-in, and includes the following software modules: the determining module 5551, the acquiring module 5552, the extracting module 5553 and the rendering module 5554 are logical, and thus may be arbitrarily combined or further split according to the implemented functions, the functions of each module will be described below.
The following describes a barrage character rendering method provided in the embodiment of the present application. As described above, the method for rendering barrage characters provided in the embodiments of the present application is implemented by an electronic device, for example, may be implemented by a server or a terminal alone, or implemented by a server and a terminal cooperatively. The execution subject of each step will not be repeated hereinafter. Referring to fig. 3, fig. 3 is a flowchart of a method for rendering barrage characters according to an embodiment of the present application, where the method for rendering barrage characters according to the embodiment of the present application includes:
step 101: a determination is made as to whether a directed distance field texture of the barrage character to be rendered exists in the set of directed distance field textures.
In step 101, for a barrage character to be rendered, it is determined whether a directed distance field texture of the barrage character exists in the directed distance field texture set. Here, the directed distance field texture set is pre-constructed and covers a plurality of target barrage characters. A directed distance field texture, also called a directed distance field texture map (or directed distance field bitmap), is in RGBA format; RGBA is a color space model whose channels represent red, green, blue, and alpha, and the distance information of the directed distance field texture is stored in the alpha channel. Barrage characters may include, but are not limited to, characters used in barrages such as text, emoticons, kaomoji, specific graphics, and pictures.
In some embodiments, before determining whether a directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set, the following steps may also be performed: determining a plurality of target barrage characters for the barrage; generating a target directed distance field texture for each target barrage character; and constructing the directed distance field texture set based on the target directed distance field textures.
Here, when constructing the directed distance field texture set, a plurality of target barrage characters for the barrage are first determined. For example, a target barrage character may be a barrage character whose frequency of use reaches a frequency threshold, a barrage character associated with recently played audio or video, a special character that may be used but is not used frequently, and so on. Then, a target directed distance field texture is generated for each target barrage character, and the directed distance field texture set is constructed from these textures. In practical applications, character information of the target barrage character may be stored in association with each target directed distance field texture; the character information may include the size of the target barrage character (including its height and width), position information of the target barrage character in the texture, and the like, where the position information may be the vertex coordinates (x, y) of one or more vertices of the target barrage character in the texture.
In some embodiments, the directed distance field texture of a target barrage character may be generated by performing the following steps: generating a character bitmap of the target barrage character; determining the nearest distance between each pixel point in the character bitmap and the character edge of the target barrage character; normalizing the nearest distance of each pixel point to obtain a normalization result for each pixel point; and filling the transparency (alpha) channel value of each pixel point with its normalization result and filling the red, green, and blue (RGB) channel values of each pixel point with 1, to obtain the directed distance field texture of the target barrage character.
Here, a character bitmap of the target barrage character is first generated, for example by invoking the system drawing interface. The nearest distance between each pixel point in the character bitmap and the character edge of the target barrage character is then determined, for example through a directed distance field algorithm, and normalized to obtain a normalization result for each pixel point. Finally, the alpha channel value of each pixel point is filled with its normalization result and the RGB channel values are filled with 1, yielding the directed distance field texture of the target barrage character.
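The generation steps above can be sketched as follows. This is a minimal brute-force illustration, not the patent's implementation: the bitmap is a hand-written 0/1 grid rather than one drawn by a system interface, and the normalization (mapping the edge to alpha 0.5 within an assumed `spread` radius) is one common convention that the text does not pin down.

```python
from math import hypot

def sdf_texture(bitmap, spread=2.0):
    """Build an RGBA directed-distance-field texture from a binary glyph bitmap.

    RGB channels are filled with 1.0; the alpha channel stores the normalized
    nearest distance to the character edge (0.5 lying on the edge itself).
    `spread` is an assumed normalization radius in pixels.
    """
    h, w = len(bitmap), len(bitmap[0])
    texture = []
    for y in range(h):
        row = []
        for x in range(w):
            inside = bitmap[y][x] == 1
            # nearest pixel of the opposite state approximates distance to the edge
            nearest = min(
                (hypot(x - x2, y - y2)
                 for y2 in range(h) for x2 in range(w)
                 if (bitmap[y2][x2] == 1) != inside),
                default=spread,
            )
            signed = nearest if inside else -nearest
            alpha = min(1.0, max(0.0, 0.5 + 0.5 * signed / spread))
            row.append((1.0, 1.0, 1.0, alpha))  # RGB = 1, distance in alpha
        texture.append(row)
    return texture

glyph = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
tex = sdf_texture(glyph)
```

Inside pixels end up with alpha above 0.5 and outside pixels below it, which is what lets a sampler reconstruct a smooth edge at any scale.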
It should be noted that the directed distance field texture set may be updated according to a preset update condition: for example, at intervals, directed distance field textures whose frequency of use is lower than a threshold may be screened out; new directed distance field textures may be added according to recently played audio and video; new directed distance field textures may also be added based on barrage characters recently entered by the user; and so on.
Step 102: when the directed distance field texture of the barrage character exists in the directed distance field texture set, the directed distance field texture is obtained from the set; when it does not exist, the directed distance field texture of the barrage character is generated.
In step 102, if it is determined that the directed distance field texture of the barrage character exists in the directed distance field texture set, the texture may be obtained directly from the set so that the barrage character can be rendered based on it in subsequent steps; if the texture does not exist in the set, the directed distance field texture of the barrage character is generated on the fly. In practical applications, character information of the barrage character may be stored in association with the generated texture; the character information may include the size of the barrage character (including its height and width), position information of the barrage character in the texture, and the like, where the position information may be the vertex coordinates (x, y) of one or more vertices of the barrage character in the texture.
Steps 101-102 take into account that the barrage characters involved in barrage scenes are often varied, so the embodiments of the present application combine dynamic and static characters: a directed distance field texture set of target barrage characters (corresponding to static characters) is built in advance, and if it contains the directed distance field texture of the barrage character to be rendered, that texture is used directly, reducing the device resources consumed by regenerating the texture each time, lowering the performance cost of barrage character rendering, and improving rendering efficiency; if the set lacks the texture, the directed distance field texture of the barrage character (corresponding to a dynamic character) is generated in real time, compensating for missing static characters and improving the rendering success rate. In this way, rendering efficiency and rendering success rate are balanced, the performance cost of barrage character rendering is reduced, and the occupation of device resources is decreased.
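The static/dynamic lookup of steps 101-102 amounts to a two-level cache. The sketch below uses plain dictionaries keyed by character as a stand-in for the atlases; the function and parameter names are illustrative, not from the patent.

```python
def get_sdf_texture(char, static_set, dynamic_set, generate):
    """Steps 101-102: return a cached directed distance field texture when
    one exists; otherwise generate it on the fly and cache it in the
    dynamic set for reuse."""
    if char in static_set:          # pre-built texture (static character)
        return static_set[char]
    if char not in dynamic_set:     # missing: generate once (dynamic character)
        dynamic_set[char] = generate(char)
    return dynamic_set[char]
```

Repeated requests for the same dynamic character invoke the (expensive) generator only once, which is the resource saving the text describes.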
In some embodiments, when the directed distance field texture of the barrage character does not exist in the directed distance field texture set, the following steps may be performed after generating the texture: determining whether a free region exists in a dynamic directed distance field texture set; when a free region exists, adding the directed distance field texture to the free region; when no free region exists, generating a target directed distance field texture set whose capacity is larger than that of the dynamic set, and adding both the new directed distance field texture and the dynamic directed distance field textures in the dynamic set to the target set.
Here, the directed distance field texture generated on the fly for the barrage character may be stored in a dynamic directed distance field texture set for subsequent reuse. Specifically, it is first determined whether a free region exists in the dynamic set; if so, the texture is stored there directly; if not, a target directed distance field texture set with a larger capacity is generated, and the new texture together with the original dynamic directed distance field textures is added to the target set. In this way, the dynamic directed distance field texture set is expanded.
In some embodiments, after adding the directed distance field texture and the dynamic directed distance field textures to the target set, the following steps may also be performed: adding the dynamic directed distance field texture set to a queue of texture sets to be destroyed; then determining the number of components that still render based on the dynamic set; and destroying the dynamic set when that number reaches zero.
Here, since the original dynamic directed distance field textures have been moved to the new target set, the dynamic directed distance field texture set can be added to the queue of texture sets to be destroyed. For each set in this queue, the number of components still performing barrage character rendering based on it can be checked in real time or periodically; when the number reaches zero, the set can be destroyed to release storage space.
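The expansion and destroy-queue logic can be sketched as follows. The slot-count free-region check, the doubling growth factor, and the `ref_count` field are assumptions for illustration; the patent only specifies the observable behavior (expand when full, destroy once no component still renders from the old set).

```python
class DynamicAtlas:
    """Minimal stand-in for a dynamic directed distance field texture set."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.textures = []      # directed distance field textures stored here
        self.ref_count = 0      # components currently rendering from this set

    def has_free_region(self):
        return len(self.textures) < self.capacity

def store_texture(atlas, texture, destroy_queue):
    """Add `texture`; expand into a larger target set when no region is free."""
    if atlas.has_free_region():
        atlas.textures.append(texture)
        return atlas
    target = DynamicAtlas(atlas.capacity * 2)       # larger-capacity target set
    target.textures = atlas.textures + [texture]    # move old textures + new one
    destroy_queue.append(atlas)                     # old set awaits destruction
    return target

def purge(destroy_queue):
    """Destroy queued sets once no component renders based on them."""
    destroy_queue[:] = [a for a in destroy_queue if a.ref_count > 0]
```

Deferring destruction behind a reference check prevents freeing a texture set that an in-flight render still samples from.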
In some embodiments, the directed distance field texture of the barrage character may be generated by performing the following steps: generating a character bitmap of the barrage character; determining the nearest distance between each pixel point in the character bitmap and the character edge of the barrage character; normalizing the nearest distance of each pixel point to obtain a normalization result for each pixel point; and filling the alpha channel value of each pixel point with its normalization result and the RGB channel values with 1, to obtain the directed distance field texture of the barrage character.
Here, a character bitmap of the barrage character is first generated, for example by invoking the system drawing interface. The nearest distance between each pixel point in the character bitmap and the character edge of the barrage character is then determined, for example through a directed distance field algorithm, and normalized to obtain a normalization result for each pixel point. Finally, the alpha channel value of each pixel point is filled with its normalization result and the RGB channel values are filled with 1, yielding the directed distance field texture of the barrage character. In practical applications, character information of the barrage character may be stored in association with the generated texture; the character information may include the size of the barrage character (its height and width), position information of the barrage character in the texture, and the like, where the position information may be the vertex coordinates (x, y) of one or more vertices of the barrage character in the texture.
In some embodiments, after the alpha channel value of each pixel point in the character bitmap has been filled with the normalization result of its nearest distance, the barrage character may be rendered based on the vertex texture information of each vertex by performing the following steps: determining the target pixel points of the barrage character whose alpha channel value exceeds a transparency threshold; and rendering those target pixel points based on the vertex texture information of each vertex, to obtain the rendered barrage character.
Here, a transparency threshold (for example, 0.1) may be preset, and the target pixel points of the barrage character whose alpha channel value exceeds the threshold are determined; when rendering the barrage character, only these target pixel points are rendered based on the vertex texture information of each vertex, yielding the rendered barrage character. This improves the rendering efficiency of barrage characters.
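The threshold test can be sketched as a simple filter over the RGBA texture (in a fragment shader this would correspond to discarding fragments below the threshold; the list-of-tuples texture layout here is just for illustration):

```python
def target_pixels(texture, threshold=0.1):
    """Collect the (x, y) positions of pixels whose directed-distance-field
    alpha exceeds the transparency threshold; only these target pixels are
    rendered. The 0.1 default mirrors the example threshold in the text."""
    return [(x, y)
            for y, row in enumerate(texture)
            for x, (r, g, b, a) in enumerate(row)
            if a > threshold]
```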
Step 103: texture coordinates of each vertex in the barrage character are determined, and vertex texture information of each vertex is extracted from the directed distance field texture based on the texture coordinates of each vertex.
In step 103, after the directional distance field texture of the barrage character is obtained, the texture coordinates of each vertex in the barrage character in the directional distance field texture are determined, so that vertex texture information of each vertex can be extracted from the directional distance field texture based on the texture coordinates of each vertex for rendering the barrage character in the subsequent step.
In some embodiments, the texture coordinates of each vertex in the barrage character may be determined by performing the following steps: obtaining the vertex coordinates of a target character vertex of the target character in the directed distance field texture and the character size of the target character; obtaining the texture size of the directed distance field texture; determining the vertex coordinates of each character vertex of the target character based on the target character vertex coordinates, the texture size, and the character size; and, for each character vertex, taking its vertex coordinates as the texture coordinates of the corresponding vertex in the barrage character.
Here, the vertex coordinates of a target character vertex of the target character in the directed distance field texture, and the character size of the target character, are first acquired. It should be noted that the target character is the character used when the directed distance field texture was generated: it has the same content as the barrage character (i.e., it is the same character), but their sizes may differ; for example, the font size of the target character used when generating the texture may be 24, while the font size of the barrage character to be rendered may be 32. Since character information of the target character is usually stored when its directed distance field texture is generated, including the size of the target character (its height and width) and its position information in the texture (for example, the vertex coordinates (x, y) of one or more of its vertices in the texture), the vertex coordinates of a target character vertex (for example, the top-left vertex) and the character size of the target character can be extracted from that character information.
Next, the texture size of the directed distance field texture (its width and height) is obtained; it may likewise be stored when the directed distance field texture of the target character is generated. The vertex coordinates of each character vertex of the target character can then be determined from the target character vertex coordinates, the texture size, and the character size. For example, if the target character vertex is the top-left vertex of the target character with coordinates (x, y), and the character size is width × height, then the top-right character vertex is (x + width, y), the bottom-left character vertex is (x, y + height), and the bottom-right character vertex is (x + width, y + height). The texture size obtained here is used to ensure that all computed character vertex coordinates lie on the directed distance field texture. Finally, for each character vertex, its vertex coordinates are taken as the texture coordinates of the corresponding vertex in the barrage character.
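The corner computation can be sketched as follows. Dividing by the texture size to normalize coordinates into [0, 1] is an assumption consistent with standard texture sampling; the patent itself only says the texture size keeps the computed coordinates on the texture.

```python
def char_texture_coords(origin, char_size, tex_size):
    """Texture coordinates of a glyph's four corners inside the atlas.

    `origin` is the stored top-left vertex (x, y) of the target character,
    `char_size` its (width, height), and `tex_size` the atlas (width, height)
    used to normalize each corner into the [0, 1] sampling range.
    """
    x, y = origin
    w, h = char_size
    tw, th = tex_size
    corners = {
        "top_left":     (x,     y),
        "top_right":    (x + w, y),
        "bottom_left":  (x,     y + h),
        "bottom_right": (x + w, y + h),
    }
    return {name: (cx / tw, cy / th) for name, (cx, cy) in corners.items()}
```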
Step 104: the barrage character is rendered based on the vertex texture information of each vertex, to obtain the rendered barrage character.
In step 104, after obtaining the vertex texture information of each vertex in the barrage character, rendering the barrage character based on the vertex texture information of each vertex to obtain a rendered barrage character.
In some embodiments, before rendering the barrage character based on the vertex texture information of each vertex, the vertex coordinates of each vertex may first be determined; the barrage character is then rendered based on both the vertex coordinates and the vertex texture information of each vertex, to obtain the rendered barrage character.
Here, when rendering the barrage character, the vertex coordinates of each vertex in the barrage character must be determined in order to know at which position each vertex is rendered. In some embodiments, the vertex coordinates of each vertex may be determined by performing the following steps: obtaining a first size of the barrage character, a second size of the target character in the directed distance field texture, and the vertex coordinates of each character vertex of the target character; and determining the size ratio of the first size to the second size, then determining the vertex coordinates of each vertex based on the character vertex coordinates and the size ratio.
It should be noted that the target character is the character used when the directed distance field texture was generated: it has the same content as the barrage character (i.e., it is the same character), but their sizes may differ; for example, the font size of the target character may be 24, while the font size of the barrage character to be rendered may be 32. Therefore, when determining the vertex coordinates of each vertex in the barrage character, the first size of the barrage character is obtained, together with the second size of the target character in the directed distance field texture and the vertex coordinates of each character vertex of the target character; the size ratio of the first size to the second size is then determined, and the vertex coordinates of each vertex are determined from the character vertex coordinates and the size ratio, for example by multiplying the character vertex coordinates by the size ratio to obtain the coordinates of the corresponding vertex.
In this way, by computing the ratio between the first size of the barrage character to be rendered and the second size of the target character in the directed distance field texture used to render it, the rendered barrage character can be reduced or enlarged. Because the directed distance field texture stores distance information (the nearest distance from each pixel point of the character bitmap to the character edge), the distance information of new pixel points produced by enlargement is obtained by interpolation during texture sampling, so lossless enlargement is possible; and scaling the barrage character only requires computing the size ratio and the vertex coordinates, adding no extra overhead.
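The size-ratio scaling described above amounts to a single multiply per vertex; a minimal sketch (function and parameter names are illustrative):

```python
def scaled_vertices(char_vertices, first_size, second_size):
    """Scale stored character vertex coordinates to the render size.

    `first_size` is the size of the barrage character to be rendered and
    `second_size` the size of the target character in the directed distance
    field texture; multiplying each vertex by their ratio resizes the glyph
    without regenerating any texture.
    """
    ratio = first_size / second_size   # e.g. size 32 rendered from a size-24 glyph
    return [(x * ratio, y * ratio) for x, y in char_vertices]
```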
Rendering the barrage character based on the vertex coordinates and vertex texture information of each vertex then yields the rendered barrage character. For example, the vertex texture of each vertex may be drawn at its vertex coordinates; the texture information of a pixel point between any two vertices can be obtained by interpolating the vertex texture information of those two vertices, and the coordinates of that pixel point can likewise be obtained by interpolating the two vertex coordinates. The interpolated texture information is then used to draw the pixel point texture at the corresponding pixel point coordinates.
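The between-vertex interpolation can be sketched as follows (a GPU performs this per-fragment during rasterization; the explicit loop, the single-alpha texture info, and the step count `n` are simplifications for illustration):

```python
def pixels_between(v0, v1, n):
    """Linearly interpolate both the coordinates and the texture info of the
    `n` pixel points between two vertices. Each vertex is ((x, y), alpha),
    where alpha stands in for the vertex texture information."""
    (x0, y0), a0 = v0
    (x1, y1), a1 = v1
    out = []
    for i in range(1, n + 1):
        t = i / (n + 1)   # evenly spaced interpolation parameter in (0, 1)
        out.append(((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t),
                    a0 + (a1 - a0) * t))
    return out
```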
In some embodiments, before rendering the barrage character based on the vertex texture information of each vertex, the following steps may also be performed: obtaining the vertex color of each vertex and the special effect information of the barrage character. The barrage character may then be rendered by: drawing the character texture of the barrage character based on the vertex coordinates and vertex texture information of each vertex; coloring the character texture based on the vertex colors to obtain an intermediate barrage character; and applying special effect processing to the intermediate barrage character based on the special effect information, to obtain the rendered barrage character.
Here, when rendering the barrage character, the vertex colors of its vertices and its special effect information are also acquired. On this basis, the character texture of the barrage character is first drawn based on the vertex coordinates and vertex texture information of each vertex. For example, the vertex texture of each vertex may be drawn at its vertex coordinates; the texture information and coordinates of a pixel point between any two vertices can be obtained by interpolating the corresponding values at those two vertices, and the interpolated texture information is used to draw the pixel point texture at the corresponding pixel point coordinates.
Then, the character texture is colored based on the vertex colors to obtain the intermediate barrage character. For example, the character texture at each vertex may be color-modified based on that vertex's color; the color of a pixel point between any two vertices is obtained by interpolating the colors of those two vertices, and is used to modify the color of the character texture at the corresponding pixel point.
Finally, special effect processing is applied to the intermediate barrage character based on the special effect information of the barrage character, to obtain the rendered barrage character. Special effects may include, for example, a stroke (outline) effect, a gradient effect, a shadow effect, and so on.
In some embodiments, there are a plurality of barrage characters arranged in sequence from a start position to an end position, and the special effect of the barrage characters is a gradient from the start position to the end position. Based on the special effect information, the intermediate barrage characters may be processed by performing the following steps: determining the character vertex coordinates of the vertices of each barrage character in the arrangement; acquiring the start color at the start position of the barrage character located at the start position, and the end color at the end position of the barrage character located at the end position; determining the character special effect color of each vertex of each barrage character based on its character vertex coordinates, the start color, and the end color; and taking the character special effect color of each vertex of each barrage character as the special effect color of the corresponding vertex of its intermediate barrage character, then applying special effect processing to each intermediate barrage character based on those colors, to obtain each rendered barrage character.
Here, the special effect processing procedure for the gradient effect is described. The character vertex coordinates of the vertices of each barrage character in the arrangement are first determined. For example, suppose the plurality of barrage characters is arranged horizontally and the occupied region has width W and height H. Starting from the 1st barrage character, the height h and width w of the barrage character are obtained, and the top-left vertex of the 1st barrage character is taken as the origin (0, 0) of the coordinate axes; the vertex coordinates are then normalized by the region size. Thus the character vertex coordinates of the top-left vertex of the 1st barrage character are (0, 0), those of its top-right vertex are ((x-direction distance of the top-right vertex from the origin) / W, 0), those of its bottom-left vertex are (0, (y-direction distance of the bottom-left vertex from the origin) / H), and those of its bottom-right vertex are ((x-direction distance of the top-right vertex from the origin) / W, (y-direction distance of the bottom-left vertex from the origin) / H).
Then, the start color at the start position of the barrage character located at the start position, and the end color at the end position of the barrage character located at the end position, are acquired. In practical applications this means obtaining the color value of the start color and the color value of the end color, where the start position is in fact one vertex of a barrage character and the end position is another. The character special effect color of each vertex of each barrage character is then determined from its character vertex coordinates, the start color, and the end color: specifically, the special effect color of every vertex between the two vertices corresponding to the start and end positions is obtained by interpolating between the start color and the end color. It should be noted that the vertices of the barrage characters and of the intermediate barrage characters correspond one to one, so the character special effect color of each vertex of each barrage character can be used as the special effect color of the corresponding vertex of its intermediate barrage character, and special effect processing is applied to each intermediate barrage character on that basis, to obtain each rendered barrage character.
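The per-vertex gradient interpolation can be sketched as follows, assuming normalized x coordinates for a horizontal gradient (the distance from the run's origin divided by its total width W, as in the coordinate computation above); RGB-tuple colors and the function name are illustrative.

```python
def gradient_vertex_colors(vertex_xs, start_color, end_color):
    """Per-vertex special effect colors for a horizontal gradient.

    `vertex_xs` holds normalized x coordinates in [0, 1]; each color
    channel is linearly interpolated between the start color (at 0)
    and the end color (at 1)."""
    def lerp(a, b, t):
        return a + (b - a) * t
    return [tuple(lerp(s, e, t) for s, e in zip(start_color, end_color))
            for t in vertex_xs]
```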
By applying the embodiments of the present application: first, it is determined whether the directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set; when it exists, the texture is obtained from the set, and when it does not, the directed distance field texture of the barrage character is generated; then, the texture coordinates of each vertex in the barrage character are determined, and the vertex texture information of each vertex is extracted from the directed distance field texture based on those texture coordinates; finally, the barrage character is rendered based on the vertex texture information of each vertex, to obtain the rendered barrage character.
Here, the directed distance field textures of some barrage characters are pre-stored in the directed distance field texture set. When a barrage character is rendered, if its directed distance field texture exists in the set, the texture can be obtained directly from the set; if it does not exist in the set, the directed distance field texture of the barrage character is generated. Thus, 1) the device resources consumed by generating a directed distance field texture on every render are reduced, improving the rendering efficiency of barrage characters; 2) for directed distance field textures missing from the set, on-the-fly generation acts as a fallback, improving the rendering effect of barrage characters.
An exemplary application of the embodiments of the present application in a practical application scenario is described below. In the related art, barrage characters are drawn (rendered) by having a canvas component provided by the operating system call a system interface to draw the corresponding bitmap characters, which is simple to implement. However, since the drawn characters are bitmap characters, obvious edge jaggies often appear when a character is scaled, and the rendering effect of the barrage characters is poor.
Based on this, the embodiment of the application provides a barrage character rendering method to at least solve the above-mentioned problems. In the embodiment of the application, aiming at the differences among bullet screen rendering schemes on different platforms (Android/iOS/Mac/Windows/Web), a new bullet screen rendering scheme is provided that smooths out the differences among multi-terminal bullet screen rendering schemes and optimizes the performance cost of bullet screen scaling. A text rendering component based on the directed distance field is realized, which: (1) draws the barrage and solves the edge-jaggy problem; as shown in (1) in fig. 4, the upper characters "AB" are barrage characters drawn by the application, with no edge jaggies, while the lower characters "AB" are bitmap characters, with obvious edge jaggies. (2) Supports the coexistence of a static atlas (the SDF texture and the positions of characters in the SDF texture, pre-generated from the input characters) and a dynamic atlas (the SDF texture and character positions generated dynamically at runtime from the characters actually needed); the static SDF atlas makes character loading fast, while the dynamic atlas solves the problem of missing characters. (3) The characters in the atlas are related only to the font, not to the character size; modifying the font size of barrage characters therefore does not require regenerating SDF textures for the new size, minimizing the memory and CPU occupied by textures. (4) Supports image-text mixed layout; as shown in (2) in fig. 4, mixed layout of characters with operation-configured images and emoji expressions is supported. (5) Supports dynamic atlas expansion, and supports character effects such as lossless enlargement, outlining, shadowing, outer glow and gradients; for the gradient character effect shown in (3) in fig. 4, three gradient modes (horizontal, vertical and mixed) with custom gradient colors are supported. (6) Reduces memory and CPU occupation.
The following detailed description is given. Referring to fig. 5, the barrage character rendering method provided in the embodiment of the present application includes: 1. Determine whether the barrage character to be rendered (e.g., barrage character A in the Song typeface with font size 32) is present in the static SDF atlas; if not, execute step 2; if so, execute step 5. 2. Invoke the rendering interface of the operating system (e.g., Android, Web, Windows, iOS, Mac) to generate a character bitmap based on the target character used to generate the SDF texture (e.g., target character A in the Song typeface with font size 24). 3. Generate the SDF texture of the barrage character based on the character bitmap. 4. Store the SDF texture of the barrage character in the dynamic SDF atlas. 5. Obtain the SDF texture and character information from the SDF atlases, including the static SDF atlas and the dynamic SDF atlas. 6. Calculate the scaling value between the font size of the barrage character to be rendered and the font size of the target character in the SDF texture. 7. Calculate the vertex coordinates of each vertex in the rendered barrage character. 8. Write vertex data such as vertex coordinates, SDF textures, character information, vertex colors and character special-effect information into the shader. 9. The shader performs the rendering of the barrage characters.
Fig. 5 above describes the process of drawing a barrage character A of font size 32 using the Song typeface. At drawing time, it is checked whether the barrage character is already present in the static SDF atlas. If so, the SDF texture and character information of the barrage character are obtained directly from the static SDF atlas. If not, a drawing interface of the operating system is called at runtime to generate a character bitmap of the barrage character, the SDF texture of the barrage character is then generated based on the character bitmap, and the character information (character height, character width, etc.) is stored in the dynamic SDF atlas. Since the font size used when generating the SDF texture of the barrage character (24) may not match the font size of the barrage character to be rendered (32), a scaling value needs to be calculated. The size of the barrage character is then recomputed using this ratio to determine the vertex coordinates of each vertex in the barrage character, and these vertex coordinates are filled into the character's vertex data. The SDF texture, vertex data and character special-effect information are submitted to the GPU for rendering. During rendering, the shader samples the corresponding texture from the SDF texture according to the texture coordinates of the vertices; scales the character texture according to the vertex coordinates of the vertices; modifies the color of the character texture according to the vertex colors; and applies special-effect processing such as outlining and gradients to the barrage characters according to the character special-effect information. Finally, the desired barrage character effect is rendered.
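The scaling-value and vertex-coordinate computation just described can be sketched as follows. This is a minimal illustration: the field names and the layout convention (a pen position plus per-character offsets) are assumptions, not the patent's exact data structures.

```python
def scaled_quad(char_info, atlas_font_size, target_font_size, pen_x, pen_y):
    # Scaling value between the barrage character's font size and the
    # font size used when the SDF texture was generated (e.g. 32 / 24).
    scale = target_font_size / atlas_font_size
    w = char_info["width"] * scale
    h = char_info["height"] * scale
    x = pen_x + char_info["xoffset"] * scale
    y = pen_y + char_info["yoffset"] * scale
    # Vertex order: top-left, top-right, bottom-left, bottom-right.
    return [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]

# Character 'A' stored in the atlas at font size 24, rendered at font size 32.
quad = scaled_quad({"width": 18, "height": 24, "xoffset": 1, "yoffset": 2},
                   atlas_font_size=24, target_font_size=32, pen_x=100, pen_y=50)
```

Because only the quad is rescaled while the same SDF texture is sampled, changing the font size never touches the atlas.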
It should be noted that, since the SDF texture stores distance information, the GPU interpolates the distance data for new pixel points when sampling the texture of an enlarged character, so lossless enlargement can be achieved. Moreover, scaling a character only requires recalculating the character size, without adding extra overhead.
(One) illustrates the generation of SDF textures. (1) Generating a character bitmap: a character bitmap of the barrage character is drawn using the text-drawing method provided by each operating system, e.g., the fillText() method of Canvas. The character bitmap is a black-and-white image; fig. 6 shows the character bitmap of barrage character A. (2) Based on the character bitmap, the SDF texture of the barrage character is obtained through SDF algorithm processing.
The SDF algorithm is described here. The SDF algorithm essentially calculates the distance from a pixel to the nearest object edge; in the context of characters, this is the distance from a pixel in the character picture to the nearest character edge (the edge of a stroke). To reduce time consumption, the SDF distances can be calculated in linear time (computation time grows linearly with picture size) using the 8SSEDT (8-points Signed Sequential Euclidean Distance Transform) algorithm, a Euclidean distance transform (EDT). The core idea of 8SSEDT is that the nearest distance of a pixel point can be derived from the already-computed nearest distances of its neighbors in eight directions. For each pixel, a 3x3 template covering the current pixel point and its 8 surrounding pixel points is used, which suffices to judge whether the current pixel point lies on an edge and how far it is from one. The distance from each pixel point to the nearest contour line is obtained by traversing its surrounding pixel points and taking the minimum of (the known nearest distance of a surrounding pixel point + the distance from the current pixel point to that surrounding pixel point).
Specifically, the traversal of the eight directions is split into two passes (to ensure that the values of the neighbors in the corresponding directions have already been calculated). (1) PASS0: traverse row by row from the upper-left corner, each time computing over the four upper-left directions. (2) PASS1: traverse row by row from the lower-right corner, each time computing over the four lower-right directions. The traversal from upper-left to lower-right yields each pixel point's nearest distance over its upper-left half, and the scan from lower-right to upper-left yields each pixel point's nearest distance over its lower-right half; combined, these give each pixel point's omnidirectional nearest distance to the contour line. As shown in fig. 7, in the first traversal the distance x between the pixel and each of the pixels to its left, upper-left, upper and upper-right is calculated, and the minimum of (x + that pixel's distance to the character edge) is taken. In the second traversal, the distance x between the pixel and each of the pixels to its right, lower-left, lower and lower-right is calculated, and again the minimum of (x + that pixel's distance to the character edge) is taken.
For example, as shown in fig. 8 (1), there is a 5x5 (width by height) black-and-white image in which black pixels are marked 1 and white pixels are marked 0. Two tables are established to determine whether a pixel point lies inside or outside the barrage character (the black region). (1) One table is used to calculate distances from outside the character to the character edge: the distance of each white pixel point is initialized to 0, and the distance of each black pixel point to infinity. This table is denoted table A. (2) The other table is used to calculate distances from the character edge into the character interior: the distance of each black pixel point is initialized to 0, and the distance of each white pixel point to infinity. This table is denoted table B.
The initial state of table a is as follows:
0   0   ∞   ∞   0
0   ∞   ∞   ∞   ∞
0   0   ∞   ∞   0
0   ∞   ∞   0   ∞
∞   0   0   0   0
Start the first calculation: traverse from the upper-left corner, left to right and top to bottom, each time taking the minimum of (the distance of a neighbor in one of the four upper-left directions to the character edge + the distance between the current point and that neighbor). Specifically, a coordinate system is established on table A with the origin at the upper-left corner of table A, the horizontal x-axis positive to the right and the vertical y-axis positive downward. Assuming the pixel point a to be calculated has coordinates (x, y), the distances of the four pixel points (x-1, y), (x-1, y-1), (x, y-1) and (x+1, y-1) to the character edge are taken, the sum of each of those distances and the distance between pixel point a and the corresponding pixel point is computed, and the minimum is kept. Table A after the first calculation is as follows:
0   0   1   2   0
0   1   √2  √2  1
0   0   1   2   0
0   1   √2  0   1
1   0   0   0   0
Start the second calculation: traverse from the lower-right corner, right to left and bottom to top, each time taking the minimum of (the distance of a neighbor in one of the four lower-right directions to the character edge + the distance between the pixel point to be calculated and that neighbor). Table A after the second calculation is as follows:
0   0   1   1   0
0   1   √2  √2  1
0   0   1   1   0
0   1   1   0   1
1   0   0   0   0
The initial state of table B is as follows:
∞   ∞   0   0   ∞
∞   0   0   0   0
∞   ∞   0   0   ∞
∞   0   0   0   ∞
0   ∞   ∞   ∞   ∞
Similarly, the calculation is performed in two passes. First, traverse row by row from the upper-left corner, left to right and top to bottom, each time taking the minimum of (the distance of a neighbor in one of the four upper-left directions to the character edge + the distance between the pixel point to be calculated and that neighbor). Table B after the first calculation is as follows:
∞     ∞   0   0   1
∞     0   0   0   0
√2    1   0   0   1
1+√2  0   0   1   0
0     1   1   √2  1
Then traverse from the lower-right corner, right to left and bottom to top, each time taking the minimum of (the distance of a neighbor in one of the four lower-right directions to the character edge + the distance between the pixel point to be calculated and that neighbor). Table B after the second calculation is as follows:
√2  1   0   0   1
1   0   0   0   0
√2  1   0   0   1
1   0   0   1   0
0   1   1   √2  1
Subtracting table B from table A gives the result shown in fig. 8 (2). The region composed of positive numbers is the character region of the original picture (i.e., the 5x5 black-and-white image): a positive number represents the distance from a pixel inside the character to the character edge, and a negative number the distance from a pixel outside the character to the character edge.
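The two-pass computation and the final subtraction above can be sketched in Python. This is a simplified scalar (chamfer-style) version of 8SSEDT that propagates distances rather than offset vectors, which is enough to reproduce the 5x5 worked example:

```python
import math

INF = float("inf")

# PASS0 neighbours: left, up-left, up, up-right, as (dx, dy)
PASS0 = ((-1, 0), (-1, -1), (0, -1), (1, -1))
# PASS1 neighbours: right, down-right, down, down-left
PASS1 = ((1, 0), (1, 1), (0, 1), (-1, 1))

def _sweep(dist, coords, offsets):
    h, w = len(dist), len(dist[0])
    for y, x in coords:
        for dx, dy in offsets:
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                # neighbour's distance to the edge + distance to that neighbour
                cand = dist[ny][nx] + math.hypot(dx, dy)
                if cand < dist[y][x]:
                    dist[y][x] = cand

def distance_table(bitmap, seed):
    """Seed pixels whose value equals `seed` with 0, the rest with infinity,
    then run the upper-left and lower-right sweeps."""
    h, w = len(bitmap), len(bitmap[0])
    dist = [[0.0 if v == seed else INF for v in row] for row in bitmap]
    _sweep(dist, [(y, x) for y in range(h) for x in range(w)], PASS0)
    _sweep(dist, [(y, x) for y in range(h - 1, -1, -1)
                         for x in range(w - 1, -1, -1)], PASS1)
    return dist

def signed_distances(bitmap):
    a = distance_table(bitmap, 0)   # table A: white pixels seeded with 0
    b = distance_table(bitmap, 1)   # table B: black pixels seeded with 0
    return [[av - bv for av, bv in zip(ra, rb)] for ra, rb in zip(a, b)]

bitmap = [  # the 5x5 example: 1 = black (inside the character), 0 = white
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0],
    [0, 1, 1, 0, 1],
    [1, 0, 0, 0, 0],
]
sdf = signed_distances(bitmap)  # positive inside the character, negative outside
```

Running this reproduces the tables above, e.g. the first row of table A after both sweeps is 0 0 1 1 0.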
However, since pixel points are discrete, the character edge does not necessarily pass exactly through the pixel points. As shown in fig. 9, the pixels near the edge of the circle have varying gray values. If only 0 were used as the threshold in rendering, with pixels whose gray value is greater than 0 rendered and the others discarded, the edge of the actual result would show many jaggies. Chinese characters have many strokes, so many characters would suffer from this. Therefore, antialiasing needs to be performed on the generated SDF texture so that the calculated distance reflects the actual distance as closely as possible, ensuring that the character edges are as smooth as possible. In practical implementation, antialiasing is performed for the following two cases:
(1) As shown in fig. 10, if an edge passes through a pixel horizontally or vertically, the gray value of the pixel determines the distance corresponding to the pixel: d_f = 0.5 - b, where d_f is the distance and b is the gray value of the pixel point.
(2) If the contour line crosses the pixel points diagonally, as shown in fig. 11, a gradient based on this contour line needs to be calculated. The three squares shown in fig. 11 represent three cases in which the contour line passes diagonally through the pixel point. The shaded and unshaded portions have different gray values, and the middle parallelogram region can be regarded as a stroke of the character. The three cases are expressed by the following conditions on b (the gray value of the pixel): b < b_1; b_1 < b < b_1 + b_2; b_1 + b_2 ≤ b < 1.
as shown in fig. 12, several constants are defined, including: b 1 : the edge passes through the area on the left side of the pixel point and passes through the point on the extreme edge of the pixel point; b 2 : a middle region. The following constants can be found from fig. 12:
b 2 =1-2b 1 ; (2)
thus d f Can be obtained by the following formula:
wherein g x And g y The gradients of the pixel points in the x-direction and the y-direction, respectively. In practical implementations, g may be calculated separately using the 3x3 Isotopic operator x And g y
After substituting the formula (5)D can be obtained f I.e. the closest distance sought. Thus, the nearest distance corresponding to each pixel point in the character bitmap is calculated. After the nearest distance from each pixel point to the nearest character edge is obtained through an SDF algorithm, the nearest distance is converted into a value of 0 to 1, and the value is stored in an alpha channel of the picture, namely, the transparency, and other channels are filled with 1. The effect is thus shown in FIG. 13, where the character is white and the edges have a diffuse, blurry shade, indicating that the alpha value is progressively smaller (indicating that it is progressively farther from the edge of the character).
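The patent only states that the nearest distance is converted to a 0-1 value stored in the alpha channel. A common way to do this (a hypothetical sketch; the clamp range `spread` is an assumed parameter, not from the text) is a linear remap centered on the edge:

```python
def distance_to_alpha(d, spread=4.0):
    # d: signed distance in pixels, positive inside the character.
    # Clamp to [-spread, +spread] so that alpha 0.5 falls exactly on the edge.
    t = (d + spread) / (2.0 * spread)
    return max(0.0, min(1.0, t))

def sdf_texel(d):
    # The RGB channels are filled with 1; the distance lives in alpha.
    return (1.0, 1.0, 1.0, distance_to_alpha(d))
```

With this encoding, alpha shrinks smoothly as a pixel moves away from the character, matching the blurred shading described for fig. 13.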
During rendering, the vertex data of each character is submitted to the GPU. Each character image is a rectangle composed of two triangles, as shown in fig. 14; a character thus has four vertices: upper-left, upper-right, lower-left and lower-right. The vertex data of each vertex may include the vertex's position information (vertex coordinates), texture information, pixel color, and the like. During rendering, the coordinates of the 4 vertices of the barrage character within the whole SDF texture are calculated from the x, y, width and height in the character information together with the height and width of the whole SDF texture; these are recorded as texture coordinates and, together with the vertex colors (character colors) and the like, passed to the GPU as vertex data. When rendering on the GPU, the character's region on the SDF texture is determined from the texture coordinates of the four vertices, texture sampling is performed to obtain the texture information, and effects such as outlining and shadowing are superimposed and processed in the fragment shader. Specifically, an alpha threshold may be set for rendering; for example, pixels with alpha values greater than the alpha threshold (e.g., 0.1) are rendered and the others discarded.
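The texture-coordinate computation and the alpha-threshold test just described can be sketched as follows (the character-rectangle fields follow the character-information layout used throughout this text; the sample numbers are illustrative):

```python
def char_texcoords(x, y, width, height, atlas_w, atlas_h):
    # Normalized coordinates of the character's rectangle inside the SDF atlas,
    # computed from its top-left position and size in pixels.
    u0, v0 = x / atlas_w, y / atlas_h
    u1, v1 = (x + width) / atlas_w, (y + height) / atlas_h
    # Vertex order: top-left, top-right, bottom-left, bottom-right.
    return [(u0, v0), (u1, v0), (u0, v1), (u1, v1)]

def passes_alpha_test(alpha, threshold=0.1):
    # Fragment-shader style test: keep only pixels above the alpha threshold.
    return alpha > threshold

# A 32x32 character stored at (64, 0) in a 512x512 SDF atlas.
uv = char_texcoords(64, 0, 32, 32, 512, 512)
```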
(II) description of the combination of a static SDF atlas and a dynamic SDF atlas.
If SDF textures were generated dynamically from characters entirely at runtime, then when a large number of first-seen characters are rendered for the first time, the SDF textures for all of them would have to be generated, occupying considerable CPU resources and causing visible stuttering. The fonts and commonly used characters can therefore be fed, on the electronic device, into bitmap-font generating software (e.g., Hiero) to pre-generate the SDF texture and a text file of character information.
The text file of character information is a JSON string containing the information of each character, as follows:
Here size represents the font size of the barrage characters used when drawing the SDF texture, and padding represents the spacing between adjacent characters in the SDF texture atlas. scaleW represents the width of the SDF texture atlas and scaleH its height. charData is an array containing the information of every character in the SDF texture. id is the Unicode code point of the character; x and y are the vertex coordinates of the character's top-left vertex in the SDF texture; width and height are the width and height of the character. xoffset and yoffset represent by how many pixel points the character needs to be shifted right or down, respectively, when rendering. xadvance represents the distance the current position must advance after the character is drawn (i.e., the starting position of the next character). When the program is initialized, the generated SDF texture and the character-information text file are loaded into the program and parsed, so the texture data of each character is known.
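The JSON itself is not reproduced in the text above; the following hypothetical file matches the fields just described (all values are made up for illustration), parsed the way the initialization step describes:

```python
import json

raw = """{
  "size": 24, "padding": 2, "scaleW": 512, "scaleH": 512,
  "charData": [
    {"id": 65, "x": 0, "y": 0, "width": 18, "height": 24,
     "xoffset": 1, "yoffset": 2, "xadvance": 20}
  ]
}"""

info = json.loads(raw)
# Index the characters by Unicode code point for lookup at render time.
chars = {c["id"]: c for c in info["charData"]}
a = chars[ord("A")]  # 'A' has code point 65
```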
When a character not present in the static SDF atlas is encountered, the SDF texture and character information for that character are generated dynamically. First, the operating system's character-drawing interface is called to draw the character bitmap of the character, and the character information of the character bitmap is stored. Then, SDF algorithm processing is performed to obtain the SDF texture of the character from the character bitmap, and the SDF texture is added to a free area of the dynamic SDF atlas. If the dynamic SDF atlas is full, a texture atlas of larger size is generated, the data on the old texture atlas is copied to the new one, and the just-generated SDF texture is added to the new texture atlas; the old texture atlas is added to a queue of atlases to be destroyed. Each dynamic atlas has a count indicating how many text components use this dynamic atlas to render text. When a text component is destroyed, the dynamic atlas it holds is released and the atlas count is decremented by one. When the count of a dynamic atlas reaches 0 and it is in the queue to be destroyed, the texture of the dynamic atlas is destroyed.
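The growth-and-destroy bookkeeping described above can be sketched as follows; the class and method names are illustrative assumptions, and the actual texture copying is elided:

```python
class DynamicAtlas:
    def __init__(self, size):
        self.size = size             # square texture, size x size
        self.count = 0               # how many text components use this atlas
        self.pending_destroy = False

class AtlasManager:
    def __init__(self, initial_size=256):
        self.current = DynamicAtlas(initial_size)
        self.destroy_queue = []

    def grow(self):
        # Atlas is full: allocate a larger one, copy the old data over
        # (copy elided here), and queue the old atlas for destruction.
        old = self.current
        self.current = DynamicAtlas(old.size * 2)
        old.pending_destroy = True
        self.destroy_queue.append(old)
        return self.current

    def release(self, atlas):
        # Called when a text component holding this atlas is destroyed.
        atlas.count -= 1
        if atlas.count == 0 and atlas in self.destroy_queue:
            self.destroy_queue.remove(atlas)  # the texture is destroyed here

mgr = AtlasManager()
first = mgr.current
first.count += 1   # one component renders with the first atlas
mgr.grow()         # full: 256 -> 512, old atlas queued for destruction
mgr.release(first) # last user gone: the queued atlas is destroyed
```

Deferring destruction until the count reaches zero keeps in-flight components rendering from the old atlas while new characters land in the larger one.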
(III) describes the image-text mixed layout. As shown in fig. 4 (2), barrage elements such as pictures and expressions are handled in the same way as characters, and the mixed image-text layout of the bullet screen can be achieved using the same rendering approach.
(IV) describes support for the gradient-color special effect. The basic idea of the gradient is as follows: the color values of the left and right ends of the text are determined; if, for example, the left color is pink and the right color is yellow, the color values of the pixels of the characters in between are obtained by linear interpolation. Specifically, the left (start) color is split into its four RGBA channels, and the right (end) color likewise; for each of the four channels a linear function is constructed (the left color's channel value as the ordinate of the start point, with the corresponding pixel's coordinate position as the abscissa, and similarly for the right color at the end point). Each intermediate pixel then substitutes its position into the linear function to obtain the color values of the four channels, which are combined into the pixel's final color.
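The per-channel linear interpolation just described can be sketched as follows (the pink and yellow RGBA values are illustrative assumptions):

```python
def lerp_color(start_rgba, end_rgba, t):
    # Interpolate each of the four RGBA channels independently;
    # t is the pixel's normalized position between start and end, in [0, 1].
    return tuple(s + (e - s) * t for s, e in zip(start_rgba, end_rgba))

pink   = (1.0, 0.75, 0.8, 1.0)  # assumed left (start) color
yellow = (1.0, 1.0, 0.0, 1.0)   # assumed right (end) color
middle = lerp_color(pink, yellow, 0.5)
```

On the GPU the same computation happens implicitly: per-vertex colors are interpolated across the triangle by the rasterizer.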
For example, if two characters are arranged in order in the same picture, the top-left corner of the first character "yes" is (0, y) and the top-left corner of the second character "me" might be (0.3, y), as shown in fig. 4 (3). However, since this solution stores all characters on one texture in order to save memory and optimize performance, the texture coordinates of each character are calculated relative to the atlas texture. A character's vertex coordinates therefore do not reflect the character's position within the whole piece of text: for the first character "yes" above, the top-left corner may be (0.5, 0.3), while the top-left corner of the second character "me" may be (0.2, 0.4). With these, the color value of the character "me" cannot be calculated accurately. Therefore, the normalized coordinates of each vertex of each character must be calculated separately.
By way of example, assume the piece of text shown in fig. 4 (3) has a width of 540 and a height of 40, and the first character "yes" has a width of 30 and a height of 40. With the coordinate origin in the upper-left corner, the normalized coordinates of the four vertices of the character "yes" are calculated as follows. Upper-left vertex: (0, 0), coinciding with the origin. Upper-right vertex: (0.06, 0), since the x-distance of the upper-right corner from the origin divided by the width of the whole piece of text = 30/540 = 0.0555… ≈ 0.06. Lower-left vertex: (0, 1), since the y-distance of the character bottom from the origin divided by the height of the text = 40/40 = 1. Lower-right vertex: (0.06, 1).
After the first character is calculated, the second is calculated, and so on until the normalized coordinates of the vertices of all characters are complete. Using these normalized coordinates, the color value of each vertex can be interpolated in the fragment shader from the color values of the left and right ends of the text; the color value of any pixel point between two vertices is then interpolated by the GPU from the color values of those two vertices.
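The normalized-coordinate computation of the worked example (text 540 wide and 40 high, first character 30 wide) can be sketched as follows; the assumption that the character spans the full text height and sits at y = 0 is taken from the example:

```python
def normalized_vertices(char_x, char_w, char_h, text_w, text_h):
    # Coordinates relative to the whole piece of text, origin at its
    # top-left corner; the character is assumed to start at the top (y = 0).
    x0, x1 = char_x / text_w, (char_x + char_w) / text_w
    y1 = char_h / text_h
    return {"tl": (x0, 0.0), "tr": (x1, 0.0), "bl": (x0, y1), "br": (x1, y1)}

v = normalized_vertices(char_x=0, char_w=30, char_h=40, text_w=540, text_h=40)
```

Feeding these per-vertex coordinates into the gradient's linear functions yields each vertex color, with the GPU interpolating the pixels in between.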
By applying the embodiment of the application, a text rendering component based on the directed distance field can be realized which (1) draws the barrage and solves the edge-jaggy problem; (2) supports the coexistence of a static atlas (the SDF texture and the positions of characters in the SDF texture, pre-generated from the input characters) and a dynamic atlas (the SDF texture and character positions generated dynamically at runtime from the characters actually needed), where the static SDF atlas makes character loading fast and the dynamic atlas solves the problem of missing characters; (3) keeps the characters in the atlas related only to the font and not to the character size, so modifying the font size of barrage characters does not require regenerating SDF textures for the new size, minimizing the memory and CPU occupied by textures; (4) supports image-text mixed layout; and (5) supports dynamic atlas expansion and character effects such as lossless enlargement, outlining, shadowing, outer glow and gradients.
Continuing with the description below of an exemplary structure of the barrage character rendering device 555 provided in embodiments of the present application implemented as a software module, in some embodiments, as shown in fig. 2, the software module stored in the barrage character rendering device 555 of the memory 550 may include: a determining module 5551, configured to determine whether there is a directed distance field texture of the barrage character to be rendered in the directed distance field texture set; an acquisition module 5552 for acquiring the directed distance field texture from the directed distance field texture set when the directed distance field texture of the barrage character is present in the directed distance field texture set, and generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character is not present in the directed distance field texture set; an extraction module 5553, configured to determine texture coordinates of each vertex in the barrage character, and extract vertex texture information of each vertex from the directed distance field texture based on the texture coordinates of each vertex; and the rendering module 5554 is configured to render the barrage character based on vertex texture information of each vertex, so as to obtain a rendered barrage character.
In some embodiments, the determining module 5551 is further configured to determine a plurality of target barrage characters for the barrage before the determining whether there is a directed distance field texture for the barrage characters to be rendered in the directed distance field texture set; generating target directional distance field textures of each target barrage character; and constructing the directional distance field texture set based on each target directional distance field texture.
In some embodiments, the obtaining module 5552 is further configured to determine, after the generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character does not exist in the directed distance field texture set, whether a free area exists in a dynamic directed distance field texture set; when the dynamic directional distance field texture set has an idle region, adding the directional distance field texture into the idle region of the dynamic directional distance field texture set; when the dynamic directional distance field texture set does not have a free area, generating a target directional distance field texture set, wherein the capacity of the target directional distance field texture set is larger than that of the dynamic directional distance field texture set; and adding the directional distance field texture and the dynamic directional distance field texture in the dynamic directional distance field texture set into the target directional distance field texture set.
In some embodiments, the obtaining module 5552 is further configured to add the dynamic directed distance field texture set to a texture set queue to be destroyed after the directed distance field texture and the dynamic directed distance field textures in the dynamic directed distance field texture set are added to the target directed distance field texture set; accordingly, the acquiring module 5552 is further configured to determine a component number of components that render based on the dynamic directed distance field texture set; and destroy the dynamic directed distance field texture set when the component number is zero.
In some embodiments, the obtaining module 5552 is further configured to generate a character bitmap of the barrage character; determining the nearest distance between each pixel point in the character bitmap and the character edge of the barrage character; normalizing the nearest distance corresponding to each pixel point to obtain a normalization result corresponding to each pixel point; and filling the transparency channel value of each pixel point as the normalization result, and filling the red, green and blue (RGB) channel value of each pixel point as 1 to obtain the directional distance field texture of the barrage character.
In some embodiments, the rendering module 5554 is further configured to determine a target pixel point of the bullet screen character having a transparency channel value higher than a transparency threshold; and rendering the target pixel points of the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
In some embodiments, the extracting module 5553 is further configured to obtain vertex coordinates of a target character vertex of a target character in the directed distance field texture, and a character size of the target character; obtaining the texture size of the directional distance field texture; determining vertex coordinates of each character vertex in the target character based on the vertex coordinates of the target character vertex, the texture size, and the character size; and determining the vertex coordinates of the character vertices as texture coordinates of vertices corresponding to the character vertices in the barrage character according to the vertex coordinates of the character vertices.
In some embodiments, the rendering module 5554 is further configured to determine vertex coordinates of each vertex before rendering the barrage character based on the vertex texture information of each vertex to obtain a rendered barrage character; correspondingly, the rendering module 5554 is further configured to render the barrage character based on the vertex coordinates of each vertex and the vertex texture information of each vertex, so as to obtain a rendered barrage character.
In some embodiments, the rendering module 5554 is further configured to acquire a first size of the barrage character, a second size of the target character in the directed distance field texture, and vertex coordinates of each character vertex in the target character; and determine a size ratio of the first size to the second size, and determine the vertex coordinates of each vertex based on the vertex coordinates of each character vertex and the size ratio.
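The size-ratio step above scales the stored character's vertex coordinates up or down to the barrage character's display size. A minimal sketch, assuming uniform scaling and scalar sizes; the function name is an assumption.

```python
# Hedged sketch: derive barrage-character vertex coordinates by scaling the
# stored character's vertices by first_size / second_size.
def scale_vertices(vertices, first_size, second_size):
    ratio = first_size / second_size   # size ratio of barrage char to stored char
    return [(x * ratio, y * ratio) for x, y in vertices]
```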
In some embodiments, the rendering module 5554 is further configured to, before the barrage character is rendered based on the vertex texture information of each vertex to obtain the rendered barrage character, obtain the vertex color of each vertex and the special effect information of the barrage character; correspondingly, the rendering module 5554 is further configured to draw a character texture of the barrage character based on the vertex coordinates of each vertex and the vertex texture information of each vertex; perform color drawing on the character texture based on the vertex colors of the vertices to obtain an intermediate barrage character; and perform special effect processing on the intermediate barrage character based on the special effect information of the barrage character to obtain the rendered barrage character.
In some embodiments, the number of barrage characters is a plurality, and the barrage characters are arranged sequentially from a starting position to an ending position; the special effect of the barrage characters is a gradient color special effect from the starting position to the ending position; the rendering module 5554 is further configured to determine character vertex coordinates of the vertices in each barrage character when the barrage characters are arranged; acquire a starting color at the starting position of the barrage character located at the starting position and an ending color at the ending position of the barrage character located at the ending position; determine the character special effect color of each vertex in each barrage character based on the character vertex coordinates of the vertices in each barrage character, the starting color, and the ending color; and take the character special effect color of each vertex in each barrage character as the character special effect color of the corresponding vertex in the intermediate barrage character corresponding to the barrage character, and perform special effect processing on the intermediate barrage character corresponding to each barrage character based on the character special effect colors, to obtain each rendered barrage character.
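One natural reading of the gradient special effect above is per-vertex linear interpolation between the starting color and the ending color, weighted by each vertex's position along the arranged barrage string. The linear blend and the use of only the x coordinate are illustrative assumptions, not details fixed by the patent.

```python
# Hedged sketch: compute per-vertex gradient colors for a barrage string whose
# characters are arranged from start_x to end_x. Colors are RGB tuples in [0, 1].
def gradient_vertex_colors(vertex_xs, start_x, end_x, start_color, end_color):
    span = end_x - start_x
    colors = []
    for x in vertex_xs:
        t = (x - start_x) / span if span else 0.0   # normalized position along the string
        colors.append(tuple(s + (e - s) * t
                            for s, e in zip(start_color, end_color)))
    return colors
```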
It should be noted that the description of the device embodiments in this application is similar to that of the foregoing method embodiments, and the device embodiments have beneficial effects similar to those of the method embodiments, which are therefore not repeated here. For technical details not disclosed in the barrage character rendering device provided in the embodiments of this application, refer to the description in the foregoing method embodiments.
Embodiments of the present application also provide a computer program product comprising computer-executable instructions stored in a computer-readable storage medium. A processor of an electronic device reads the computer-executable instructions from the computer-readable storage medium and executes them, causing the electronic device to perform the barrage character rendering method provided in the embodiments of the present application.
The present embodiments also provide a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, cause the processor to perform the method for rendering barrage characters provided by the embodiments of the present application.
In some embodiments, the computer-readable storage medium may be a memory such as a RAM, a ROM, a flash memory, a magnetic surface memory, an optical disc, or a CD-ROM, or may be any device that includes one of, or any combination of, the foregoing memories.
In some embodiments, computer-executable instructions may be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, in the form of programs, software modules, scripts, or code, and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, computer-executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, such as in one or more scripts in a hypertext markup language (Hyper Text Markup Language, HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, computer-executable instructions may be deployed to be executed on one electronic device or on multiple electronic devices located at one site or, alternatively, on multiple electronic devices distributed across multiple sites and interconnected by a communication network.
The foregoing descriptions are merely exemplary embodiments of the present application and are not intended to limit the protection scope of the present application. Any modification, equivalent replacement, or improvement made within the spirit and scope of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A method of rendering barrage characters, the method comprising:
determining whether a directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set;
acquiring the directed distance field texture from the directed distance field texture set when the directed distance field texture of the barrage character exists in the directed distance field texture set, and generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character does not exist in the directed distance field texture set;
determining texture coordinates of each vertex in the barrage character, and extracting vertex texture information of each vertex from the directed distance field texture based on the texture coordinates of each vertex;
and rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
2. The method of claim 1, wherein prior to determining whether there is a directed distance field texture for the barrage character to be rendered in the set of directed distance field textures, the method further comprises:
determining a plurality of target barrage characters for the barrage;
generating a target directed distance field texture of each target barrage character;
and constructing the directed distance field texture set based on each target directed distance field texture.
3. The method of claim 1, wherein after the generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character does not exist in the directed distance field texture set, the method further comprises:
determining whether a free area exists in the dynamic directed distance field texture set;
when the dynamic directed distance field texture set has a free area, adding the directed distance field texture into the free area of the dynamic directed distance field texture set;
when the dynamic directed distance field texture set does not have a free area, generating a target directed distance field texture set, wherein a capacity of the target directed distance field texture set is larger than a capacity of the dynamic directed distance field texture set;
and adding the directed distance field texture and the dynamic directed distance field texture in the dynamic directed distance field texture set into the target directed distance field texture set.
4. The method of claim 3, wherein after the adding the directed distance field texture and the dynamic directed distance field texture in the dynamic directed distance field texture set into the target directed distance field texture set, the method further comprises:
adding the dynamic directed distance field texture set into a texture set queue to be destroyed;
the method further comprises the steps of:
determining a component number of components that render based on the dynamic directed distance field texture set;
and destroying the dynamic directed distance field texture set when the number of components is zero.
5. The method of claim 1, wherein said generating a directed distance field texture for said barrage character comprises:
generating a character bitmap of the barrage character;
determining the nearest distance between each pixel point in the character bitmap and the character edge of the barrage character;
normalizing the nearest distance corresponding to each pixel point to obtain a normalization result corresponding to each pixel point;
and filling the transparency channel value of each pixel point with the corresponding normalization result, and filling the red, green, and blue (RGB) channel values of each pixel point with 1, to obtain the directed distance field texture of the barrage character.
6. The method of claim 5, wherein the rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character comprises:
determining a target pixel point with a transparency channel value higher than a transparency threshold value in the pixel points of the barrage character;
and rendering the target pixel points of the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
7. The method of claim 1, wherein said determining texture coordinates for each vertex in said barrage character comprises:
obtaining vertex coordinates of a target character vertex of a target character in the directed distance field texture and a character size of the target character;
obtaining the texture size of the directed distance field texture;
determining vertex coordinates of each character vertex in the target character based on the vertex coordinates of the target character vertex, the texture size, and the character size;
and determining, based on the vertex coordinates of each character vertex, the texture coordinates of the vertex corresponding to the character vertex in the barrage character.
8. The method of claim 1, wherein the method further comprises, prior to rendering the barrage character based on vertex texture information for each of the vertices to obtain a rendered barrage character: determining vertex coordinates of each vertex;
rendering the barrage character based on vertex texture information of each vertex to obtain a rendered barrage character, wherein the method comprises the following steps:
And rendering the barrage character based on the vertex coordinates of the vertexes and the vertex texture information of the vertexes to obtain the rendered barrage character.
9. The method of claim 8, wherein said determining vertex coordinates for each of said vertices comprises:
acquiring a first size of the barrage character, a second size of a target character in the directed distance field texture, and vertex coordinates of each character vertex in the target character;
and determining the size ratio of the first size to the second size, and determining the vertex coordinates of the vertexes based on the vertex coordinates of the vertexes of the characters and the size ratio.
10. The method of claim 8, wherein the method further comprises, prior to rendering the barrage character based on vertex texture information for each of the vertices to obtain a rendered barrage character: obtaining the vertex color of each vertex and the special effect information of the barrage character;
rendering the barrage character based on the vertex coordinates of the vertexes and the vertex texture information of the vertexes to obtain a rendered barrage character, wherein the method comprises the following steps:
Drawing character textures of the barrage characters based on vertex coordinates of the vertexes and vertex texture information of the vertexes;
performing color drawing on the character textures based on the vertex colors of the vertices to obtain intermediate barrage characters;
and carrying out special effect processing on the intermediate barrage characters based on the special effect information of the barrage characters to obtain the rendered barrage characters.
11. The method of claim 10, wherein the number of barrage characters is a plurality, and the plurality of barrage characters are arranged sequentially from a start position to an end position; the special effect of the barrage character is a gradient color special effect from a starting position to an ending position;
and performing special effect processing on the intermediate barrage character based on the special effect information of the barrage character to obtain the rendered barrage character, wherein the method comprises the following steps:
determining character vertex coordinates of the vertices in each barrage character when the barrage characters are arranged;
acquiring a starting color of a starting position of the barrage character positioned at the starting position and an ending color of an ending position of the barrage character positioned at the ending position;
determining character special effect colors of the vertexes in the barrage characters based on character vertex coordinates of the vertexes in the barrage characters, the starting colors and the ending colors;
and taking the character special effect color of each vertex in each barrage character as the character special effect color of the corresponding vertex in the intermediate barrage character corresponding to the barrage character, and performing special effect processing on the intermediate barrage character corresponding to each barrage character based on the character special effect colors of the vertices of the barrage character, to obtain each rendered barrage character.
12. A barrage character rendering device, the device comprising:
the determining module is used for determining whether the directed distance field texture of the barrage character to be rendered exists in the directed distance field texture set;
the acquisition module is used for acquiring the directed distance field texture from the directed distance field texture set when the directed distance field texture of the barrage character exists in the directed distance field texture set, and generating the directed distance field texture of the barrage character when the directed distance field texture of the barrage character does not exist in the directed distance field texture set;
the extraction module is used for determining the texture coordinates of each vertex in the barrage character and extracting vertex texture information of each vertex from the directed distance field texture based on the texture coordinates of each vertex;
And the rendering module is used for rendering the barrage character based on the vertex texture information of each vertex to obtain the rendered barrage character.
13. An electronic device, the electronic device comprising:
a memory for storing computer executable instructions;
a processor for implementing the method of rendering barrage characters of any one of claims 1 to 11 when executing computer executable instructions stored in the memory.
14. A computer readable storage medium storing computer executable instructions which, when executed by a processor, implement the method of rendering barrage characters of any one of claims 1 to 11.
15. A computer program product comprising computer executable instructions which, when executed by a processor, implement the method of rendering barrage characters of any one of claims 1 to 11.
CN202311638662.7A 2023-11-30 2023-11-30 Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product Pending CN117611703A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311638662.7A CN117611703A (en) 2023-11-30 2023-11-30 Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product


Publications (1)

Publication Number Publication Date
CN117611703A true CN117611703A (en) 2024-02-27

Family

ID=89944108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311638662.7A Pending CN117611703A (en) 2023-11-30 2023-11-30 Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product

Country Status (1)

Country Link
CN (1) CN117611703A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117911578A (en) * 2024-03-20 2024-04-19 广州中望龙腾软件股份有限公司 Text rendering method and device, computer equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110784773A (en) * 2019-11-26 2020-02-11 北京奇艺世纪科技有限公司 Bullet screen generation method and device, electronic equipment and storage medium
US20220021949A1 (en) * 2019-08-28 2022-01-20 Tencent Technology (Shenzhen) Company Limited Character string display processing method and apparatus, terminal, and storage medium
CN115272535A (en) * 2022-08-19 2022-11-01 杭州新迪数字工程系统有限公司 Method and system for drawing font consistency of DWG drawing under Web
CN116471438A (en) * 2023-03-15 2023-07-21 北京奇艺世纪科技有限公司 Special effect processing method, device, processing equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DING Jianfei; XU Kun; HU Guoqiao: "A GPU-based general rendering algorithm for autostereoscopic displays", Journal of System Simulation, no. 07, 8 July 2012 (2012-07-08), pages 70 - 75 *
ZHANG Yaping: "A fast double-buffered text rendering method based on FreeType", Changjiang Information & Communications, 15 January 2023 (2023-01-15), pages 90 - 92 *


Similar Documents

Publication Publication Date Title
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
EP3259753B1 (en) Systems and methods for reducing memory bandwidth using low quality tiles
EP3180773B1 (en) Bandwidth reduction using texture lookup by adaptive shading
CN107154063A (en) The shape method to set up and device in image shows region
CN109377554B (en) Large three-dimensional model drawing method, device, system and storage medium
CN113096233B (en) Image processing method and device, electronic equipment and readable storage medium
KR20160051154A (en) Rendering method and apparatus, and electronic apparatus
CN110750664B (en) Picture display method and device
CN110917617B (en) Method, device, equipment and storage medium for generating water ripple image
CN117611703A (en) Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product
CN112102437A (en) Canvas-based radar map generation method and device, storage medium and terminal
US20180232915A1 (en) Line stylization through graphics processor unit (gpu) textures
US20240257436A1 (en) Image rendering method and apparatus, electronic device, and storage medium
CN109448123B (en) Model control method and device, storage medium and electronic equipment
CN107038729B (en) Digital instrument panel drawing method based on OpenGL-ES
US9501812B2 (en) Map performance by dynamically reducing map detail
US20100053205A1 (en) Method, apparatus, and system for displaying graphics using html elements
CN112580213B (en) Method and device for generating display image of electric field lines and storage medium
CN114332323A (en) Particle effect rendering method, device, equipment and medium
CN113963083A (en) Programming building block drawing method, building block building method and device and electronic equipment
CN112734900A (en) Baking method, baking device, baking equipment and computer-readable storage medium of shadow map
CN115082356B (en) Method, device and equipment for correcting video stream image based on shader
CN113192173B (en) Image processing method and device of three-dimensional scene and electronic equipment
CN112465692A (en) Image processing method, device, equipment and storage medium
JP7352032B2 (en) Video generation method, apparatus, electronic device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination