CN108765520A - Method and apparatus for rendering text information, storage medium, and electronic device - Google Patents

Method and apparatus for rendering text information, storage medium, and electronic device

Info

Publication number
CN108765520A
Authority
CN
China
Prior art keywords
pixel
textures
target
parameter
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810482906.XA
Other languages
Chinese (zh)
Other versions
CN108765520B (en)
Inventor
傅强
宋立强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810482906.XA priority Critical patent/CN108765520B/en
Publication of CN108765520A publication Critical patent/CN108765520A/en
Application granted granted Critical
Publication of CN108765520B publication Critical patent/CN108765520B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method and apparatus for rendering text information, a storage medium, and an electronic device. The method includes: acquiring indication information, the indication information indicating that target text information with an outline is to be rendered in a target image; in response to the indication information, obtaining a second texture by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and rendering the outlined target text information in the target image using the second texture. The invention solves the technical problem in the related art that outlining text consumes a large amount of terminal computing resources.

Description

Method and apparatus for rendering text information, storage medium, and electronic device
Technical field
The present invention relates to the Internet field, and in particular to a method and apparatus for rendering text information, a storage medium, and an electronic device.
Background technology
A text outline is a contour line drawn around a character, lying just outside the character's border. Visually it makes the text stand out, and in scenes such as games, films, animation, and live streaming, outlined text is often used to emphasize the information carried by the text.
In the related art, text outlining can be achieved with the built-in components of UGUI (a UI system released with the Unity engine). Using these components, a copy identical to the original character is drawn at each of four offsets (upper-left, upper-right, lower-left, lower-right) relative to the original vertices, and the original character is finally superimposed on top, visually forming an outline effect, as shown in Fig. 1.
From the published UGUI source code, the component is implemented as follows: on the basis of the original vertices, four Shadow components are applied on the four offset directions (x, y), (x, -y), (-x, y), (-x, -y), with displacement x on the X axis and displacement y on the Y axis, producing the four lighter-colored "A" characters shown in Fig. 2. A Shadow component draws a copy of the original character at the specified offset in the specified color to simulate a drop shadow, i.e. the lighter "A" in Fig. 2. The Outline component builds on the Shadow component and is equivalent to four Shadows on different offset directions; Fig. 2 illustrates the principle of the Outline component when the outline width is large.
Implementing text outlining with the built-in UGUI components has the following drawbacks: 1) the outline effect is discontinuous and uneven, with breaks in the outline in the up, down, left, and right directions; 2) the vertex count increases substantially: a character without an outline has 6 vertices, while with this scheme it has 30 (that is, 5 copies of the original character are rendered), which puts considerable pressure on the CPU and GPU of a mobile device when there is a lot of text; 3) only small outline widths are supported, since a large outline width produces obvious breaks and cannot visually form a coherent outline.
No effective solution to the above problem has been proposed so far.
Summary of the invention
Embodiments of the present invention provide a method and apparatus for rendering text information, a storage medium, and an electronic device, so as to at least solve the technical problem in the related art that outlining text consumes a large amount of terminal computing resources.
According to one aspect of the embodiments of the present invention, a method for rendering text information is provided, including: acquiring indication information, the indication information indicating that target text information with an outline is to be rendered in a target image; in response to the indication information, obtaining a second texture by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and rendering the outlined target text information in the target image using the second texture.
According to another aspect of the embodiments of the present invention, an apparatus for rendering text information is further provided, including: an acquiring unit configured to acquire indication information, the indication information indicating that target text information with an outline is to be rendered in a target image; a sampling unit configured to obtain, in response to the indication information, a second texture by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and a rendering unit configured to render the outlined target text information in the target image using the second texture.
According to another aspect of the embodiments of the present invention, a storage medium is further provided. The storage medium includes a stored program which, when run, performs the above method.
According to another aspect of the embodiments of the present invention, an electronic device is further provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor performing the above method by means of the computer program.
In the embodiments of the present invention, indication information is acquired, the indication information indicating that target text information with an outline is to be rendered in a target image; a second texture is obtained by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and the outlined target text information is rendered in the target image using the second texture. In other words, in the embodiments of this application the second texture only needs to be rendered once, rather than five times as in the related art, so the character vertex count of this scheme is greatly reduced. This solves the technical problem that outlining text in the related art consumes a large amount of terminal computing resources, and thus achieves the technical effect of reducing the terminal computing resources consumed by outlining.
Description of the drawings
The accompanying drawings described herein are intended to provide a further understanding of the present invention and form a part of this application. The illustrative embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of the present invention. In the drawings:
Fig. 1 is a schematic diagram of an optional outline rendering effect for text information;
Fig. 2 is a schematic diagram of an optional outline rendering effect for text information;
Fig. 3 is a schematic diagram of the hardware environment of a text information rendering method according to an embodiment of the present invention;
Fig. 4 is a flowchart of an optional text information rendering method according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of an optional interface according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of an optional interface according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the pixels of an optional character according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 10 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 11 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 12 is a schematic diagram of the pixels of an optional character according to an embodiment of the present invention;
Fig. 13 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 14 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 15 is a schematic diagram of optional text information according to an embodiment of the present invention;
Fig. 16 is a schematic diagram of the texture map of an optional character according to an embodiment of the present invention;
Fig. 17 is a schematic diagram of an optional text information rendering apparatus according to an embodiment of the present invention; and
Fig. 18 is a structural block diagram of a terminal according to an embodiment of the present invention.
Detailed description of the embodiments
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and so on in the description, the claims, and the drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to the process, method, product, or device.
First, some of the nouns or terms that appear in the description of the embodiments of the present invention are explained as follows:
UGUI: a UI system built into the game engine Unity. It contains a series of basic UI controls and defines a specification for commonly used UI controls.
Fragment shader: an important programmable shader stage in the rendering pipeline; texture sampling (one of the stages of the rendering process) is completed in this stage.
According to one aspect of the embodiments of the present invention, a method embodiment of a method for rendering text information is provided.
Optionally, in this embodiment, the above text information rendering method can be applied to the hardware environment shown in Fig. 3, which consists of a server 301 and a terminal 303. As shown in Fig. 3, the server 301 is connected to the terminal 303 through a network. A database 305 may be provided on the server 301 or independently of the server 301 to provide a data storage service for the server 301. The network includes but is not limited to a wide area network, a metropolitan area network, or a local area network, and the terminal 303 is not limited to a PC, a mobile phone, a tablet computer, or the like.
The text information rendering method of this embodiment of the present invention may be executed by the terminal 303. Fig. 4 is a flowchart of an optional text information rendering method according to an embodiment of the present invention. As shown in Fig. 4, the method may include the following steps:
Step S402: the terminal acquires indication information, the indication information indicating that target text information with an outline is to be rendered in a target image.
The indication information may be triggered by a configuration at the bottom layer of the system or application during rendering, or may be triggered by a configuration made by the user in the system or application. The target text information is the text to be drawn, including but not limited to characters of various languages and numbers.
Step S404: in response to the indication information, the terminal obtains a second texture by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information, such as the outline lines around the word "China" in Fig. 1.
Step S406: the terminal renders the outlined target text information in the target image using the second texture.
The text information rendering method of this embodiment of the present invention may also be executed by the server 301; the difference from the above embodiment is that the execution subject of steps S402 to S406 changes from the terminal to the server. The method may also be executed jointly by the server 301 and the terminal 303. When executed by the terminal 303, the method may also be executed by a client installed on the terminal.
Through the above steps S402 to S406, indication information is acquired, the indication information indicating that target text information with an outline is to be rendered in a target image; a second texture is obtained by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and the outlined target text information is rendered in the target image using the second texture. In other words, in the embodiments of this application the second texture only needs to be rendered once, rather than five times as in the related art, so the character vertex count of this scheme is greatly reduced. This can solve the technical problem that outlining text in the related art consumes a large amount of terminal computing resources, thereby achieving the technical effect of reducing the terminal computing resources consumed by outlining.
This application describes a technical solution in which, in an engine such as Unity, a text component and a component such as a fragment shader sample the character texture multiple times during the texture sampling stage, thereby forming a text outline effect. With this solution, text outlining can be realized efficiently in scenes such as games, films, animation, and live streaming, reducing the pressure on the computing resources (such as the central processing unit CPU and the graphics processing unit GPU) of mobile devices, computers, and the like. With reference to the steps shown in Fig. 4, the embodiments of this application are further described in detail below by taking a game scene as an example; the embodiments for other scenes such as films and live streaming are similar to the description below and are not repeated.
As shown in Fig. 5, in games such as multiplayer RPG mobile games with free combat on the same screen, first-person shooters (FPS), and third-person shooters (TPS), the number of players on screen is huge. From a performance perspective, player avatars can be selectively omitted, but the text above a character's head, being the last identifying mark of the other players on the same screen, cannot be omitted. In addition, the background of some mobile games is built on ancient Chinese mythology, and the overall art style leans toward classical, heavy parchment-like patterns, so only large-scale outlined text stands out, as shown in Fig. 6. It can be seen that there are many game scenarios that outline text. When such a need exists, it can be configured in the game client (which may be installed on the above terminal); when the configuration requires text outlining, the technical solution of this application is triggered.
After configuration in the client, when text outlining is needed, the client triggers the indication information. In the technical solution provided in step S402, the terminal can receive the indication information triggered by the client's lower-level logic, which indicates, through the indication information, that the outlined target text information is to be rendered in the target image.
In the technical solution provided in step S404, in response to the indication information, the terminal obtains the second texture by sampling the first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information.
Optionally, before the second texture is obtained by sampling the first texture, a texture map can be created as follows: the stroke texture of the target text information is obtained from the font file of the target text information (such as a font file); a first texture including the stroke texture of the target text information and first position information is created, where the first position information (such as UV coordinates) indicates the first area in the target image to which the stroke texture of the target text information is mapped.
In the embodiments of this application, obtaining the second texture by sampling the first texture may include the following steps:
Step 1: the texel values of the pixels in the first texture are obtained by sampling the first texture.
Optionally, a texel value of each pixel in the first texture can be obtained by sampling the first texture. However, this can still cause problems. As shown in Fig. 15, when the characters are dense (the characters can face any direction, their sizes differ, and they are arranged so as to intersect), there may be regions where characters overlap, and, as shown in Fig. 16, dirty pixels can appear, i.e. pixels that do not belong to the character.
Optionally, filtering can then be performed as follows: if a pixel is mapped into the first area, the texel value of the pixel in the first texture is obtained; if the pixel is not mapped into the first area, its value is directly set to a fixed number, i.e. a second threshold, for example the pixel value of white, such as 255 for 8 bits.
Step 2: the texel values of the pixels in a third texture are determined according to the texel values of the pixels in the first texture to obtain the second texture, where the second texture includes the pixels representing the outline texture of the target text information.
Optionally, the third texture may be a copy of the first texture; in other words, the first texture can be sampled directly to obtain the second texture. The third texture may also be a blank texture, in which case the sampling result of the first texture can be saved directly into the second texture.
The indication information further indicates the target direction in which the target text information is to be outlined and a first threshold representing the outline range of the target text information (which can be understood as the range or width of rendered pixels; for example, if only the one pixel adjacent to the stroke texture is rendered, the first threshold is denoted as 1 pixel (one pixel diameter, i.e. twice the radius r); if pixels within two pixels of the stroke texture are rendered, the first threshold is denoted as 2 pixels). Determining the texel values of the pixels in the third texture according to the texel values of the pixels in the first texture may include the following steps:
Step 21: a first parameter and a second parameter are obtained, where the first parameter is the sum of the texel values of first pixels in the first texture, a first pixel being a pixel in the first texture that lies in the target direction of a second pixel at a distance of the first threshold from the second pixel, and the second parameter is the number of pixels at a distance of the first threshold from the second pixel.
Optionally, in the above embodiment, the target direction may include directions such as those indicated by the arrows in Fig. 7, e.g. the left, right, up, and down directions of the pixel numbered 44. Obtaining the first parameter may include the following cases:
1) When the target direction includes one direction, the first pixels lying in the target direction of the second pixel at a distance of the first threshold from the second pixel are determined, and the sum of the texel values of all the first pixels is used as the first parameter. The first threshold corresponds to the rendering range: if the rendering range is N pixels, the first threshold can be understood as N*2*r, where r is the pixel radius, i.e. the distance from the center of a pixel (such as the pixel numbered 44 in Fig. 7, the second pixel) to any of its corner vertices.
For example, if the rendering range is 1 pixel and the target direction is to the right, the first pixels are the pixels numbered 35, 45, and 55; if the target direction is up, the first pixels are the pixels numbered 33, 34, and 35; if the rendering range is 2 pixels and the target direction is down, the first pixels are the pixels numbered 63, 64, and 65; and so on.
2) When the target direction includes multiple directions, the first pixels lying in any one of the multiple directions of the second pixel at a distance of the first threshold from the second pixel are determined, and the sum of the texel values of all the first pixels is used as the first parameter.
For example, if the target direction includes the four directions up, down, left, and right and the rendering range is 1 pixel, then the pixels numbered 33, 34, 35, 45, 55, 54, 53, and 43 are the first pixels.
Step 22: the texel value of a third pixel in the third texture is determined according to a target ratio and the texel value of the second pixel, where the target ratio is the ratio between the first parameter and the second parameter, and the position of the third pixel in the third texture is the same as the position of the second pixel in the first texture.
Optionally, determining the texel value of the third pixel in the third texture according to the target ratio and the texel value of the second pixel includes: obtaining a first product of the texel value of the second pixel and a first weight, and obtaining a second product of the target ratio and a second weight, where the first weight is determined according to the transparency of the second pixel. Optionally, if the transparency is binary data represented by an n-bit binary number, the first weight can be the normalized transparency, i.e. α/2^8 for an 8-bit transparency α. The sum of the first weight and the second weight is one; in other words, the second weight is (1 - α/2^8). The sum of the first product and the second product is used as the texel value of the third pixel in the third texture.
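For illustration only, the weighted combination above can be written as the following minimal Cg/HLSL sketch. It assumes an 8-bit transparency value, and the function and parameter names are chosen for this example; they do not appear in the patent:

    // Minimal sketch of the weighted combination described above (names are illustrative).
    // alpha is the transparency of the second pixel, assumed to be an 8-bit value (0..255).
    float BlendOutlineTexel(float secondTexel, float alpha, float targetRatio)
    {
        float w1 = alpha / 256.0;                      // first weight: normalized transparency
        float w2 = 1.0 - w1;                           // second weight: the two weights sum to one
        return secondTexel * w1 + targetRatio * w2;    // first product + second product
    }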
In the above embodiment, obtaining the second texture by sampling the first texture includes determining second position information as follows (the second position information indicates the second area in the target image to which the stroke texture of the target text information in the second texture is mapped): a third parameter, a fourth parameter, a fifth parameter, and a sixth parameter representing the second area are determined, where the third parameter is the sum of the maximum value of the first area in a first direction and the first threshold, the fourth parameter is the sum of the maximum value of the first area in a second direction and the first threshold, the fifth parameter is the difference between the minimum value of the first area in the first direction and the first threshold, and the sixth parameter is the difference between the minimum value of the first area in the second direction and the first threshold. The first direction and the second direction are the two directions of the two-dimensional coordinate system in which the target image lies.
In the technical solution provided in step S406, the terminal renders the outlined target text information in the target image using the second texture.
Optionally, the target image includes a game frame animation, and rendering the outlined target text information in the target image using the second texture may include: rendering the outlined target text information in the game frame animation using the second texture. Optionally, during rendering, the outline texture can be drawn in the color indicated by the indication information.
With the technical solution of this application, the following problems can be solved. In the related art, the original character is moved in specified directions and then superimposed, so the outline effect is discontinuous and uneven; this application uses sampling, so the text outline is coherent and uniform, consistent with the art design, and the realized effect is good. The related art requires multiple renderings and must process more vertex data, whereas the vertex count of this application is the same as the vertex count of the un-outlined text, no additional vertices are added, and the hardware requirements on the terminal are reduced. The outline width of this application supports a large range (specifically, by adjusting the value of the first threshold), so it can provide both the small outlines of ordinary applications and the bold outlines required in other circumstances.
As an optional embodiment, the technical solution of this application is applied to a game below to further illustrate the technical solution of this application.
Taking the game engine Unity as an example, the basic steps of rendering a character in the Unity engine are as follows:
Step 1: according to the character information in the font file, 6 vertices are generated, where the 6 vertices form 2 triangles; what they form is denoted as the first texture. As shown in Fig. 8, 2 characters have 4 triangles and 12 vertices in total.
Step 2: according to the user's settings in the text component, mesh information is generated; the mesh information includes vertex coordinates, vertex colors, texture coordinates (indicating the mapping to the first area), and so on.
Step 3: the vertex information is processed in the geometry stage of the rendering pipeline (Geometry Stage), and the vertex coordinates are transformed into screen space.
Step 4: in the rasterization stage (Rasterizer Stage), the texture coordinates, vertex colors, and so on are interpolated across the vertices, and the characters to be rendered are then output pixel by pixel.
In the related art, the UGUI outline effect is implemented as follows: the built-in outline effect of UGUI is equivalent to repeating the drawing operation 4 additional times on top of drawing the original character, so the vertex count of an outlined character is 5 times that of an un-outlined character. Fig. 9 shows the built-in UGUI outline component applied; it can be seen that 1 character has 10 triangles and 30 vertices.
Different from the related art, the outline effect in step 1 of this application is implemented as follows, taking the rendering of one outlined character as an example:
Step 11: according to the character information in the font file, 6 vertices are generated, where the 6 vertices form 2 triangles.
This step can be realized in the Unity engine, and is similar for other types of game engines. To realize this function in the Unity engine, the operation steps are: create an empty GameObject; add a Text component to the GameObject; and modify the font attribute and content of the Text component. At this point the Unity engine generates the corresponding vertex information for the Text component.
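For illustration, a minimal C# sketch of this setup is given below. It assumes the standard UnityEngine.UI API and the classic built-in Arial font; the class and object names are examples only:

    using UnityEngine;
    using UnityEngine.UI;

    public class TextSetupExample : MonoBehaviour
    {
        void Start()
        {
            var go = new GameObject("OutlinedText");                      // empty GameObject
            go.transform.SetParent(transform, false);                      // assumed to sit under a Canvas
            var text = go.AddComponent<Text>();                            // Text component generates the vertices
            text.font = Resources.GetBuiltinResource<Font>("Arial.ttf");   // font attribute (built-in font assumed)
            text.text = "中国";                                             // content; Unity builds the character mesh from this
        }
    }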
Step 12: according to the user's settings in the text component, mesh information, i.e. the first texture, is generated; the mesh information includes vertex coordinates, vertex colors, texture coordinates, and so on.
This step can be realized in the Unity engine; the operation steps are similar to step 11. The Unity engine generates the mesh information of the character, i.e. the first texture, while generating the vertex information of the Text component.
Step 13: the mesh information is modified according to the user's settings in the outline component. The mesh essentially records the UV region of the original character (i.e. the first area); the character's UV region is expanded according to the outline size, and the expanded region is denoted as the second area.
UV coordinates are texture coordinates and indicate the texture range corresponding to the vertices; this range is usually the minimum range that contains the whole character. On the premise of not increasing the vertex count, this scheme expands the UV range to accommodate the outline result and to prevent the outline from being clipped.
In Fig. 10, the left figure shows the UV range of the character before the mesh is modified, and the right figure shows the UV range after the mesh is modified; the basis for modifying the mesh is the outline size, and it can be seen that the range in the right figure is larger than the range in the left figure.
To compare the UV ranges before and after the modification more intuitively, as shown in Fig. 11, the red box (the box labeled "A") is the original UV range before the modification, and the inner yellow box (the box labeled "B") is the modified UV range.
The UV before modification and the UV after modification are stored in the tangent variable of the UIVertex structure, and are passed to the Shader as a data source.
The above steps can be realized in the Unity engine using the interface for modifying the mesh information of the Text component. After the character's mesh is generated, Unity detects whether a component that inherits from BaseMeshEffect is also attached to the GameObject corresponding to the Text component; if so, the overridden ModifyMesh method of the BaseMeshEffect component is called.
The steps to be performed in the scheme of step 13 are:
1) Implement a component that inherits from BaseMeshEffect, with the class name TextModMesh, and add this component to the GameObject where the Text component is located;
2) Override the ModifyMesh method of TextModMesh to modify the mesh. The specific modification method is: use the VertexHelper.GetUIVertexStream() method of the Unity engine to obtain the mesh vertex information, where every 3 adjacent vertices form 1 triangle; take the maximum values Xmax, Ymax and the minimum values Xmin, Ymin of the triangle in the x direction (the first direction) and the y direction (the second direction) as the original UV region; read the outline size OutlineSize attribute (i.e. the first threshold) set by the user on TextModMesh; add OutlineSize to Xmax and Ymax to obtain newXmax (the third parameter) and newYmax (the fourth parameter), and subtract OutlineSize from Xmin and Ymin to obtain newXmin (the fifth parameter) and newYmin (the sixth parameter). The values obtained in this step are the new UV range with the outline region added (a C# sketch of this component is given after the formulas below).
The formulas for the above calculation steps are as follows:
Xmax = Math.Max(Vertex1.x, Vertex2.x, Vertex3.x);
Ymax = Math.Max(Vertex1.y, Vertex2.y, Vertex3.y);
Xmin = Math.Min(Vertex1.x, Vertex2.x, Vertex3.x);
Ymin = Math.Min(Vertex1.y, Vertex2.y, Vertex3.y);
The function Math.Max returns the maximum of the parameters in the parentheses, and Math.Min returns the minimum of the parameters in the parentheses. Vertex1 to Vertex3 denote the three vertices of the triangle; Vertex1.x denotes the x coordinate of triangle vertex 1, and the remaining symbols such as Vertex2.x have similar meanings.
(Xmin, Ymin) and (Xmax, Ymax) define the original UV region;
newXmax = Xmax + OutlineSize;
newYmax = Ymax + OutlineSize;
newXmin = Xmin - OutlineSize;
newYmin = Ymin - OutlineSize;
(newXmin, newYmin) and (newXmax, newYmax) define the UV region with the outline region added (i.e. the second area).
(Xmin, Ymin), (Xmax, Ymax), (newXmin, newYmin), and (newXmax, newYmax) are passed to the Shader through the VertexHelper.AddUIVertexTriangleStream() interface provided by Unity.
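The following is a hedged C# sketch of such a TextModMesh component, following the naming above. It is not the patent's literal code: which vertex channel carries the ranges (the tangent is used here) and the fact that the outline size is applied to the UV values without a pixel-to-UV unit conversion are simplifying assumptions made for illustration.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;

    public class TextModMesh : BaseMeshEffect
    {
        public float OutlineSize = 4f;   // outline size set by the user (the first threshold)

        public override void ModifyMesh(VertexHelper vh)
        {
            var stream = new List<UIVertex>();
            vh.GetUIVertexStream(stream);                 // every 3 consecutive vertices form 1 triangle

            for (int i = 0; i < stream.Count; i += 3)
            {
                // Original UV region of this triangle: (Xmin, Ymin) to (Xmax, Ymax).
                float xMax = Mathf.Max(stream[i].uv0.x, stream[i + 1].uv0.x, stream[i + 2].uv0.x);
                float yMax = Mathf.Max(stream[i].uv0.y, stream[i + 1].uv0.y, stream[i + 2].uv0.y);
                float xMin = Mathf.Min(stream[i].uv0.x, stream[i + 1].uv0.x, stream[i + 2].uv0.x);
                float yMin = Mathf.Min(stream[i].uv0.y, stream[i + 1].uv0.y, stream[i + 2].uv0.y);

                for (int j = 0; j < 3; j++)
                {
                    UIVertex v = stream[i + j];
                    // Expand the UV range by the outline size so the outline is not clipped
                    // (a real component would convert OutlineSize from pixels into UV units).
                    v.uv0.x = (v.uv0.x == xMin) ? xMin - OutlineSize : xMax + OutlineSize;
                    v.uv0.y = (v.uv0.y == yMin) ? yMin - OutlineSize : yMax + OutlineSize;
                    // Keep the original region so the shader can reject samples outside it.
                    v.tangent = new Vector4(xMin, yMin, xMax, yMax);
                    stream[i + j] = v;
                }
            }

            vh.Clear();
            vh.AddUIVertexTriangleStream(stream);
        }
    }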
Step 14: the vertex information is processed in the geometry stage of the rendering pipeline (Geometry Stage), and the vertex coordinates are transformed into screen space.
This step can be realized by modifying the TextShader in the Unity engine. In the pass of the first rendering channel of the Shader, each incoming vertex coordinate is multiplied by the current model-view-projection matrix to obtain the screen-space coordinates corresponding to the vertex.
The calculation formula is: float4 vertexPos = mul(UNITY_MATRIX_MVP, Input.vertex), where float4 is a data type, vertexPos is the screen-space coordinate corresponding to the vertex, mul denotes multiplication, UNITY_MATRIX_MVP is the current model-view-projection matrix, and Input.vertex is the vertex coordinate.
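As an illustration, the transform above corresponds to a vertex function along the following lines (a Unity Cg sketch; the struct layout is assumed and not taken from the patent):

    struct appdata { float4 vertex : POSITION; float2 texcoord : TEXCOORD0; fixed4 color : COLOR; };
    struct v2f     { float4 pos : SV_POSITION; float2 texcoord : TEXCOORD0; fixed4 color : COLOR; };

    v2f vert(appdata input)
    {
        v2f o;
        o.pos = mul(UNITY_MATRIX_MVP, input.vertex);   // the float4 vertexPos of the formula above
        o.texcoord = input.texcoord;                   // passed through for the fragment stage
        o.color = input.color;
        return o;
    }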
Step 15: in the rasterization stage (Rasterizer Stage), the texture coordinates, vertex colors, and so on are interpolated across the vertices. When outputting pixel by pixel, the original texture color values in a total of 8 directions around the current pixel (upper-left, upper-right, lower-left, lower-right, up, down, left, right) are sampled; the color value obtained after averaging them is fused with the texture color value of the current pixel to obtain the final output, i.e. the second texture.
During per-pixel processing and output, in the per-pixel processing stage of the rasterization stage (Rasterizer Stage), as indicated by the arrows in Fig. 12, the sum of the original texture color values (texel values) in the 8 directions (upper-left, upper-right, lower-left, lower-right, up, down, left, right) around each pixel is sampled. The sampling matrix for the 8 directions is as follows; the actual sampling coordinates are the values in the sampling matrix multiplied by the outline size, whose unit is pixels.
{-0.7, 0.7}   {0, 1}    {0.7, 0.7}
{-1, 0}                 {1, 0}
{-0.7, -0.7}  {0, -1}   {0.7, -0.7}
The values obtained in the previous step are averaged to obtain an intermediate result for the current pixel color value. This intermediate result is then fused with the original texture color value sampled at the current pixel to obtain the final output of the current pixel. The fusion takes the alpha value (i.e. the α value) of the original texture into account, because the original texture needs to be drawn over the outline. The fusion formula is given further below, where colorResult is the output result, colorOrigin is the original texture color value, and color is the intermediate result from the previous step.
This step can continue to be realized after modifying the TextShader in the Unity engine. In the pass of the second rendering channel of the Shader, a sampling matrix can be defined as follows:
float2 sampleOffsets[8] =
{{-0.7,-0.7}, {0,-1}, {0.7,-0.7}, {-1,0}, {1,0}, {-0.7,0.7}, {0,1}, {0.7,0.7}}
The sampling uses sampleOffsets[8], whose elements are of type float2. The above sampling matrix is sampled pixel by pixel; including the current position, each pixel is sampled 9 times in total.
A loop can be used to sample the sampling matrix, and the sampled values are added and summed:
ColorResult += (tex2D(_MainTex, curPos) + _TextureSampleAdd) * _OutlineColor;
In the above formula, ColorResult is the summation result; the tex2D method is the Unity engine's method for sampling a texture map; _MainTex is the texture map corresponding to the current font; curPos is the coordinate value obtained by looping over the sampling matrix sampleOffsets; _TextureSampleAdd is a default parameter used by the Unity engine when sampling; and _OutlineColor is the user-set outline color value passed into the Shader.
After sampling the sampling matrix pixel by pixel, the summation ColorResult is averaged: ColorResult = ColorResult / 8. After this processing, the character's outline result is obtained.
Finally, the character's outline result is blended with the original texture. Because the original texture needs to be drawn over the outline result, the following formula is used for the fusion:
ColorResult.rgb = ColorOrigin.rgb * ColorOrigin.a + ColorResult.rgb * (1 - ColorOrigin.a),
where ColorResult is the outline result, ColorOrigin is the original texture, rgb is the RGB value of the color, and a (i.e. α) is the alpha channel value of the color. It should be noted that the red channel value R, the green channel value G, and the blue channel value B are each calculated according to the above formula.
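Putting the pieces of step 15 together, the per-pixel work described above can be sketched as the following Unity Cg fragment function. The property names _OutlineSize and _MainTex_TexelSize, and the scaling of the offsets into UV units, are assumptions added for illustration; _MainTex, _TextureSampleAdd, and _OutlineColor follow the description above.

    static const float2 sampleOffsets[8] = {
        float2(-0.7, -0.7), float2(0, -1), float2(0.7, -0.7), float2(-1, 0),
        float2( 1,  0),     float2(0,  1), float2(-0.7, 0.7), float2(0.7, 0.7)
    };

    sampler2D _MainTex;
    float4 _MainTex_TexelSize;     // xy = size of one texel in UV units (assumed helper)
    fixed4 _TextureSampleAdd;
    fixed4 _OutlineColor;
    float  _OutlineSize;           // outline size in pixels (assumed shader property)

    fixed4 frag(v2f i) : SV_Target
    {
        fixed4 colorOrigin = tex2D(_MainTex, i.texcoord) + _TextureSampleAdd;   // current pixel
        fixed4 colorResult = fixed4(0, 0, 0, 0);
        for (int k = 0; k < 8; k++)                                             // 8 offset samples (9 samples in total)
        {
            float2 curPos = i.texcoord + sampleOffsets[k] * _OutlineSize * _MainTex_TexelSize.xy;
            colorResult += (tex2D(_MainTex, curPos) + _TextureSampleAdd) * _OutlineColor;
        }
        colorResult /= 8;                                                        // average of the 8 samples
        colorResult.rgb = colorOrigin.rgb * colorOrigin.a
                        + colorResult.rgb * (1 - colorOrigin.a);                 // original character covers the outline
        return colorResult;
    }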
In the embodiments of this application, the step-by-step output may include: the intermediate result obtained by averaging the samples in the 8 directions, an optional resulting texture map of which is shown in Fig. 13; and the output after the intermediate result is fused with the original texture color value, as shown in Fig. 14.
Optionally, the dirty pixels at the edges of the first texture can be filtered out as follows:
Sampling with the above technical solution yields the character outline effect, but in actual operation the problem of dirty pixels may also arise. Because UGUI gathers all character textures together on one large texture, characters are packed closely next to each other, and the arrangement is random. Fig. 15 shows a temporary texture (the first texture) at run time of a certain game; it may contain a large number of characters, the text layout is close, and there is no absolute separation between characters.
In the above embodiments of this application, the outline region of a character is expanded, so sampling may cross the boundary into a neighboring character. The subsequent outline calculation does not exclude textures that do not belong to the current character's region, so the dirty pixel problem appears, as shown in Fig. 16.
Such dirty pixels are unpredictable and random, so when computing the outline result, the dirty pixels need to be rejected in the per-pixel processing stage. The rejection method is: while expanding the UV range, the original UV range is passed in along with the expanded UV range, so it can be judged whether the current pixel coordinate lies within the original UV range; pixels outside this range are directly set to 0 and do not participate in subsequent calculations, thereby shielding the generation of dirty pixels.
Optionally, the function IsInRect() can be used here to judge whether the coordinate lies within the original UV range. The IsInRect() function uses the step method to compute whether the coordinate is in range: float inside = step(xy, fPoint) * step(fPoint, zw), i.e. it judges whether the x coordinate and the y coordinate of the current pixel (denoted by fPoint) lie within the range; step returns 1 if so, and 0 otherwise. With this method, the final outlined text output is obtained after the dirty pixels are removed; this adds a dirty-pixel removal step (a sketch is given below).
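A hedged Cg sketch of this rejection follows. The helper and parameter names are illustrative; the original UV rectangle is assumed to reach the fragment stage through the data passed from TextModMesh (for example via the tangent channel), packed as (Xmin, Ymin, Xmax, Ymax).

    // step(a, b) returns 1 when b >= a, so the product below is 1 only when fPoint
    // lies inside the rectangle whose corners are rect.xy (min) and rect.zw (max).
    float IsInRect(float2 fPoint, float4 rect)
    {
        float2 inside = step(rect.xy, fPoint) * step(fPoint, rect.zw);
        return inside.x * inside.y;    // 1 inside the original UV range, 0 outside
    }

    // Usage inside the sampling loop: zero out any sample that falls outside the
    // character's own original UV range so neighbouring characters in the shared
    // font texture cannot produce dirty pixels in the outline.
    //   fixed4 s = tex2D(_MainTex, curPos) * IsInRect(curPos, originalUVRect);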
Optionally, in the per-pixel processing stage, redundant judgments on transparent pixels can be reduced. For example, when the original texture of a pixel is all 0 and the four diagonal directions (upper-left, lower-left, upper-right, lower-right) are also 0, it can be concluded that the pixel is far from any valid texture, and the number of samples can be reduced.
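A minimal sketch of this early-out is given below (Cg; the helper name and the per-sample offset parameter are assumptions):

    // Returns true when the centre texel and its four diagonal neighbours are all fully
    // transparent, i.e. the pixel is far from any valid character texture, so the
    // 8-direction outline sampling for this pixel can be skipped.
    bool IsFarFromGlyph(sampler2D tex, float2 uv, float2 offset)
    {
        float a = tex2D(tex, uv).a
                + tex2D(tex, uv + float2(-1, -1) * offset).a
                + tex2D(tex, uv + float2( 1, -1) * offset).a
                + tex2D(tex, uv + float2(-1,  1) * offset).a
                + tex2D(tex, uv + float2( 1,  1) * offset).a;
        return a == 0;
    }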
The technical solution of this application solves the problem that outlining with fonts is inefficient in games developed with the Unity engine, and reduces the difference between the realized effect and the art design. It improves the art quality of the game while optimizing the game's performance, greatly improving the player's game experience.
It should be noted that, for the sake of brevity, each of the foregoing method embodiments is described as a series of action combinations. However, those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in this specification are preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on this understanding, the essence of the technical solution of the present invention, or the part contributing to the prior art, can be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present invention.
According to another aspect of the embodiments of the present invention, a text information rendering apparatus for implementing the above text information rendering method is further provided. Fig. 17 is a schematic diagram of an optional text information rendering apparatus according to an embodiment of the present invention. As shown in Fig. 17, the apparatus may include:
an acquiring unit 1701 configured to acquire indication information, where the indication information indicates that target text information with an outline is to be rendered in a target image;
a sampling unit 1703 configured to obtain, in response to the indication information, a second texture by sampling a first texture, where the first texture represents the stroke texture of the target text information and the second texture represents the outline texture of the target text information; and
a rendering unit 1705 configured to render the outlined target text information in the target image using the second texture.
It should be noted that the acquiring unit 1701 in this embodiment can be used to execute step S402 of the embodiments of this application, the sampling unit 1703 can be used to execute step S404, and the rendering unit 1705 can be used to execute step S406.
It should be noted here that the examples and application scenarios realized by the above modules are the same as those of the corresponding steps, but are not limited to the contents disclosed in the above embodiments. It should be noted that the above modules, as a part of the apparatus, can run in the hardware environment shown in Fig. 3, and can be implemented by software or by hardware.
Through the above modules, indication information is acquired, the indication information indicating that target text information with an outline is to be rendered in a target image; a second texture is obtained by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and the outlined target text information is rendered in the target image using the second texture. In other words, in the embodiments of this application the second texture only needs to be rendered once, rather than five times as in the related art, so the character vertex count of this scheme is greatly reduced. This can solve the technical problem that outlining text in the related art consumes a large amount of terminal computing resources, thereby achieving the technical effect of reducing the terminal computing resources consumed by outlining.
This application describes a technical solution in which, in an engine such as Unity, a text component and a component such as a fragment shader sample the character texture multiple times during the texture sampling stage, thereby forming a text outline effect. With this solution, text outlining can be realized efficiently in scenes such as games, films, animation, and live streaming, reducing the pressure on the computing resources (such as the central processing unit CPU and the graphics processing unit GPU) of mobile devices, computers, and the like.
In the above embodiments, the sampling unit may include: a sampling module configured to obtain the texel values of the pixels in the first texture by sampling the first texture; and a determining module configured to determine the texel values of the pixels in a third texture according to the texel values of the pixels in the first texture to obtain the second texture, where the second texture includes the pixels representing the outline texture of the target text information.
Optionally, the indication information further indicates the target direction in which the target text information is to be outlined and a first threshold representing the outline range of the target text information, where the determining module may include: an acquiring submodule configured to obtain a first parameter and a second parameter, where the first parameter is the sum of the texel values of first pixels in the first texture, a first pixel being a pixel in the first texture that lies in the target direction of a second pixel at a distance of the first threshold from the second pixel, and the second parameter is the number of pixels at a distance of the first threshold from the second pixel; and a determining submodule configured to determine the texel value of a third pixel in the third texture according to a target ratio and the texel value of the second pixel, where the target ratio is the ratio between the first parameter and the second parameter, and the position of the third pixel in the third texture is the same as the position of the second pixel in the first texture.
The above determining submodule can further be used to: obtain a first product of the texel value of the second pixel and a first weight, and obtain a second product of the target ratio and a second weight, where the first weight is determined according to the transparency of the second pixel and the sum of the first weight and the second weight is one; and use the sum of the first product and the second product as the texel value of the third pixel in the third texture.
The above acquiring submodule can further be used to: when the target direction includes one direction, determine the first pixels lying in the target direction of the second pixel at a distance of the first threshold from the second pixel, and use the sum of the texel values of all the first pixels as the first parameter; and when the target direction includes multiple directions, determine the first pixels lying in any one of the multiple directions of the second pixel at a distance of the first threshold from the second pixel, and use the sum of the texel values of all the first pixels as the first parameter.
Optionally, the apparatus of this application may further include: a texture acquiring unit configured to obtain the stroke texture of the target text information from the font file of the target text information before the second texture is obtained by sampling the first texture; and a creating unit configured to create a first texture including the stroke texture of the target text information and first position information, where the first position information indicates the first area in the target image to which the stroke texture of the target text information is mapped.
The above sampling module can further be used to: when a pixel is mapped into the first area, obtain the texel value of the pixel in the first texture; and when the pixel is not mapped into the first area, use a second threshold as the texel value of the pixel in the first texture.
The above sampling unit obtains the second texture by sampling the first texture including determining second position information as follows, where the second position information indicates the second area in the target image to which the stroke texture of the target text information in the second texture is mapped: determining a third parameter, a fourth parameter, a fifth parameter, and a sixth parameter representing the second area, where the third parameter is the sum of the maximum value of the first area in a first direction and the first threshold, the fourth parameter is the sum of the maximum value of the first area in a second direction and the first threshold, the fifth parameter is the difference between the minimum value of the first area in the first direction and the first threshold, and the sixth parameter is the difference between the minimum value of the first area in the second direction and the first threshold; the first direction and the second direction are the two directions of the two-dimensional coordinate system in which the target image lies.
Optionally, the target image includes a game frame animation, and the rendering unit can further be used to render the outlined target text information in the game frame animation using the second texture.
With the technical solution of this application, the following problems can be solved. In the related art, the original character is moved in specified directions and then superimposed, so the outline effect is discontinuous and uneven; this application uses sampling, so the text outline is coherent and uniform, consistent with the art design, and the realized effect is good. The related art requires multiple renderings and must process more vertex data, whereas the vertex count of this application is the same as the vertex count of the un-outlined text, no additional vertices are added, and the hardware requirements on the terminal are reduced. The outline width of this application supports a large range (specifically, by adjusting the value of the first threshold), so it can provide both the small outlines of ordinary applications and the bold outlines required in other circumstances.
It should be noted here that the examples and application scenarios realized by the above modules are the same as those of the corresponding steps, but are not limited to the contents disclosed in the above embodiments. It should be noted that the above modules, as a part of the apparatus, can run in the hardware environment shown in Fig. 3, and can be implemented by software or by hardware, where the hardware environment includes a network environment.
Other side according to the ... of the embodiment of the present invention additionally provides a kind of rendering side for implementing above-mentioned text message The server or terminal of method.
Figure 18 is a kind of structure diagram of terminal according to the ... of the embodiment of the present invention, and as shown in figure 18, which may include: One or more (one is only shown in Figure 18) processors 1801, memory 1803 and transmitting device 1805, such as Figure 18 institutes Show, which can also include input-output equipment 1807.
Wherein, memory 1803 can be used for storing software program and module, such as the text message in the embodiment of the present invention Rendering intent and the corresponding program instruction/module of device, processor 1801 by operation be stored in it is soft in memory 1803 Part program and module realize the rendering side of above-mentioned text message to perform various functions application and data processing Method.Memory 1803 may include high speed random access memory, can also include nonvolatile memory, such as one or more magnetism Storage device, flash memory or other non-volatile solid state memories.In some instances, memory 1803 can further comprise The memory remotely located relative to processor 1801, these remote memories can pass through network connection to terminal.Above-mentioned net The example of network includes but not limited to internet, intranet, LAN, mobile radio communication and combinations thereof.
The transmission apparatus 1805 is configured to receive or send data via a network, and may also be used for data transmission between the processor and the memory. Specific examples of the network may include a wired network and a wireless network. In one example, the transmission apparatus 1805 includes a network interface controller (NIC), which can be connected to other network devices and a router through a network cable so as to communicate with the Internet or a local area network. In another example, the transmission apparatus 1805 is a radio frequency (RF) module, which is configured to communicate with the Internet in a wireless manner.
Specifically, the memory 1803 is configured to store an application program.
The processor 1801 may call, through the transmission apparatus 1805, the application program stored in the memory 1803, so as to execute the following steps:
acquiring indication information, where the indication information is used to indicate that outlined target text information is to be rendered in a target image;
in response to the indication information, obtaining a second texture by sampling a first texture, where the first texture is used to represent a stroke texture of the target text information, and the second texture is used to represent an outline texture of the target text information;
rendering, by using the second texture, the outlined target text information in the target image.
The processor 1801 is further configured to execute the following steps:
acquiring a first parameter and a second parameter, where the first parameter is the sum of the texel values of first pixels in the first texture, a first pixel being a pixel that is located in a target direction of a second pixel in the first texture and is at a distance of a first threshold from the second pixel, and the second parameter is the number of pixels at the distance of the first threshold from the second pixel;
determining the texel value of a third pixel in a third texture according to a target ratio and the texel value of the second pixel, where the target ratio is the ratio between the first parameter and the second parameter, and the position of the third pixel in the third texture is the same as the position of the second pixel in the first texture.
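A minimal Python sketch of these two steps is given below. It assumes the first texture is a single-channel alpha map stored as a list of rows, that the target direction set consists of the four axis-aligned directions, that the first threshold is an integer number of texels, and that samples falling outside the texture contribute zero; the name outline_ratio is illustrative only.

```python
def outline_ratio(texture, x, y, first_threshold,
                  directions=((1, 0), (-1, 0), (0, 1), (0, -1))):
    """Return the target ratio for the second pixel at (x, y).

    first parameter  : sum of the texel values of the first pixels, i.e. the
                       pixels lying first_threshold texels away from (x, y)
                       along each target direction;
    second parameter : the number of such pixels;
    target ratio     : first parameter / second parameter.
    """
    height, width = len(texture), len(texture[0])
    first_parameter = 0.0
    second_parameter = 0
    for dx, dy in directions:
        sx, sy = x + dx * first_threshold, y + dy * first_threshold
        second_parameter += 1
        if 0 <= sx < width and 0 <= sy < height:
            first_parameter += texture[sy][sx]
        # Out-of-range samples are treated as zero (assumption).
    return first_parameter / second_parameter

# Tiny 5 x 5 alpha map with a single "stroke" texel in the centre.
tex = [[0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 1, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]]
print(outline_ratio(tex, 1, 2, 1))  # 0.25: one of the four samples hits the stroke
```

Because the ratio is non-zero only for pixels within the first threshold of a stroke, writing it into the third texture marks exactly the band of pixels that forms the outline.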
With the embodiments of the present invention, indication information is acquired, the indication information indicating that outlined target text information is to be rendered in the target image; a second texture is obtained by sampling a first texture, the first texture representing the stroke texture of the target text information and the second texture representing the outline texture of the target text information; and the outlined target text information is rendered in the target image by using the second texture. In other words, in the embodiments of the present application the second texture needs to be rendered only once instead of five times as in the related art, and the number of text vertices is greatly reduced with this scheme. This solves the technical problem in the related art that outlining text consumes considerable computing resources of the terminal, thereby achieving the technical effect of reducing the terminal computing resources consumed by outlining.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
A person of ordinary skill in the art can understand that the structure shown in Figure 18 is only illustrative. The terminal may be a terminal device such as a smartphone (for example, an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID) or a PAD. Figure 18 does not limit the structure of the above electronic device. For example, the terminal may further include more or fewer components (such as a network interface or a display device) than those shown in Figure 18, or have a configuration different from that shown in Figure 18.
A person of ordinary skill in the art can understand that all or some of the steps in the methods of the above embodiments may be completed by a program instructing hardware related to the terminal device, and the program may be stored in a computer-readable storage medium. The storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
The embodiments of the present invention further provide a storage medium. Optionally, in this embodiment, the storage medium may be used to store program code for executing the rendering method of text information.
Optionally, in this embodiment, the storage medium may be located on at least one of a plurality of network devices in the network shown in the above embodiments.
Optionally, in this embodiment, the storage medium is configured to store program code for executing the following steps:
S12: acquiring indication information, where the indication information is used to indicate that outlined target text information is to be rendered in a target image;
S14: in response to the indication information, obtaining a second texture by sampling a first texture, where the first texture is used to represent a stroke texture of the target text information, and the second texture is used to represent an outline texture of the target text information;
S16: rendering, by using the second texture, the outlined target text information in the target image.
Optionally, the storage medium is further configured to store program code for executing the following steps:
S22: acquiring a first parameter and a second parameter, where the first parameter is the sum of the texel values of first pixels in the first texture, a first pixel being a pixel that is located in a target direction of a second pixel in the first texture and is at a distance of a first threshold from the second pixel, and the second parameter is the number of pixels at the distance of the first threshold from the second pixel;
S24: determining the texel value of a third pixel in a third texture according to a target ratio and the texel value of the second pixel, where the target ratio is the ratio between the first parameter and the second parameter, and the position of the third pixel in the third texture is the same as the position of the second pixel in the first texture.
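The following sketch shows one way step S24 could combine the target ratio with the texel value of the second pixel, using the weighting described in this disclosure in which the first weight is determined according to the transparency of the second pixel and the sum of the two weights is one. Taking the first weight to be the alpha value itself is an assumption made for illustration, and the function name third_texel is likewise hypothetical.

```python
def third_texel(second_texel: float, target_ratio: float, second_alpha: float) -> float:
    """Blend the original texel value with the outline ratio."""
    first_weight = second_alpha         # assumption: weight taken directly from the alpha of the second pixel
    second_weight = 1.0 - first_weight  # the two weights sum to one
    first_product = second_texel * first_weight
    second_product = target_ratio * second_weight
    return first_product + second_product

# Inside a stroke (alpha 1) the original texel dominates; on empty background
# (alpha 0) the outline ratio alone determines the third texel.
print(third_texel(1.0, 0.25, 1.0))  # 1.0  -> the stroke is preserved
print(third_texel(0.0, 0.25, 0.0))  # 0.25 -> contributes to the outline band
```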
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Optionally, in this embodiment, the storage medium may include, but is not limited to, various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk or an optical disc.
The above serial numbers of the embodiments of the present invention are merely for description and do not represent the superiority or inferiority of the embodiments.
If the integrated units in the above embodiments are implemented in the form of software functional units and are sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on this understanding, the technical solution of the present invention essentially, or the part contributing to the related art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for enabling one or more computer devices (which may be a personal computer, a server, a network device or the like) to execute all or some of the steps of the method described in each embodiment of the present invention.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis. For a part that is not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other ways. The apparatus embodiments described above are merely exemplary. For example, the division into units is only a division by logical function, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The above descriptions are merely preferred embodiments of the present invention. It should be noted that a person of ordinary skill in the art may make several improvements and modifications without departing from the principle of the present invention, and these improvements and modifications shall also fall within the protection scope of the present invention.

Claims (15)

1. A rendering method of text information, characterized by comprising:
acquiring indication information, wherein the indication information is used to indicate that outlined target text information is to be rendered in a target image;
in response to the indication information, obtaining a second texture by sampling a first texture, wherein the first texture is used to represent a stroke texture of the target text information, and the second texture is used to represent an outline texture of the target text information;
rendering, by using the second texture, the outlined target text information in the target image.
2. The method according to claim 1, characterized in that obtaining the second texture by sampling the first texture comprises:
obtaining texel values of pixels in the first texture by sampling the first texture;
determining texel values of pixels in a third texture according to the texel values of the pixels in the first texture, so as to obtain the second texture, wherein the second texture comprises pixels for representing the outline texture of the target text information.
3. The method according to claim 2, characterized in that the indication information is further used to indicate a target direction in which the target text information is to be outlined and a first threshold used to represent an outline range of the target text information, wherein determining the texel values of the pixels in the third texture according to the texel values of the pixels in the first texture comprises:
acquiring a first parameter and a second parameter, wherein the first parameter is the sum of the texel values of first pixels in the first texture, a first pixel being a pixel that is located in the target direction of a second pixel in the first texture and is at a distance of the first threshold from the second pixel, and the second parameter is the number of pixels at the distance of the first threshold from the second pixel;
determining the texel value of a third pixel in the third texture according to a target ratio and the texel value of the second pixel, wherein the target ratio is the ratio between the first parameter and the second parameter, and the position of the third pixel in the third texture is the same as the position of the second pixel in the first texture.
4. The method according to claim 3, characterized in that determining the texel value of the third pixel in the third texture according to the target ratio and the texel value of the second pixel comprises:
obtaining a first product of the texel value of the second pixel and a first weight, and obtaining a second product of the target ratio and a second weight, wherein the first weight is determined according to the transparency of the second pixel, and the sum of the first weight and the second weight is one;
using the sum of the first product and the second product as the texel value of the third pixel in the third texture.
5. The method according to claim 3, characterized in that acquiring the first parameter comprises:
in a case where the target direction comprises one direction, determining the first pixels that are located in the target direction of the second pixel and are at the distance of the first threshold from the second pixel, and using the sum of the texel values of all the first pixels as the first parameter;
in a case where the target direction comprises multiple directions, determining the first pixels that are located in any one of the multiple directions of the second pixel and are at the distance of the first threshold from the second pixel, and using the sum of the texel values of all the first pixels as the first parameter.
6. The method according to any one of claims 2 to 5, characterized in that, before obtaining the second texture by sampling the first texture, the method further comprises:
acquiring the stroke texture of the target text information from a font file of the target text information;
creating the first texture comprising the stroke texture of the target text information and first position information, wherein the first position information is used to indicate a first area, in the target image, onto which the stroke texture of the target text information is mapped.
7. The method according to claim 6, characterized in that obtaining the texel values of the pixels in the first texture by sampling the first texture comprises:
in a case where a pixel in the first texture is a pixel mapped into the first area, obtaining the texel value of the pixel in the first texture;
in a case where a pixel in the first texture is not a pixel mapped into the first area, using a second threshold as the texel value of the pixel in the first texture.
8. The method according to claim 6, characterized in that obtaining the second texture by sampling the first texture comprises determining second position information in the following manner, the second position information being used to indicate a second area, in the target image, onto which the stroke texture of the target text information is mapped in the second texture:
determining a third parameter, a fourth parameter, a fifth parameter and a sixth parameter for indicating the second area, wherein the third parameter is the sum of the maximum value of the first area in a first direction and the first threshold, the fourth parameter is the sum of the maximum value of the first area in a second direction and the first threshold, the fifth parameter is the difference between the minimum value of the first area in the first direction and the first threshold, the sixth parameter is the difference between the minimum value of the first area in the second direction and the first threshold, and the first direction and the second direction are the two directions of the two-dimensional coordinate system in which the target image is located.
9. The method according to any one of claims 1 to 5, characterized in that the target image comprises a game frame animation, wherein rendering, by using the second texture, the outlined target text information in the target image comprises:
rendering, by using the second texture, the outlined target text information in the game frame animation.
10. A rendering apparatus of text information, characterized by comprising:
an acquiring unit, configured to acquire indication information, wherein the indication information is used to indicate that outlined target text information is to be rendered in a target image;
a sampling unit, configured to obtain, in response to the indication information, a second texture by sampling a first texture, wherein the first texture is used to represent a stroke texture of the target text information, and the second texture is used to represent an outline texture of the target text information;
a rendering unit, configured to render, by using the second texture, the outlined target text information in the target image.
11. The apparatus according to claim 10, characterized in that the sampling unit comprises:
a sampling module, configured to obtain texel values of pixels in the first texture by sampling the first texture;
a determining module, configured to determine texel values of pixels in a third texture according to the texel values of the pixels in the first texture, so as to obtain the second texture, wherein the second texture comprises pixels for representing the outline texture of the target text information.
12. The apparatus according to claim 11, characterized in that the indication information is further used to indicate a target direction in which the target text information is to be outlined and a first threshold used to represent an outline range of the target text information, wherein the determining module comprises:
an acquiring submodule, configured to acquire a first parameter and a second parameter, wherein the first parameter is the sum of the texel values of first pixels in the first texture, a first pixel being a pixel that is located in the target direction of a second pixel in the first texture and is at a distance of the first threshold from the second pixel, and the second parameter is the number of pixels at the distance of the first threshold from the second pixel;
a determining submodule, configured to determine the texel value of a third pixel in the third texture according to a target ratio and the texel value of the second pixel, wherein the target ratio is the ratio between the first parameter and the second parameter, and the position of the third pixel in the third texture is the same as the position of the second pixel in the first texture.
13. The apparatus according to claim 12, characterized in that the determining submodule is further configured to:
obtain a first product of the texel value of the second pixel and a first weight, and obtain a second product of the target ratio and a second weight, wherein the first weight is determined according to the transparency of the second pixel, and the sum of the first weight and the second weight is one;
use the sum of the first product and the second product as the texel value of the third pixel in the third texture.
14. A storage medium, characterized in that the storage medium comprises a stored program, wherein, when the program runs, the method according to any one of claims 1 to 9 is executed.
15. An electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor executes, by means of the computer program, the method according to any one of claims 1 to 9.
CN201810482906.XA 2018-05-18 2018-05-18 Text information rendering method and device, storage medium and electronic device Active CN108765520B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810482906.XA CN108765520B (en) 2018-05-18 2018-05-18 Text information rendering method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN108765520A true CN108765520A (en) 2018-11-06
CN108765520B CN108765520B (en) 2020-07-28

Family

ID=64008414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810482906.XA Active CN108765520B (en) 2018-05-18 2018-05-18 Text information rendering method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN108765520B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102081731A (en) * 2009-11-26 2011-06-01 中国移动通信集团广东有限公司 Method and device for extracting text from image
CN102122502A (en) * 2011-03-15 2011-07-13 深圳芯邦科技股份有限公司 Method and related device for displaying three-dimensional (3D) font
US20150117768A1 (en) * 2013-10-25 2015-04-30 Canon Kabushiki Kaisha Text rendering method with improved clarity of corners
CN105447010A (en) * 2014-08-12 2016-03-30 博雅网络游戏开发(深圳)有限公司 Text rendering method and system
CN105160646A (en) * 2015-10-21 2015-12-16 广州视睿电子科技有限公司 Character edge tracing implementation method and device
CN106384373A (en) * 2016-08-31 2017-02-08 广州博冠信息科技有限公司 Character display method and device
CN107424137A (en) * 2017-08-01 2017-12-01 深信服科技股份有限公司 A kind of Text enhancement method and device, computer installation, readable storage medium storing program for executing
CN108176048A (en) * 2017-11-30 2018-06-19 腾讯科技(深圳)有限公司 The treating method and apparatus of image, storage medium, electronic device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Diao Yuehua: "Design and Implementation of a Web Video Subtitle Extraction and Recognition System", China Master's Theses Full-text Database (Information Science and Technology) *
Shen Mengjie et al.: "Airborne Character Generation Based on Background Fusion", Application of Electronic Technique *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109948581A (en) * 2019-03-28 2019-06-28 腾讯科技(深圳)有限公司 Picture and text rendering method, device, equipment and readable storage medium storing program for executing
CN109948581B (en) * 2019-03-28 2023-05-05 腾讯科技(深圳)有限公司 Image-text rendering method, device, equipment and readable storage medium
CN111105474A (en) * 2019-12-19 2020-05-05 广州酷狗计算机科技有限公司 Font drawing method and device, computer equipment and computer readable storage medium
CN111951367A (en) * 2020-08-04 2020-11-17 广州虎牙科技有限公司 Character rendering method, character processing method and device
CN111951367B (en) * 2020-08-04 2024-04-19 广州虎牙科技有限公司 Character rendering method, character processing method and device
CN112426711A (en) * 2020-10-23 2021-03-02 杭州电魂网络科技股份有限公司 Bloom effect processing method, system, electronic device and storage medium
CN112426711B (en) * 2020-10-23 2024-03-26 杭州电魂网络科技股份有限公司 Method, system, electronic device and storage medium for processing Bloom effect
WO2022147931A1 (en) * 2021-01-06 2022-07-14 网易(杭州)网络有限公司 Method and apparatus for displaying skill special effect in game
CN113240779A (en) * 2021-05-21 2021-08-10 北京达佳互联信息技术有限公司 Method and device for generating special character effect, electronic equipment and storage medium
CN113240779B (en) * 2021-05-21 2024-02-23 北京达佳互联信息技术有限公司 Method and device for generating text special effects, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN108765520B (en) 2020-07-28

Similar Documents

Publication Publication Date Title
CN108765520A (en) Rendering intent and device, storage medium, the electronic device of text message
CN112215934B (en) Game model rendering method and device, storage medium and electronic device
US11024077B2 (en) Global illumination calculation method and apparatus
CN103677828B (en) Coverage drawing method, drawing engine and terminal equipment
CN105528207B (en) A kind of virtual reality system and the method and apparatus for wherein showing Android application image
CN109448089A (en) A kind of rendering method and device
CN108711180A (en) Makeups/generation and makeups of special efficacy of changing face program file packet/special efficacy of changing face generation method and device
CN113223131B (en) Model rendering method and device, storage medium and computing equipment
Argudo et al. Single-picture reconstruction and rendering of trees for plausible vegetation synthesis
CN108837510B (en) Information display method and device, storage medium and electronic device
CN112274934B (en) Model rendering method, device, equipment and storage medium
CN106683193A (en) Three-dimensional model design method and design device
CN111710020A (en) Animation rendering method and device and storage medium
CN110930484B (en) Animation configuration method and device, storage medium and electronic device
CN108235138A (en) Method, processing unit and its computer system of preview video
WO2019042028A1 (en) All-around spherical light field rendering method
CN114612641A (en) Material migration method and device and data processing method
CN116310037A (en) Model appearance updating method and device and computing equipment
CN111179390A (en) Method and device for efficiently previewing CG assets
CN113192173B (en) Image processing method and device of three-dimensional scene and electronic equipment
CN111462343B (en) Data processing method and device, electronic equipment and storage medium
JP7301453B2 (en) IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, COMPUTER PROGRAM, AND ELECTRONIC DEVICE
CN114742970A (en) Processing method of virtual three-dimensional model, nonvolatile storage medium and electronic device
CN111738967A (en) Model generation method and apparatus, storage medium, and electronic apparatus
CN111681317A (en) Data processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant