CN112949253A - Material display method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN112949253A
Authority
CN
China
Prior art keywords
texture map
multilevel
luminous
displayed
texture
Prior art date
Legal status
Granted
Application number
CN202110258067.5A
Other languages
Chinese (zh)
Other versions
CN112949253B
Inventor
易律
Current Assignee
Beijing Shell Wood Software Co ltd
Original Assignee
Beijing Shell Wood Software Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Shell Wood Software Co ltd
Priority to CN202110258067.5A
Publication of CN112949253A
Application granted
Publication of CN112949253B
Active legal status
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/109 Font handling; Temporal or kinetic typography
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture
    • G06T11/60 Editing figures and text; Combining figures or text
    • G06T5/70 Denoising; Smoothing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a material display method, a material display device, an electronic device, and a computer-readable storage medium. The method obtains the material to be displayed in a preset area of the display area, creates a first-level texture map of the same size as the preset area containing that material, blurs the first-level texture map to obtain a glow texture map, and superimposes the glow texture map onto the material to be displayed. In this process, the glow effect is generated with the whole material to be lit as the unit rather than per character, so the glow effects of two adjacent characters cannot overlap, and highlight artifacts are avoided.

Description

Material display method and device, electronic equipment and computer readable storage medium
Technical Field
The application belongs to the field of image display, and particularly relates to a material display method and device, electronic equipment and a computer-readable storage medium.
Background
Material that can be displayed includes text, patterns, and the like.
In some application scenarios there is a need for dynamic material, i.e., the content of the material displayed in a given area changes in real time; in others there is a need for glowing material, i.e., the material displayed in a specific area presents a glow effect; and in still others, dynamic material is also required to glow.
In the prior art, when dynamic material is used on its own, its content changes frequently. To save storage resources and material-authoring time, a map is therefore usually cached per character; the cached character maps are then arranged and combined according to the display requirements of the dynamic effect to form the material to be displayed, realizing dynamic display.
In the prior art, when a glow effect alone is required, a background worker usually prepares in advance a glow map corresponding to the material map of the material to be lit, then superimposes the glow map on the material map to realize the glow effect. During superposition, the glow maps of two adjacent characters may overlap, and the overlap appears as a highlight artifact; for this reason, the glow map is conventionally authored with the whole material to be lit as its object, i.e., not per character. On this premise, when glowing dynamic material is required: (1) to keep the savings in storage resources and authoring time, pre-made per-character glow maps can only be superimposed on the character maps character by character, so that the dynamically displayed characters glow; however, this produces highlight artifacts that degrade the display. (2) To avoid highlight artifacts, whole-material maps and their corresponding glow maps must be authored in advance, i.e., with the whole material as the unit; the dynamic effect is then realized by switching whole materials, and the dynamic glow by superimposing each glow map on its whole material. However, this consumes a large amount of storage resources and authoring time, so the cost is high.
Disclosure of Invention
In view of the above, an object of the present application is to provide a material display method and device, an electronic device, and a computer-readable storage medium that avoid highlight artifacts while realizing glowing dynamic material, thereby improving the display, and that avoid consuming a large amount of storage resources and material-authoring time, thereby reducing cost.
The embodiment of the application is realized as follows:
In a first aspect, an embodiment of the present application provides a material display method, including: obtaining material to be displayed in a preset area of a display area, where the material to be displayed in the preset area is the material to be lit; creating a first-level texture map that has the same size as the preset area and contains the material to be lit; blurring the first-level texture map to obtain a glow texture map with a glow effect; and superimposing the glow texture map onto the material to be displayed in the preset area.
In this process, when dynamic material must present a glow effect, the realization of the dynamic effect is separated from the realization of the glow effect: after the whole material to be displayed dynamically is determined, a glow texture map is generated for it in real time and superimposed back onto it. In other words, the glow effect is generated with the whole dynamic material to be lit as the unit rather than per character, so the glow effects of two adjacent characters cannot overlap, and highlight artifacts are avoided. Moreover, because the two processes are separated, generating the glow with the whole material as the unit does not affect the dynamic effect, which can still be handled per character; this avoids consuming a large amount of storage resources and material-authoring time, reducing cost.
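As an illustration only, and not the patent's actual implementation, the four steps of the first aspect can be sketched end-to-end with NumPy on a single-channel image, using a box-filter downsample chain and a tent-kernel upsample for the blurring step. All function and variable names here are invented for this sketch:

```python
import numpy as np

def box_down(t):
    # halve each dimension by averaging 2x2 pixel blocks (box filter)
    h, w = t.shape
    return t.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def tent_up(t):
    # double each dimension, then smooth with a 3x3 tent kernel
    up = t.repeat(2, axis=0).repeat(2, axis=1)
    p = np.pad(up, 1, mode='edge')
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    out = np.zeros_like(up)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + up.shape[0], j:j + up.shape[1]]
    return out

def display_with_glow(display, material, top, left, levels=4):
    level0 = material.astype(float)      # texture map the size of the preset area
    mips = [level0]                      # blur by building half-size levels ...
    for _ in range(levels):
        mips.append(box_down(mips[-1]))
    glow = mips[-1]
    for mip in reversed(mips[:-1]):      # ... then adding each level back
        glow = mip + tent_up(glow)
    h, w = material.shape
    region = display[top:top + h, left:left + w]
    # superimpose the glow texture map onto the preset area
    display[top:top + h, left:left + w] = np.clip(region + glow, 0.0, 1.0)
    return display
```

A single bright pixel in the material spreads into a soft halo over the whole preset area, while pixels outside the preset area are untouched.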
With reference to the first aspect, in one possible implementation, blurring the first-level texture map includes: creating N+1 texture map levels, where the (N+1)-th level is obtained by filtering the N-th level and is half its size, N takes the values 0, 1, 2, 3 in turn, and the 0th level is the first-level texture map; and, in the reverse order of creation, superimposing each level back onto the level created before it, starting from the (N+1)-th level; the texture map finally obtained is the glow texture map.
With reference to the first aspect, in one possible implementation, filtering the N-th texture map level includes: filtering the N-th level with a box filter.
With reference to the first aspect, in one possible implementation, superimposing each level back onto the previously created level includes: superimposing each level back onto its previously created level in turn through a tent filter.
With reference to the first aspect, in one possible implementation, the display area is a game interface, and blurring the first-level texture map includes: creating a game camera whose field of view is the same size as the preset area, and blurring the first-level texture map through the game camera.
With reference to the first aspect, in one possible implementation, before obtaining the material to be displayed in the preset area of the display area, the method further includes: determining that the material to be displayed has changed since the previous moment.
With reference to the first aspect, in one possible implementation, the material to be displayed is text.
With reference to the first aspect, in one possible implementation, the position of the preset area is adjustable.
In a second aspect, an embodiment of the present application provides a material display apparatus, including: the device comprises an acquisition module, a creation module, a processing module and a superposition module.
The acquisition module is configured to obtain material to be displayed in a preset area of a display area, where the material to be displayed in the preset area is the material to be lit;
the creation module is configured to create a first-level texture map that has the same size as the preset area and contains the material to be lit;
the processing module is configured to blur the first-level texture map to obtain a glow texture map with a glow effect;
and the superposition module is configured to superimpose the glow texture map onto the material to be displayed in the preset area.
With reference to the second aspect, in one possible implementation, the processing module is configured to create N+1 texture map levels, where the (N+1)-th level is obtained by filtering the N-th level and is half its size, N takes the values 0, 1, 2, 3 in turn, and the 0th level is the first-level texture map; and, in the reverse order of creation, to superimpose each level back onto the level created before it, starting from the (N+1)-th level; the texture map finally obtained is the glow texture map.
With reference to the second aspect, in one possible implementation, the processing module is configured to filter the N-th texture map level through a box filter.
With reference to the second aspect, in one possible implementation, the processing module is configured to superimpose each level back onto its previously created level in turn through a tent filter.
With reference to the second aspect, in one possible implementation, the display area is a game interface, and the processing module is configured to create a game camera whose field of view is the same size as the preset area, and to blur the first-level texture map through the game camera.
With reference to the second aspect, in one possible implementation, the apparatus further includes a determination module configured to determine that the material to be displayed has changed since the previous moment.
With reference to the second aspect, in one possible implementation, the material to be displayed is text.
With reference to the second aspect, in one possible implementation, the position of the preset area is adjustable.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a memory and a processor, the memory and the processor connected; the memory is used for storing programs; the processor calls a program stored in the memory to perform the method of the first aspect embodiment and/or any possible implementation manner of the first aspect embodiment.
In a fourth aspect, the present application further provides a non-transitory computer-readable storage medium (hereinafter, referred to as a computer-readable storage medium), on which a computer program is stored, where the computer program is executed by a computer to perform the method in the foregoing first aspect and/or any possible implementation manner of the first aspect.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
To illustrate the technical solutions of the embodiments of the present application or the prior art more clearly, the drawings needed in the embodiments are briefly described below. The following drawings show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort. The foregoing and other objects, features and advantages of the application will be apparent from the accompanying drawings. Like reference numerals refer to like parts throughout the drawings. The drawings are not drawn to scale; emphasis is instead placed on illustrating the subject matter of the present application.
Fig. 1 shows a flowchart of a material display method according to an embodiment of the present application.
Fig. 2 is a schematic diagram illustrating a display area of a material display method according to an embodiment of the present application.
Fig. 3 is a schematic diagram illustrating a one-level texture map according to an embodiment of the present application.
Fig. 4 is a block diagram showing a structure of a material display apparatus according to an embodiment of the present application.
Fig. 5 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Reference numerals: 100-electronic device; 110-processor; 120-memory; 130-display screen; 400-material display apparatus; 410-acquisition module; 420-creation module; 430-processing module; 440-superposition module.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that like reference numerals and letters refer to like items in the figures below; once an item is defined in one figure, it need not be defined and explained again in subsequent figures. Relational terms such as "first" and "second" are used herein solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between them. The terms "comprises", "comprising", and their variants are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus comprising a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus comprising it.
In addition, the defects of the prior-art material display method (highlight artifacts that degrade the display) were identified by the applicant only after practice and careful study; the discovery of these defects and the solutions proposed for them in the following embodiments should therefore be regarded as the applicant's contribution to the present application.
To solve the above problems, embodiments of the present application provide a material display method and device, an electronic device, and a computer-readable storage medium that avoid highlight artifacts while realizing glowing dynamic material, improving the display.
The technology can be implemented with corresponding software, hardware, or a combination of the two. Embodiments of the present application are described in detail below.
Referring to fig. 1, an embodiment of the present application provides a material display method applied to an electronic device. The steps involved will be described below with reference to fig. 1.
Step S110: obtain material to be displayed in a preset area of the display area, where the material to be displayed in the preset area is the material to be lit.
In the embodiment of the present application, the range of the display area S1 is determined by the staff member in advance according to the actual application scenario.
In some application scenarios, such as displaying text via an LED display screen, the display area S1 may be the entire interface presented by the display screen.
In some application scenarios, the display area S1 may also be a local interface presented by a display screen of the electronic device. For example, in the game execution interface, the display area S1 is a dialog area for communication between players.
In the embodiment of the present application, a preset region S2 is further included in the display region S1.
It should be noted that when the display area S1 needs to display material (text or other patterns), any of that material whose display position falls inside the preset area S2 is material to be lit, i.e., material that must present the glow effect. As shown in fig. 2, the characters currently inside S2 need to present the glow effect, while the characters outside S2 do not.
For a game running interface, the preset area S2 may be a text description area describing the game content, or a fixed area corresponding to the latest text in the dialog box when players communicate with each other.
Of course, the material to be displayed in the display area S1 may still be arranged and combined from cached per-character maps according to the current display requirement, as in the prior-art dynamic display scheme; this yields the material to be displayed in S1, from which the material located in the preset area S2 is later obtained, while still avoiding the large consumption of storage resources and material-authoring time when producing the dynamic effect, thereby reducing cost.
In addition, the material to be lit displayed in the preset area S2 may change over time. In some embodiments, each time the material to be lit in S2 is determined to have changed since the previous moment, the step of obtaining the material to be lit in S2 is triggered, so that dynamic text continuously presents the glow effect.
Whether the material to be lit in the preset area S2 has changed since the previous moment is determined by pre-stored character-change detection logic; as this is mature prior art, it is not described further here.
Step S120: create a first-level texture map that has the same size as the preset area and contains the material to be lit.
After the material to be lit is obtained, to avoid highlight artifacts when the glow effect is subsequently generated, the embodiment of the application generates the glow effect with the whole material to be lit as the unit rather than per character, so the glow effects of two adjacent characters cannot overlap, and highlight artifacts are avoided.
Optionally, in this embodiment of the application, after the material to be lit is obtained, a first-level texture map may be created for it.
The first-level texture map is the same size as the preset area S2 and contains the material to be lit that currently needs to present the glow effect.
Fig. 3 shows the first-level texture map currently created for the preset area S2 of fig. 2 and the material to be lit displayed in it.
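Since per-character maps are already cached for the dynamic display (see the Background), the first-level texture map can be assembled by compositing those cached character maps into a buffer the size of the preset area S2. The sketch below assumes single-channel glyph bitmaps; the function and parameter names are illustrative, not from the patent:

```python
import numpy as np

def make_level0(glyph_bitmaps, positions, area_size):
    # composite cached per-character maps into one texture the size of S2
    tex = np.zeros(area_size)
    for bmp, (y, x) in zip(glyph_bitmaps, positions):
        h, w = bmp.shape
        # max-blend so overlapping glyph edges do not brighten
        tex[y:y + h, x:x + w] = np.maximum(tex[y:y + h, x:x + w], bmp)
    return tex
```

The resulting buffer plays the role of the first-level texture map that the following blurring step consumes.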
Step S130: blur the first-level texture map to obtain a glow texture map with a glow effect.
To obtain a glow texture map with a glow effect, the first-level texture map is blurred after it is obtained; the blurring is what gives the texture map its glow effect.
The blurring procedure is described below.
In some embodiments, N+1 texture map levels may be created based on the first-level texture map.
The (N+1)-th level is obtained by filtering the N-th level and is half its size; N takes the values 0, 1, 2, 3 in turn, and the 0th level is the first-level texture map.
After the N+1 levels are obtained, each level needs to be superimposed back toward the first-level texture map.
Specifically, in the reverse order of creation, each level is superimposed back onto the level created before it, starting from the (N+1)-th level; the texture map finally obtained is the glow texture map.
Taking N as 0, 1, 2, 3 in turn and a 16 × 16 first-level texture map as an example, 4 texture map levels need to be created in this embodiment.
When N is 0, the 0th level (i.e., the first-level texture map) is filtered to obtain the first level, whose size is 8 × 8.
When N is 1, the first level is filtered to obtain the second level, whose size is 4 × 4.
When N is 2, the second level is filtered to obtain the third level, whose size is 2 × 2.
When N is 3, the third level is filtered to obtain the fourth level, whose size is 1 × 1.
It is worth pointing out that when the N-th level is filtered to obtain the (N+1)-th level, a black background texture the same size as the (N+1)-th level is created first; a sliding window is then moved over the N-th level, and the filter quickly sums the pixel values within each window position, compressing the content of the N-th level into the half-size black background texture and yielding the (N+1)-th level.
The filter may be a standard box filter or another conventional filter.
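The downsampling step above can be sketched as follows. This sketch averages each 2x2 block rather than plainly summing it, so pixel values stay in range; that is one common form of a box filter, not necessarily the patent's exact arithmetic:

```python
import numpy as np

def box_down(tex):
    # compress the Nth level into a half-size texture by averaging
    # each 2x2 pixel block (a 2x2 box filter applied with stride 2)
    h, w = tex.shape
    return tex.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# the worked example above: 16x16 -> 8x8 -> 4x4 -> 2x2 -> 1x1
mips = [np.arange(256, dtype=float).reshape(16, 16)]
for _ in range(4):
    mips.append(box_down(mips[-1]))
```

Because each level averages the one before it, the final 1 × 1 level holds the global mean of the first-level texture map.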
After the four levels are obtained, the fourth level is superimposed back onto the level created before it (the third level); the superimposed third level is superimposed back onto the second level; the superimposed second level is superimposed back onto the first level; and the superimposed first level is superimposed back onto the first-level texture map. The superimposed first-level texture map is the glow texture map.
Since the resulting glow texture map is the same size as the first-level texture map, it is also the same size as the preset area S2.
It should be noted that superimposing the (N+1)-th level back onto the N-th level involves enlarging the (N+1)-th level. This enlargement can be accomplished by a filter, such as a tent filter or another conventional filter.
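The enlargement-and-add-back step can likewise be sketched as a nearest-neighbour doubling followed by a 3x3 tent kernel, which is one stand-in for a tent-filtered upsample; the exact filter taps are an assumption here, not taken from the patent:

```python
import numpy as np

def tent_up(tex):
    # enlarge the (N+1)th level to the Nth level's size, then smooth
    # with a 3x3 tent kernel (weights 1-2-1 along each axis)
    up = tex.repeat(2, axis=0).repeat(2, axis=1)
    p = np.pad(up, 1, mode='edge')
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    out = np.zeros_like(up)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + up.shape[0], j:j + up.shape[1]]
    return out

# superimposing the enlarged (N+1)th level back onto the Nth level
level_n = np.zeros((4, 4))
level_n_plus_1 = np.full((2, 2), 0.5)
combined = level_n + tent_up(level_n_plus_1)
```

The kernel's weights sum to 1, so a constant level stays constant after enlargement; only edges and details are softened.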
In addition, it should be noted that the embodiment of the present application is applicable to material display in a game interface.
Since textures in the game interface can only be operated on through a game camera, in this embodiment the game camera is created first, and the first-level texture map is then blurred through it.
Of course, the field of view of the game camera needs to be the same size as the preset area S2.
In the above embodiment, the position of the preset area S2 can be adjusted by adjusting the field of view of the game camera.
Step S140: superimpose the glow texture map onto the material to be displayed in the preset area.
After the glow texture map is obtained, it can be laid directly over the preset area S2. Because the glow texture map is the same size as S2 and contains the same material to be lit, the user observes the material in S2 presenting the glow effect.
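The superposition of step S140 can be sketched as a clamped additive blend of the glow texture map over the preset area. The additive blend mode is an assumption for illustration; the patent states only that the glow map covers the preset area:

```python
import numpy as np

def composite_glow(display, glow, top, left):
    # additively blend the glow map over the preset area S2,
    # clamping to the displayable range [0, 1]
    h, w = glow.shape
    region = display[top:top + h, left:left + w]
    display[top:top + h, left:left + w] = np.clip(region + glow, 0.0, 1.0)
    return display
```

Because the glow map and S2 have identical sizes, no resampling is needed at this stage; the blend is a direct per-pixel operation.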
In the material display method above, the material to be lit in the preset area, which needs to present the glow effect, is first obtained; a first-level texture map is formed with the whole material to be lit as its object; the first-level texture map is blurred to obtain a glow texture map of the same size and content that has the glow effect; and the glow texture map is then superimposed onto the material to be displayed in the preset area, so that the material presents the glow effect. In this process, the dynamic effect is separated from the realization of the glow effect: after the whole material to be displayed dynamically is determined, the glow texture map is generated for it in real time and superimposed back onto it, i.e., the glow effect is generated with the whole material to be lit as the unit rather than per character, so the glow effects of two adjacent characters cannot overlap, and highlight artifacts are avoided. Moreover, because the two processes are separated, generating the glow with the whole material as the unit does not affect the dynamic effect, which can still be handled per character; this avoids consuming a large amount of storage resources and material-authoring time, reducing cost.
As shown in fig. 4, an embodiment of the present application further provides a material display apparatus 400, and the material display apparatus 400 may include: an acquisition module 410, a creation module 420, a processing module 430, and a superposition module 440.
The acquisition module 410 is configured to acquire a material to be displayed within a preset area of a display area, where the material to be displayed in the preset area is a material to emit light;
the creation module 420 is configured to create a primary texture map that is the same size as the preset area and contains the material to emit light;
the processing module 430 is configured to perform blur processing on the primary texture map to obtain a light-emitting texture map with a light-emitting effect;
and the superposition module 440 is configured to superimpose the light-emitting texture map onto the material to be displayed in the preset area.
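The pipeline formed by the four modules above can be sketched in a few lines. The following is a minimal illustration in Python with NumPy; the synthetic "text" bar, the box-blur radius, and the clipped additive blend are assumptions of this sketch rather than details fixed by the application.

```python
import numpy as np

def box_blur(img, radius=2):
    """Separable box blur: average over a (2*radius+1) window per axis."""
    kernel = np.ones(2 * radius + 1, np.float32) / (2 * radius + 1)
    tmp = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, tmp)

def make_glow(primary, radius=2, strength=1.0):
    """Blur the primary texture map to obtain a light-emitting (glow) texture map."""
    return box_blur(primary.astype(np.float32), radius) * strength

def superimpose(material, glow):
    """Additively superimpose the glow texture back onto the displayed material."""
    return np.clip(material.astype(np.float32) + glow, 0.0, 1.0)

# A synthetic stand-in for rendered text: a bright bar in an otherwise dark area.
primary = np.zeros((16, 32), np.float32)
primary[6:10, 4:28] = 1.0

glow = make_glow(primary, radius=2)
lit = superimpose(primary, glow)
```

The glow texture is non-zero outside the original bright pixels, which is what makes the superimposed result appear to emit light beyond its outline.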
In a possible implementation manner, the processing module 430 is configured to create N+1 multilevel texture maps, where the (n+1)-th multilevel texture map is obtained by filtering the n-th multilevel texture map and its size is half the size of the n-th multilevel texture map, n = 0, 1, 2, …, N, and when n = 0 the n-th multilevel texture map is the primary texture map; and, in the reverse of the creation order of the N+1 multilevel texture maps, starting from the (N+1)-th multilevel texture map, to sequentially superimpose each multilevel texture map back onto the multilevel texture map created before it, the texture map finally obtained being the light-emitting texture map.
In a possible implementation, the processing module 430 is configured to filter the n-th multilevel texture map by using a box filter.
In a possible implementation, the processing module 430 is configured to sequentially superimpose each multilevel texture map back onto the previously created multilevel texture map through a tent filter.
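The N+1-level scheme described in the preceding paragraphs (box-filter downsampling to half size per level, then tent-filter superimposition in reverse order) can be sketched as follows. The NumPy implementation and the default of three downsampling levels (four maps, matching n = 0, 1, 2, 3) are illustrative assumptions of this sketch.

```python
import numpy as np

def box_downsample(img):
    """Halve each dimension by averaging 2x2 blocks (a 2x2 box filter)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def tent_upsample(img, shape):
    """Bilinear (tent-filter) upsample of `img` to `shape`."""
    h, w = shape
    ys = (np.arange(h) + 0.5) * img.shape[0] / h - 0.5
    xs = (np.arange(w) + 0.5) * img.shape[1] / w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, img.shape[0] - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, img.shape[1] - 1)
    y1 = np.clip(y0 + 1, 0, img.shape[0] - 1)
    x1 = np.clip(x0 + 1, 0, img.shape[1] - 1)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    top = img[y0][:, x0] * (1 - fx) + img[y0][:, x1] * fx
    bot = img[y1][:, x0] * (1 - fx) + img[y1][:, x1] * fx
    return top * (1 - fy) + bot * fy

def glow_texture(primary, levels=3):
    """Create levels+1 multilevel maps, then fold them back up in reverse order."""
    mips = [primary.astype(np.float32)]
    for _ in range(levels):                      # each level is half the size
        mips.append(box_downsample(mips[-1]))
    acc = mips[-1]                               # start from the smallest map
    for mip in reversed(mips[:-1]):              # superimpose back, level by level
        acc = mip + tent_upsample(acc, mip.shape)
    return acc                                   # the light-emitting texture map

# Demonstration: a single bright pixel spreads into a soft glow.
primary = np.zeros((16, 16), np.float32)
primary[8, 8] = 1.0
glow = glow_texture(primary)
```

Because each level halves the resolution, the total cost of the chain is a small constant factor over one pass at full resolution, which is why this structure is popular for real-time glow/bloom.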
In a possible implementation manner, the display area is a game interface, and the processing module 430 is configured to create a game camera whose field of view is the same size as the preset area, and to perform the blur processing on the primary texture map through the game camera.
In a possible implementation manner, the apparatus further includes a determining module configured to determine that the material to be displayed has changed compared with the previous moment.
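The determining module's change check can be sketched as a small cache that rebuilds the glow texture only when the material differs from the previous moment; the SHA-256 key and the callable interface here are assumptions of this sketch, not part of the application.

```python
import hashlib

class GlowCache:
    """Regenerate the glow texture only when the displayed material has changed."""

    def __init__(self, make_glow):
        self._make_glow = make_glow   # callable: material bytes -> glow texture
        self._last_key = None
        self._last_glow = None

    def get(self, material_bytes):
        key = hashlib.sha256(material_bytes).hexdigest()
        if key != self._last_key:     # material changed compared with the previous moment
            self._last_key = key
            self._last_glow = self._make_glow(material_bytes)
        return self._last_glow        # unchanged material reuses the cached glow

# Demonstration with a stand-in "glow maker" that just records its calls.
calls = []
cache = GlowCache(lambda data: calls.append(data) or len(data))
first = cache.get(b"SCORE 100")
again = cache.get(b"SCORE 100")    # unchanged: no regeneration
changed = cache.get(b"SCORE 200")  # changed: glow regenerated
```

Skipping regeneration for unchanged frames is what keeps the whole-material glow cheap even though it is rebuilt in real time whenever the text does change.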
In one possible implementation, the material to be displayed is a text.
In a possible embodiment, the position of the preset area is adjustable.
The material display apparatus 400 provided in the embodiment of the present application has the same implementation principle and technical effects as the foregoing method embodiments; for the sake of brevity, for anything not mentioned in this apparatus embodiment, reference may be made to the corresponding contents of the foregoing method embodiments.
In addition, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a computer, the computer program performs the steps included in the above material display method.
In addition, referring to fig. 5, an electronic device 100 for implementing the material display method and apparatus of the embodiment of the present application is further provided in the embodiment of the present application.
Optionally, the electronic device 100 may be, but is not limited to, a personal computer (PC), a smart phone, a tablet computer, a mobile Internet device (MID), a personal digital assistant (PDA), a server, and the like.
The electronic device 100 may include: a processor 110, a memory 120, and a display 130.
It should be noted that the components and structure of electronic device 100 shown in FIG. 5 are exemplary only, and not limiting, and electronic device 100 may have other components and structures as desired.
The processor 110, memory 120, display 130, and other components that may be present in the electronic device 100 are electrically connected to each other, directly or indirectly, to enable the transfer or interaction of data. For example, the processor 110, the memory 120, the display 130, and other components that may be present may be electrically connected to each other via one or more communication buses or signal lines.
The memory 120 is used for storing a program, such as a program corresponding to the material display method described above or to the material display apparatus described above. Optionally, when the material display apparatus is stored in the memory 120, it includes at least one software function module that can be stored in the memory 120 in the form of software or firmware.
Optionally, the software function module included in the material display apparatus may also be built into an operating system (OS) of the electronic device 100.
The processor 110 is used to execute the executable modules stored in the memory 120, such as the software function modules or computer programs included in the material display apparatus. When the processor 110 receives an execution instruction, it may execute the computer program, for example, to perform: acquiring a material to be displayed within a preset area of a display area, where the material to be displayed in the preset area is a material to emit light; creating a primary texture map that is the same size as the preset area and contains the material to emit light; performing blur processing on the primary texture map to obtain a light-emitting texture map with a light-emitting effect; and superimposing the light-emitting texture map onto the material to be displayed in the preset area.
Of course, the method disclosed in any of the embodiments of the present application can be applied to the processor 110, or implemented by the processor 110.
In summary, in the material display method and apparatus, the electronic device, and the computer-readable storage medium according to the embodiments of the present application, the material to emit light, i.e. the material that needs to present a light-emitting effect in a preset area, is first obtained; a primary texture map is formed with the entire material as one object; blur processing is applied to the primary texture map to obtain a light-emitting texture map that has the same size and content as the primary texture map but carries a glow; and the light-emitting texture map is then superimposed onto the material to be displayed in the preset area, so that the material in the preset area presents the light-emitting effect. In this process, the dynamic effect is separated from the realization of the light-emitting effect: once the whole material to be dynamically displayed is determined, the light-emitting texture map is generated in real time for the whole material and superimposed back onto it. The light-emitting effect is thus produced not character by character but for the material to be illuminated as a whole, which prevents the glows of two adjacent characters from overlapping and avoids over-bright artifacts. In addition, because the realization of the dynamic effect is separated from that of the light-emitting effect, generating the glow for the dynamic material as a whole does not interfere with the dynamic effect, which can still be processed character by character; this avoids consuming large amounts of storage resources and time on pre-made materials and thereby reduces cost.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a notebook computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.

Claims (10)

1. A method for displaying material, the method comprising:
acquiring a material to be displayed within a preset area of a display area, wherein the material to be displayed in the preset area is a material to emit light;
creating a primary texture map that is the same size as the preset area and contains the material to emit light;
performing blur processing on the primary texture map to obtain a light-emitting texture map with a light-emitting effect; and
superimposing the light-emitting texture map onto the material to be displayed in the preset area.
2. The method of claim 1, wherein the performing blur processing on the primary texture map comprises:
creating N+1 multilevel texture maps, wherein the (n+1)-th multilevel texture map is obtained by filtering the n-th multilevel texture map, the size of the (n+1)-th multilevel texture map is half the size of the n-th multilevel texture map, n = 0, 1, 2, …, N, and when n = 0 the n-th multilevel texture map is the primary texture map; and
in the reverse of the creation order of the N+1 multilevel texture maps, starting from the (N+1)-th multilevel texture map, sequentially superimposing each multilevel texture map back onto the previously created multilevel texture map, the texture map finally obtained being the light-emitting texture map.
3. The method of claim 2, wherein the filtering the n-th multilevel texture map comprises:
filtering the n-th multilevel texture map by using a box filter.
4. The method of claim 2, wherein the sequentially superimposing each multilevel texture map back onto the previously created multilevel texture map comprises:
sequentially superimposing each multilevel texture map back onto the previously created multilevel texture map through a tent filter.
5. The method of claim 1, wherein the display area is a game interface, and the performing blur processing on the primary texture map comprises:
creating a game camera, wherein a field of view of the game camera is the same size as the preset area; and
performing the blur processing on the primary texture map through the game camera.
6. The method according to any one of claims 1-5, wherein before the acquiring a material to be displayed within a preset area of a display area, the method further comprises:
determining that the material to be displayed has changed compared with the previous moment.
7. The method according to any one of claims 1-5, wherein the material to be displayed is text.
8. A material display apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire a material to be displayed within a preset area of a display area, wherein the material to be displayed in the preset area is a material to emit light;
a creation module, configured to create a primary texture map that is the same size as the preset area and contains the material to emit light;
a processing module, configured to perform blur processing on the primary texture map to obtain a light-emitting texture map with a light-emitting effect; and
a superposition module, configured to superimpose the light-emitting texture map onto the material to be displayed in the preset area.
9. An electronic device, comprising: a memory and a processor, the memory and the processor connected;
the memory is used for storing programs;
the processor calls a program stored in the memory to perform the method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when executed by a computer, performs the method of any one of claims 1-7.
CN202110258067.5A 2021-03-09 2021-03-09 Material display method, device, electronic equipment and computer readable storage medium Active CN112949253B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110258067.5A CN112949253B (en) 2021-03-09 2021-03-09 Material display method, device, electronic equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN112949253A true CN112949253A (en) 2021-06-11
CN112949253B CN112949253B (en) 2024-06-11

Family

ID=76229095

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110258067.5A Active CN112949253B (en) 2021-03-09 2021-03-09 Material display method, device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112949253B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103824310A (en) * 2014-03-03 2014-05-28 网易(杭州)网络有限公司 Method for generating characters with special light effect
CN109493376A (en) * 2017-09-13 2019-03-19 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and electronic device
US20190347771A1 (en) * 2018-05-10 2019-11-14 Google Llc Generating and displaying blur in images
CN112074865A (en) * 2018-05-10 2020-12-11 谷歌有限责任公司 Generating and displaying blur in an image
CN108921927A (en) * 2018-06-12 2018-11-30 阿里巴巴集团控股有限公司 A kind of fireworks special efficacy implementation method, device and equipment based on particle
CN109603155A (en) * 2018-11-29 2019-04-12 网易(杭州)网络有限公司 Merge acquisition methods, device, storage medium, processor and the terminal of textures
CN111784811A (en) * 2020-06-01 2020-10-16 北京像素软件科技股份有限公司 Image processing method and device, electronic equipment and storage medium
CN111729307A (en) * 2020-07-30 2020-10-02 腾讯科技(深圳)有限公司 Virtual scene display method, device, equipment and storage medium
CN111951156A (en) * 2020-08-24 2020-11-17 杭州趣维科技有限公司 Method for drawing photoelectric special effect of graph
CN112258611A (en) * 2020-10-23 2021-01-22 北京字节跳动网络技术有限公司 Image processing method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
叶迎萍 (Ye Yingping): "Smooth transition technique between models based on shader LOD", Modular Machine Tool & Automatic Manufacturing Technique, no. 11, 20 November 2018 (2018-11-20), pages 1-3 *
周珍艮 (Zhou Zhengen): "Texture anti-aliasing method based on texture mapping and Gaussian filtering", Computer Engineering, no. 05, 5 March 2008 (2008-03-05), pages 207-209 *

Also Published As

Publication number Publication date
CN112949253B (en) 2024-06-11

Similar Documents

Publication Publication Date Title
US20120127198A1 (en) Selection of foreground characteristics based on background
US7688317B2 (en) Texture mapping 2-D text properties to 3-D text
CN107544730B (en) Picture display method and device and readable storage medium
CN105138317A (en) Window display processing method and device applied to terminal equipment
Pardo et al. Visualization of high dynamic range images
WO2022083250A1 (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
JP4950316B2 (en) DATA GENERATION DEVICE, DATA GENERATION METHOD, AND DATA GENERATION PROGRAM
CN114974148B (en) Font display enhancement method, device, equipment and storage medium for ink screen
Fabrikant Towards an Understanding of Geovisualization with Dynamic Displays: Issues and Prospects.
JP2006332908A (en) Color image display apparatus, color image display method, program, and recording medium
CN114399437A (en) Image processing method and device, electronic equipment and storage medium
Keil et al. Structural salience of landmark pictograms in maps as a predictor for object location memory performance
CN111311720B (en) Texture image processing method and device
CN107491289B (en) Window rendering method and device
CN113316018B (en) Method, device and storage medium for overlaying time information on video picture display
CN112949253A (en) Material display method and device, electronic equipment and computer readable storage medium
CN110880164A (en) Image processing method, device and equipment and computer storage medium
WO2015052514A2 (en) Rendering composites/layers for video animations
CN116385469A (en) Special effect image generation method and device, electronic equipment and storage medium
CN114428573B (en) Special effect image processing method and device, electronic equipment and storage medium
JP2015125543A (en) Line-of-sight prediction system, line-of-sight prediction method, and line-of-sight prediction program
CN109675312B (en) Game item list display method and device
CN110795096A (en) Slice layer special effect realization method based on openlayers
CN109803163B (en) Image display method and device and storage medium
CN117726722A (en) Special effect generation method, device and equipment for video image and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant