CN112138378A - Method, device and equipment for realizing flashing effect in 2D game and storage medium - Google Patents

Method, device and equipment for realizing flashing effect in 2D game and storage medium

Info

Publication number
CN112138378A
CN112138378A (application number CN202011001599.2A)
Authority
CN
China
Prior art keywords
coordinate data
pixel
light
random noise
flash effect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011001599.2A
Other languages
Chinese (zh)
Other versions
CN112138378B (en)
Inventor
张文斌
李妍
叶子莹
何磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202011001599.2A
Publication of CN112138378A
Application granted
Publication of CN112138378B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a method, an apparatus, a device and a storage medium for realizing a flashing effect in a 2D game. The method includes: acquiring initial UV coordinate data of each pixel on a display and coordinates of a light-emitting point; performing offset calculation on the initial UV coordinate data of each pixel according to the light-emitting point coordinates to obtain first UV coordinate data corresponding to each pixel; generating light emission coordinate data according to the first UV coordinate data and a random noise algorithm; generating mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates; and obtaining flash effect data according to the light emission coordinate data and the mask coordinate data, and displaying the flash effect on the graphical user interface according to the flash effect data. In this way the size of the game package is greatly reduced, the resource occupation and performance pressure on the terminal device are reduced, and game performance can be optimized.

Description

Method, device and equipment for realizing flashing effect in 2D game and storage medium
Technical Field
The present application relates to computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for implementing a flash effect in a 2D game.
Background
In many games, designers use special effects to improve the visual quality of the game and the user experience, and the flashing effect is one of the most frequently occurring special effects in game scenes.
In the prior art, the typical production process for a flash effect is that a designer draws a series of continuous flash sequence-frame atlases with a drawing tool, or downloads them from the internet, imports the atlases into a game engine, and produces the flash effect by combining the engine's particle system with the atlases.
Because this approach is based on sequence-frame atlases, the more special effects a game contains, the more atlas resources it requires. This inevitably makes the game package larger and larger, increases the memory occupied by the terminal device when running the game, and degrades game performance.
Disclosure of Invention
The application provides a method, an apparatus, a device and a storage medium for realizing a flashing effect in a 2D game, which realize a flashing effect that does not depend on a sequence-frame atlas, prevent the flashing effect from occupying excessive resources of the terminal device, and improve game performance.
In a first aspect, the present application provides a method for implementing a flash effect in a 2D game, where a graphical user interface is provided through a display of a terminal device, the method including:
acquiring initial UV coordinate data of each pixel on the display and coordinates of a light-emitting point;
performing offset calculation on the initial UV coordinate data of each pixel according to the light-emitting point coordinates to obtain first UV coordinate data corresponding to each pixel;
generating light emission coordinate data according to the first UV coordinate data and a random noise algorithm;
generating mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates;
and obtaining flash effect data according to the light emission coordinate data and the mask coordinate data, and displaying a flash effect on the graphical user interface according to the flash effect data.
In one possible implementation, the acquiring initial UV coordinate data of each pixel on the display includes:
acquiring UV coordinate data of a virtual display screen and screen resolution of the display, wherein the virtual display screen is a square display screen;
and calculating the initial UV coordinate data of each pixel on the display according to the UV coordinate data and the screen resolution.
In one possible implementation, the generating light emission coordinate data according to each of the first UV coordinate data and a random noise algorithm includes:
performing coordinate conversion on each first UV coordinate data by adopting a polar coordinate algorithm to obtain polar coordinate data corresponding to each pixel;
and calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain the light emission coordinate data.
In one possible implementation, the calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain the light emission coordinate data includes:
calculating by using a random noise algorithm according to the polar coordinate data and a preset first random noise parameter to obtain second UV coordinate data corresponding to each pixel;
calculating by using a random noise algorithm according to the initial UV coordinate data and a preset second random noise parameter to obtain third UV coordinate data corresponding to each pixel;
the preset random noise parameters comprise a preset first random noise parameter and a preset second random noise parameter;
and obtaining the light emission coordinate data according to the second UV coordinate data and the third UV coordinate data corresponding to each pixel.
In one possible implementation, obtaining the light emission coordinate data from the second UV coordinate data and the third UV coordinate data corresponding to each pixel includes:
obtaining fourth UV coordinate data corresponding to each pixel according to the second UV coordinate data corresponding to each pixel and a flash brightness parameter; the flash brightness parameter is used for controlling the brightness of the flash effect;
and obtaining the light emission coordinate data according to the fourth UV coordinate data and the third UV coordinate data corresponding to each pixel.
In one possible implementation, generating mask coordinate data according to the first UV coordinate data and the light emitting point coordinates includes:
calculating the distance between each pixel point on the display and the light-emitting point according to the first UV coordinate data and the light-emitting point coordinate;
and calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point.
In one possible implementation, the calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point includes:
subtracting a light spot size parameter from the distance between each pixel point and the light-emitting point to obtain target distance data; the light spot size parameter is used for controlling the range size of the flash effect;
and calculating to obtain the mask coordinate data according to the target distance data.
In a second aspect, the present application provides an apparatus for implementing a flash effect in a 2D game, wherein a graphical user interface is provided through a display of a terminal device, the apparatus comprising:
the acquisition module is used for acquiring initial UV coordinate data of each pixel on the display and coordinates of a light-emitting point;
the first processing module is used for performing offset calculation on the initial UV coordinate data of each pixel according to the light-emitting point coordinates to obtain first UV coordinate data corresponding to each pixel;
the second processing module is used for generating light emission coordinate data according to the first UV coordinate data and a random noise algorithm;
the third processing module is used for generating mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates;
and the display module is used for obtaining flash effect data according to the light emission coordinate data and the mask coordinate data and displaying the flash effect on the graphical user interface according to the flash effect data.
In one possible implementation manner, the obtaining module is configured to:
acquiring UV coordinate data of a virtual display screen and screen resolution of the display, wherein the virtual display screen is a square display screen;
and calculating the initial UV coordinate data of each pixel on the display according to the UV coordinate data and the screen resolution.
In one possible implementation, the second processing module is configured to:
performing coordinate conversion on each first UV coordinate data by adopting a polar coordinate algorithm to obtain polar coordinate data corresponding to each pixel;
and calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain the light emission coordinate data.
In one possible implementation, the second processing module is configured to:
calculating by using a random noise algorithm according to the polar coordinate data and a preset first random noise parameter to obtain second UV coordinate data corresponding to each pixel;
calculating by using a random noise algorithm according to the initial UV coordinate data and a preset second random noise parameter to obtain third UV coordinate data corresponding to each pixel;
the preset random noise parameters comprise a preset first random noise parameter and a preset second random noise parameter;
and obtaining the light emission coordinate data according to the second UV coordinate data and the third UV coordinate data corresponding to each pixel.
In one possible implementation, the second processing module is configured to:
obtaining fourth UV coordinate data corresponding to each pixel according to the second UV coordinate data corresponding to each pixel and a flash brightness parameter; the flash brightness parameter is used for controlling the brightness of the flash effect;
and obtaining the light emission coordinate data according to the fourth UV coordinate data and the third UV coordinate data corresponding to each pixel.
In one possible implementation manner, the third processing module is configured to:
calculating the distance between each pixel point on the display and the light-emitting point according to the first UV coordinate data and the light-emitting point coordinate;
and calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point.
In one possible implementation manner, the third processing module is configured to:
subtracting a light spot size parameter from the distance between each pixel point and the light-emitting point to obtain target distance data; the light spot size parameter is used for controlling the range size of the flash effect;
and calculating to obtain the mask coordinate data according to the target distance data.
In a third aspect, the present application provides an electronic device comprising a memory and a processor, the memory and the processor being connected;
the memory is used for storing a computer program;
the processor is adapted to carry out the method of any of the first aspects when the computer program is executed.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of the first aspect.
The application provides a method, an apparatus, a device and a storage medium for realizing a flash effect in a 2D game. In the method, the initial UV coordinate data of each pixel on a display and the coordinates of a light-emitting point are obtained, light emission coordinate data is generated with a random noise algorithm, and a mask is formed from mask coordinate data so as to partially cover the light emission image formed by the light emission coordinate data; a flash effect is thereby displayed on the graphical user interface. Because the flash effect data is obtained from the pixel coordinate data, the random noise and the mask coordinate data, and the flash effect is rendered from this data, the method does not depend on sequence-frame atlas resources, which greatly reduces the size of the game package, reduces the resource occupation and performance pressure on the terminal device, and allows game performance to be optimized.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a flowchart illustrating a method for implementing a flash effect in a 2D game according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for implementing a flash effect in a 2D game according to another embodiment of the present application;
fig. 3 is a diagram illustrating a coordinate data rendering effect according to an embodiment of the present application;
FIG. 4 is a diagram illustrating an effect of rendering coordinate data according to another embodiment of the present application;
FIG. 5 is a diagram illustrating an effect of rendering coordinate data according to another embodiment of the present application;
FIG. 6 is a noise map provided by an embodiment of the present application;
FIG. 7 is a diagram illustrating an effect of rendering coordinate data according to another embodiment of the present application;
FIG. 8 is a mask map provided by an embodiment of the present application;
FIG. 9 is a mask map provided by another embodiment of the present application;
FIG. 10 is a diagram of a flash effect provided by an embodiment of the present application;
FIG. 11 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 12 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 13 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 14 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 15 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 16 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 17 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 18 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 19 is a diagram of a flash effect provided by another embodiment of the present application;
FIG. 20 is a diagram of a flash effect provided by another embodiment of the present application;
fig. 21 is a schematic structural diagram of an apparatus for implementing a flash effect in a 2D game according to an embodiment of the present application;
fig. 22 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The method for realizing a flashing effect in a 2D game provided by the present application is used to display a flashing effect on the graphical user interface while a user runs a game on a terminal device. For example, a user operates a virtual character during a game to use a skill associated with a flashing effect; after the skill is used, the graphical user interface needs to display the flashing effect.
In the related art, the flash effect is realized on the basis of a series of flash sequence-frame atlases, which together form a continuously changing flash effect. In this mode, the more special effects a game contains, the more sequence-frame atlas resources it requires, which inevitably makes the game package larger and larger; the terminal device has to load excessive picture resources when running the game, its memory occupation grows, and game performance degrades.
To solve these problems, the application provides a method for realizing a flash effect that does not depend on a sequence-frame atlas: the initial UV coordinate data of each pixel on a display and the coordinates of a light-emitting point are acquired, light emission coordinate data is generated with a random noise algorithm, and a mask is formed from mask coordinate data so as to partially cover the light emission image formed by the light emission coordinate data; a flashing effect is thereby displayed on the graphical user interface. In addition, the flash brightness and the flash range can be adjusted by adjusting the parameters of the light emission coordinate data and the mask coordinate data, so that a dynamic flashing effect is realized.
The following describes in detail a method for implementing a flash effect in a 2D game according to an embodiment. It is to be understood that the following detailed description may be combined with certain embodiments, and that the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 1 is a flowchart illustrating a method for implementing a flash effect in a 2D game according to an embodiment of the present application. The execution subject of the method may be an apparatus for realizing a flash effect in a 2D game, implemented in software and/or hardware; the apparatus may be an electronic device. For example, the apparatus may be a terminal device running a game application, such as a mobile phone, a tablet computer or a personal computer, or it may be a game server. In the method of the application, a graphical user interface is provided through a display of the terminal device. The method of the embodiment comprises the following steps:
s101, acquiring initial UV coordinate data and luminous point coordinates of each pixel on a display.
In this embodiment, the flash may occur in a fixed scene preset in the game or as the special effect of a preset action, or it may be triggered by a user operation. For example, a user operates a virtual character to use a skill during a game, and the skill corresponds to a flash effect; after the user uses the skill, the graphical user interface needs to display the flash effect.
Correspondingly, the light-emitting point may be a preset position in the game scene, or may be any position on the display triggered by the user operation, which is not limited in this embodiment.
In the method, acquiring the initial UV coordinate data of each pixel on the display comprises the following steps: acquiring UV coordinate data of a virtual display screen and the screen resolution of the display, wherein the virtual display screen is, in one example, a square display screen; and calculating the initial UV coordinate data of each pixel on the display according to the UV coordinate data of the virtual display screen and the screen resolution.
S102, respectively carrying out offset calculation on the initial UV coordinate data of each pixel according to the coordinates of the light-emitting points to obtain first UV coordinate data corresponding to each pixel.
The initial UV coordinate data of each pixel on the display is two-dimensional coordinate data whose origin (0, 0) is the vertex at the lower left corner of the display. In a flash effect application scene, the flash effect diffuses outward with the light-emitting point as its center, so the light-emitting point needs to serve as the origin; the initial UV coordinate data is therefore offset according to the light-emitting point coordinates to obtain the first UV coordinate data.
And S103, generating light ray emission coordinate data according to the first UV coordinate data and a random noise algorithm.
The random noise algorithm may be any algorithm that randomly generates noise values from a two-dimensional input, with its parameters set as needed. Performing random noise calculation on the first UV coordinate data with the random noise algorithm yields light emission coordinate data whose corresponding grayscale image shows light emitted from the light-emitting point toward the periphery.
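The patent does not name a concrete random noise algorithm. Purely as an illustration, the sketch below uses a sine-hash pseudo-random noise over a two-dimensional input, a common shader idiom; the function name, the hash constants and the meaning of the scale parameter are assumptions, not values given by the application.

```python
import math

def random_noise(p, scale):
    """Hash-style pseudo-random value in [0, 1) for a 2D input p = (x, y).

    scale plays the role of the preset random noise parameter: a larger
    value produces denser, higher-frequency noise. The constants are the
    usual sine-hash numbers, not values given by the patent.
    """
    x, y = p[0] * scale, p[1] * scale
    h = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return h - math.floor(h)   # fractional part
```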
Specifically, generating the light emission coordinate data according to each first UV coordinate data and the random noise algorithm may include: respectively carrying out coordinate conversion on each first UV coordinate data by adopting a polar coordinate algorithm to obtain polar coordinate data corresponding to each pixel; and calculating according to the polar coordinate data, the preset random noise parameters and a random noise algorithm to obtain the light emission coordinate data.
Optionally, calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain light emission coordinate data, including:
calculating by using a random noise algorithm according to the polar coordinate data and a preset first random noise parameter to obtain second UV coordinate data corresponding to each pixel; calculating by using a random noise algorithm according to the initial UV coordinate data and a preset second random noise parameter to obtain third UV coordinate data corresponding to each pixel; the preset random noise parameters comprise the preset first random noise parameter and the preset second random noise parameter; and obtaining the light emission coordinate data according to the second UV coordinate data and the third UV coordinate data corresponding to each pixel.
Optionally, calculating the light emission coordinate data from the second UV coordinate data and the third UV coordinate data corresponding to each pixel includes: obtaining fourth UV coordinate data corresponding to each pixel according to the second UV coordinate data corresponding to each pixel and a flash brightness parameter, the flash brightness parameter being used for controlling the brightness of the flash effect; and obtaining the light emission coordinate data according to the fourth UV coordinate data and the third UV coordinate data corresponding to each pixel.
And S104, generating mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates.
The mask image corresponding to the mask coordinate data is used for partially covering the light emission image corresponding to the light emission coordinate data, so that a flash area corresponding to a flash effect and a black area outside the flash area are formed.
Specifically, generating the mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates includes: calculating the distance between each pixel point on the display and the light-emitting point according to the first UV coordinate data and the light-emitting point coordinates; and calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point.
Optionally, calculating mask coordinate data according to a distance between each pixel point and the light emitting point, including: subtracting the size parameter of the light spot from the distance between each pixel point and the light-emitting point to obtain target distance data; the light spot size parameter is used for controlling the range size of the flash effect; and calculating to obtain the mask coordinate data according to the target distance data.
And S105, obtaining the flash effect data according to the light emission coordinate data and the shade coordinate data, and displaying the flash effect on the graphical user interface according to the flash effect data.
After the light emission coordinate data is processed with different mask coordinate data, the resulting flash effect data correspond to flash areas of different sizes.
According to the method, the initial UV coordinate data of each pixel on the display and the coordinates of the light-emitting point are obtained, light emission coordinate data is generated with a random noise algorithm, and a mask is formed from the mask coordinate data so as to partially cover the light emission image formed by the light emission coordinate data; a flash effect is thereby displayed on the graphical user interface. This realizes a flash effect that does not depend on a sequence-frame atlas, prevents the flash effect from occupying too many resources of the terminal device, and improves game performance.
Each step is explained in further detail below on the basis of the above embodiment.
Fig. 2 is a flowchart illustrating a method for implementing a flash effect in a 2D game according to another embodiment of the present application. Specifically, as shown in fig. 2, the method includes:
s201, acquiring UV coordinate data of the virtual display screen and screen resolution of the display.
The virtual display screen is a square display screen, namely the UV coordinate data of the virtual display screen obtained in the step is the UV coordinate data of a square patch.
S202, calculating initial UV coordinate data of each pixel on the display according to the UV coordinate data and the screen resolution.
Since the display screen of a terminal device for running a game application is not usually square, the UV coordinate data of the virtual display screen needs to be adapted to the full screen to obtain the initial UV coordinate data of each pixel.
Specifically, the initial UV coordinate data is calculated according to the UV coordinate data of the virtual display screen and the display screen resolution, in the following manner:
uv_x = uv_xi × (ScreenParams_x / ScreenParams_x)
uv_y = uv_yi × (ScreenParams_y / ScreenParams_x)
where uv_xi is the U coordinate in the UV coordinate data of the virtual display screen; uv_yi is the V coordinate in the UV coordinate data of the virtual display screen; uv_x is the U coordinate in the initial UV coordinate data; uv_y is the V coordinate in the initial UV coordinate data; ScreenParams_x is the resolution width of the display screen of the terminal device; and ScreenParams_y is the resolution height of the display screen of the terminal device. If the initial UV coordinate data is rendered on the graphical display interface of the terminal device, the graph shown in Fig. 3 is obtained.
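A minimal sketch of the formulas above, assuming the resolution is given in pixels; the function name and the tuple layout are illustrative, not from the patent.

```python
def initial_uv(uv_i, screen_w, screen_h):
    """Adapt the square virtual screen UV to the real display (a sketch of S201-S202).

    uv_i: (u, v) of the square virtual display screen, each in [0, 1].
    screen_w, screen_h: resolution width and height of the real display.
    Following the formulas above, both axes are normalised against the
    resolution width: the U factor ScreenParams_x / ScreenParams_x is 1,
    so only V is rescaled by the aspect ratio.
    """
    u = uv_i[0]                          # × (screen_w / screen_w) == × 1
    v = uv_i[1] * (screen_h / screen_w)  # scale V by height / width
    return (u, v)
```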
And S203, respectively carrying out offset calculation on the initial UV coordinate data of each pixel according to the coordinates of the light-emitting points to obtain first UV coordinate data corresponding to each pixel.
The initial UV coordinate data is a two-dimensional coordinate system with the lower left corner of the graph as shown in fig. 3 as the origin (0, 0). In the application scene of the flash effect, the flash effect is diffused to the periphery by taking the luminous point as a central point, so that the luminous point of the flash effect is required to be taken as an original point, and the initial UV coordinate data is subjected to offset calculation according to the luminous point coordinate to obtain first UV coordinate data.
For example, (0.5, 0.5) is subtracted from the initial UV coordinate data, which moves the origin of the resulting first UV coordinate data to the center of the screen.
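A hedged illustration of S203: the light-emitting point coordinates are subtracted from the initial UV coordinates. The function name and the default light point (the screen centre of the example above) are assumptions.

```python
def offset_uv(uv, light_point=(0.5, 0.5)):
    """Move the UV origin to the light-emitting point (S203).

    With the default (0.5, 0.5) the origin becomes the screen centre,
    matching the example above; any other light-emitting point given in
    the same UV space can be passed instead.
    """
    return (uv[0] - light_point[0], uv[1] - light_point[1])
```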
And S204, performing coordinate conversion on each first UV coordinate data by adopting a polar coordinate algorithm to obtain polar coordinate data corresponding to each pixel.
The polar coordinate algorithm may be any prior-art algorithm for converting UV coordinates to polar coordinates. Its input is the first UV coordinate data, a two-dimensional array, and the polar coordinate data it outputs is a one-dimensional value. The polar coordinate data is filled into a three-dimensional array, and the three-dimensional array is rendered as RGB colors on the graphical user interface to obtain the graph shown in Fig. 4.
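The description only states that the polar conversion maps the two-dimensional first UV coordinates to a single value per pixel. One plausible reading, sketched below as an assumption, is the normalised polar angle around the light-emitting point, which makes the subsequent noise vary per direction and would produce the radial pattern of Fig. 4.

```python
import math

def polar_value(uv):
    """One possible polar conversion: the angle of the offset UV, remapped to [0, 1].

    uv is the first UV coordinate data of one pixel (origin at the
    light-emitting point). atan2 gives the angle in [-pi, pi], which is
    then remapped to [0, 1] so it can be rendered as a grayscale value.
    """
    angle = math.atan2(uv[1], uv[0])
    return angle / (2.0 * math.pi) + 0.5
```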
And S205, calculating according to the polar coordinate data, the preset random noise parameters and a random noise algorithm to obtain the light emission coordinate data.
The polar coordinate data, i.e. the one-dimensional value obtained in S204, is duplicated into two-dimensional data (one-dimensional value, one-dimensional value) and used as one parameter of the random noise algorithm, while a preset first random noise parameter is used as its second parameter. For example, if the first random noise parameter is set to 30, the output of the random noise algorithm is the second UV coordinate data, which is also a one-dimensional value. Filling this one-dimensional value into three-dimensional RGB colors yields the graph shown in Fig. 5 of light rays emitted from a central point toward the periphery, where the central point is the position of the light-emitting point coordinates. At this point the second UV coordinate data can already be regarded as the light emission coordinate data.
In addition to directly using the second UV coordinate data as the light emission coordinate data, in this step, the second UV coordinate data may be further processed after being obtained, so as to obtain the light emission coordinate data. The method specifically comprises the following steps:
calculating by using a random noise algorithm according to the initial UV coordinate data and a preset second random noise parameter to obtain third UV coordinate data corresponding to each pixel; and obtaining the light emission coordinate data according to the second UV coordinate data and the third UV coordinate data corresponding to each pixel.
The two-dimensional initial UV coordinate data is used as one parameter of the random noise algorithm, and a preset second random noise parameter is used as its second parameter; for example, the second random noise parameter may be set to 12. The output of the random noise algorithm is the third UV coordinate data, which is also a one-dimensional value. Filling this one-dimensional value into three-dimensional RGB colors yields the noise map shown in Fig. 6.
The second UV coordinate data and the third UV coordinate data are multiplied to obtain the light emission coordinate data. Filling the resulting light emission coordinate data into three-dimensional RGB colors yields the graph of irregularly dark and bright rays shown in Fig. 7, which makes the brightness of the rays look more natural and random.
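A hedged sketch of S205 as a whole, reusing the random_noise and polar_value sketches above: the one-dimensional polar value is duplicated into a two-dimensional input and hashed with the first noise parameter (30 in the example), the initial UV is hashed with the second parameter (12 in the example), and the two results are multiplied.

```python
def light_emission(uv_initial, uv_first, noise1=30.0, noise2=12.0):
    """Per-pixel light emission value for S205 (an illustrative sketch)."""
    p = polar_value(uv_first)                    # one-dimensional polar value
    second_uv = random_noise((p, p), noise1)     # radial rays, cf. Fig. 5
    third_uv = random_noise(uv_initial, noise2)  # background noise, cf. Fig. 6
    return second_uv * third_uv                  # irregular bright/dark rays, cf. Fig. 7
```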
S206, calculating the distance between each pixel point and the light-emitting point on the display according to the first UV coordinate data and the light-emitting point coordinates, and calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point.
Specifically, the distance from the first UV coordinate data of each pixel point to the light-emitting point is subtracted from a preset value to obtain the mask coordinate data. Illustratively, the preset value is 1.
mask = 1 - sqrt(uv_xj × uv_xj + uv_yj × uv_yj)
where mask is the mask coordinate data; uv_xj is the U coordinate in the first UV coordinate data; uv_yj is the V coordinate in the first UV coordinate data; and sqrt(uv_xj × uv_xj + uv_yj × uv_yj) is the distance from the first UV coordinate data of each pixel point to the light-emitting point.
Rendering the mask coordinate data may result in a mask map as shown in fig. 8.
Optionally, the mask coordinate data may be further subjected to a remapping (remap) calculation to obtain new mask coordinate data, so that the boundary between the black and white regions of the new mask image becomes sharper, as shown for example in Fig. 9.
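A sketch of S206 following the formula above. The remap is shown as a simple clamp-and-rescale that sharpens the black-white boundary; the remap thresholds (0.2 and 0.8) are illustrative assumptions, since the application does not give concrete remap values.

```python
def mask_value(uv_first, remap=True):
    """Mask value of one pixel from its first UV coordinates (S206)."""
    dist = math.sqrt(uv_first[0] ** 2 + uv_first[1] ** 2)  # distance to the light point
    m = 1.0 - dist
    if remap:
        # rescale [0.2, 0.8] to [0, 1] and clamp, giving a crisper boundary (cf. Fig. 9)
        m = min(max((m - 0.2) / 0.6, 0.0), 1.0)
    return m
```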
And S207, multiplying the light emission coordinate data and the mask coordinate data to obtain flash effect data, and displaying the flash effect on the graphical user interface according to the final flash effect data.
The result obtained by multiplying the mask coordinate data obtained in S206 or the remapped new mask coordinate data with the light emission coordinate data is flash effect data, and the flash effect data is rendered, so that the flash effect map shown in fig. 10 can be displayed on the graphical user interface.
It can be seen that, through the above steps, a flash image can be displayed on the graphical user interface of the terminal device purely by the algorithm. How to display a changing flash effect is explained below.
In S205, when the light emission coordinate data is obtained from the second UV coordinate data and the third UV coordinate data corresponding to each pixel, the brightness of the flash effect is controlled by a flash brightness parameter, specifically: obtaining fourth UV coordinate data corresponding to each pixel according to the second UV coordinate data corresponding to each pixel and the flash brightness parameter; and obtaining the light emission coordinate data according to the fourth UV coordinate data and the third UV coordinate data corresponding to each pixel.
The flash brightness parameter is added to each value in the second UV coordinate data to obtain the fourth UV coordinate data, and the fourth UV coordinate data is then multiplied by the third UV coordinate data to obtain the light emission coordinate data.
The value of the flash brightness parameter can be set freely. When the flash effect is realized, the flash brightness parameter is controlled to change slowly from -0.5 to 1 over time (or from 1 to 0), and the steps of the embodiment shown in Fig. 2 are executed repeatedly, so that the effect of changing flash brightness is displayed. For example, Figs. 11, 12, 13, 14 and 15 are the flash effect graphs generated when the flash brightness parameter is -0.5, 0, 0.3, 0.6 and 0.8, respectively.
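A hedged sketch of the brightness control described above: the flash brightness parameter is added to the second UV coordinate data before the multiplication, and its value is then animated over time (for example from -0.5 to 1). The function reuses the earlier sketches; the names are assumptions.

```python
def light_emission_with_brightness(uv_initial, uv_first, brightness,
                                   noise1=30.0, noise2=12.0):
    """S205 with the flash brightness parameter folded in."""
    p = polar_value(uv_first)
    second_uv = random_noise((p, p), noise1)
    fourth_uv = second_uv + brightness           # fourth UV coordinate data
    third_uv = random_noise(uv_initial, noise2)
    return fourth_uv * third_uv                  # light emission value
```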
In S206, when calculating the mask coordinate data according to the distance between each pixel point and the light emitting point, the method controls the range size of the flash effect according to the light spot size parameter, and specifically includes: subtracting the size parameter of the light spot from the distance between each pixel point and the light-emitting point to obtain target distance data; and obtaining mask coordinate data according to the target distance data.
The distance between each pixel and the light-emitting point is calculated by the method in S206, and then the light-spot size parameter is subtracted from the distance between each pixel and the light-emitting point to obtain target distance data, and then the target distance data is subtracted from the preset value to obtain mask coordinate data.
The light spot size parameter can be set as required. To make the light grow from small to large (from a point to the full screen) or shrink from large to small, the light spot size parameter is controlled to change slowly from -1 to 1 over time, or from 1 to -1, while the steps of the embodiment shown in Fig. 2 are executed repeatedly. The effect of a changing flash size is thus displayed: the light expands from a point to the full screen, or the full-screen light shrinks until no light remains, which embodies the characteristic of a flash. For example, Figs. 16, 17, 18, 19 and 20 are the graphs obtained when the light spot size parameter is -1, -0.7, -0.4, 0.2 and 1, respectively.
When the flash effect is realized, the steps of the embodiment shown in Fig. 2 are executed repeatedly, and the flash brightness parameter and the light spot size parameter are controlled to increase or decrease together with the system time; for example, both are increased from -1 to 1 within 0.5 seconds, which produces the effect of a full-screen flash.
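Putting the pieces together, the per-pixel sketch below ramps both the flash brightness parameter and the light spot size parameter from -1 to 1 over 0.5 seconds, as described above, and combines the light emission value with the mask. It reuses the helper sketches defined earlier in this description; the frame-time handling and parameter ramps are illustrative only.

```python
def flash_pixel(uv_i, screen_w, screen_h, light_point, t, duration=0.5):
    """Grayscale flash value for one pixel at time t seconds after the flash starts."""
    k = min(max(t / duration, 0.0), 1.0)
    brightness = -1.0 + 2.0 * k          # flash brightness parameter, -1 -> 1
    spot_size = -1.0 + 2.0 * k           # light spot size parameter, -1 -> 1

    uv0 = initial_uv(uv_i, screen_w, screen_h)   # S201-S202
    uv1 = offset_uv(uv0, light_point)            # S203

    emission = light_emission_with_brightness(uv0, uv1, brightness)  # S204-S205

    dist = math.sqrt(uv1[0] ** 2 + uv1[1] ** 2)
    mask = 1.0 - (dist - spot_size)      # S206 with the light spot size parameter
    mask = min(max(mask, 0.0), 1.0)

    return emission * mask               # S207: flash effect value
```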
The method of this embodiment realizes the flash effect through an algorithm and does not depend on sequence-frame atlas resources, which greatly reduces the size of the game package, reduces the resource occupation and performance pressure on the terminal device, and also allows game performance to be optimized. By changing parameters, the method can adjust the density and intensity of the light, make the light grow gradually from no light to full-screen white light, or conversely shrink the light back to no light; the light-emitting position can be adjusted at will; the method is applicable to screens of any resolution without concern for screen adaptation, needs no texture maps, and is easy to implement.
Fig. 21 is a schematic structural diagram of an apparatus for implementing a flash effect in a 2D game according to an embodiment of the present application. A graphical user interface is rendered on a display of the terminal device by executing the game application. The apparatus 210 comprises:
an obtaining module 2101, configured to obtain initial UV coordinate data of each pixel on the display and coordinates of a light-emitting point;
a first processing module 2102, configured to perform offset calculation on the initial UV coordinate data of each pixel according to the light-emitting point coordinates, to obtain first UV coordinate data corresponding to each pixel;
a second processing module 2103, configured to generate light emission coordinate data according to the first UV coordinate data and a random noise algorithm;
a third processing module 2104, configured to generate mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates;
and a display module 2105, configured to obtain flash effect data according to the light emission coordinate data and the mask coordinate data and display the flash effect on the graphical user interface according to the flash effect data.
In one possible implementation, the obtaining module 2101 is configured to:
acquiring UV coordinate data of a virtual display screen and screen resolution of a display, wherein the virtual display screen is a square display screen;
based on the UV coordinate data and the screen resolution, initial UV coordinate data for each pixel on the display is calculated.
In one possible implementation, the second processing module 2103 is configured to:
respectively carrying out coordinate conversion on each first UV coordinate data by adopting a polar coordinate algorithm to obtain polar coordinate data corresponding to each pixel;
and calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain the light emission coordinate data.
In one possible implementation, the second processing module 2103 is configured to:
calculating by using a random noise algorithm according to the polar coordinate data and a preset first random noise parameter to obtain second UV coordinate data corresponding to each pixel;
calculating by using a random noise algorithm according to the initial UV coordinate data and a preset second random noise parameter to obtain third UV coordinate data corresponding to each pixel;
the preset random noise parameters comprise a preset first random noise parameter and a preset second random noise parameter;
and obtaining the light emission coordinate data according to the second UV coordinate data and the third UV coordinate data corresponding to each pixel.
In one possible implementation, the second processing module 2103 is configured to:
obtaining fourth UV coordinate data corresponding to each pixel according to the second UV coordinate data corresponding to each pixel and a flash brightness parameter; the flash brightness parameter is used for controlling the brightness of the flash effect;
and obtaining the light emission coordinate data according to the fourth UV coordinate data and the third UV coordinate data corresponding to each pixel.
In one possible implementation, the third processing module 2104 is configured to:
calculating the distance between each pixel point and the light-emitting point on the display according to the first UV coordinate data and the light-emitting point coordinate;
and calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point.
In one possible implementation, the third processing module 2104 is configured to:
subtracting the size parameter of the light spot from the distance between each pixel point and the light-emitting point to obtain target distance data; the light spot size parameter is used for controlling the range size of the flash effect;
and calculating to obtain the mask coordinate data according to the target distance data.
The apparatus provided by this embodiment may be used to implement the method for realizing a flash effect in a 2D game in the above method embodiments; the implementation principle and technical effect are similar and are not described here again.
Fig. 22 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 22, the electronic device 220 includes a memory 2201 and a processor 2202, the memory 2201 and the processor 2202 being connected by a bus 2203.
The memory 2201 is used to store computer programs.
The processor 2202 is arranged to implement the methods in the above-described method embodiments when the computer program is executed.
The present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the above-described method embodiments.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.
In the present application, the terms "include" and variations thereof may refer to non-limiting inclusions; the term "or" and variations thereof may mean "and/or". The terms "first," "second," and the like in this application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. In the present application, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A method for realizing a flashing effect in a 2D game is characterized in that a graphical user interface is provided through a display of a terminal device, and the method comprises the following steps:
acquiring initial UV coordinate data of each pixel on the display and coordinates of a light-emitting point;
performing offset calculation on the initial UV coordinate data of each pixel according to the light-emitting point coordinates to obtain first UV coordinate data corresponding to each pixel;
generating light emission coordinate data according to the first UV coordinate data and a random noise algorithm;
generating mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates;
and obtaining flash effect data according to the light emission coordinate data and the mask coordinate data, and displaying a flash effect on the graphical user interface according to the flash effect data.
2. The method of claim 1, wherein said obtaining initial UV coordinate data for each pixel on the display comprises:
acquiring UV coordinate data of a virtual display screen and screen resolution of the display, wherein the virtual display screen is a square display screen;
and calculating the initial UV coordinate data of each pixel on the display according to the UV coordinate data and the screen resolution.
3. The method of claim 1, wherein generating light emission coordinate data based on each of the first UV coordinate data and a random noise algorithm comprises:
performing coordinate conversion on each first UV coordinate data by adopting a polar coordinate algorithm to obtain polar coordinate data corresponding to each pixel;
and calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain the light emission coordinate data.
4. The method of claim 3, wherein the calculating according to the polar coordinate data, a preset random noise parameter and a random noise algorithm to obtain the light emission coordinate data comprises:
calculating by using a random noise algorithm according to the polar coordinate data and a preset first random noise parameter to obtain second UV coordinate data corresponding to each pixel;
calculating by using a random noise algorithm according to the initial UV coordinate data and a preset second random noise parameter to obtain third UV coordinate data corresponding to each pixel;
the preset random noise parameters comprise a preset first random noise parameter and a preset second random noise parameter;
and obtaining the light emission coordinate data according to the second UV coordinate data and the third UV coordinate data corresponding to each pixel.
5. The method of claim 4, wherein obtaining the light emission coordinate data from the second UV coordinate data and the third UV coordinate data corresponding to each pixel comprises:
obtaining fourth UV coordinate data corresponding to each pixel according to the second UV coordinate data corresponding to each pixel and a flash brightness parameter; the flash brightness parameter is used for controlling the brightness of a flash effect;
and obtaining the light emission coordinate data according to the fourth UV coordinate data and the third UV coordinate data corresponding to each pixel.
6. The method according to any one of claims 1-4, wherein said generating mask coordinate data from said first UV coordinate data and said light emission point coordinates comprises:
calculating the distance between each pixel point on the display and the light-emitting point according to the first UV coordinate data and the light-emitting point coordinate;
and calculating the mask coordinate data according to the distance between each pixel point and the light-emitting point.
7. The method of claim 6, wherein said calculating said mask coordinate data based on a distance between said each pixel point and said light emission point comprises:
subtracting a light spot size parameter from the distance between each pixel point and the light-emitting point to obtain target distance data; the light spot size parameter is used for controlling the range size of the flash effect;
and calculating to obtain the mask coordinate data according to the target distance data.
8. An apparatus for implementing a flash effect in a 2D game, wherein a graphical user interface is provided through a display of a terminal device, the apparatus comprising:
the acquisition module is used for acquiring initial UV coordinate data of each pixel on the display and coordinates of a light-emitting point;
the first processing module is used for performing offset calculation on the initial UV coordinate data of each pixel according to the light-emitting point coordinates to obtain first UV coordinate data corresponding to each pixel;
the second processing module is used for generating light emission coordinate data according to the first UV coordinate data and a random noise algorithm;
the third processing module is used for generating mask coordinate data according to the first UV coordinate data and the light-emitting point coordinates;
and the display module is used for obtaining flash effect data according to the light emission coordinate data and the mask coordinate data and displaying the flash effect on the graphical user interface according to the flash effect data.
9. An electronic device comprising a memory and a processor, the memory and the processor being connected;
the memory is used for storing a computer program;
the processor is adapted to carry out the method of any one of claims 1-7 when the computer program is executed.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202011001599.2A 2020-09-22 2020-09-22 Method, device, equipment and storage medium for realizing flash effect in 2D game Active CN112138378B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011001599.2A CN112138378B (en) 2020-09-22 2020-09-22 Method, device, equipment and storage medium for realizing flash effect in 2D game

Publications (2)

Publication Number Publication Date
CN112138378A 2020-12-29
CN112138378B 2024-08-13

Family

ID=73893608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011001599.2A Active CN112138378B (en) 2020-09-22 2020-09-22 Method, device, equipment and storage medium for realizing flash effect in 2D game

Country Status (1)

Country Link
CN (1) CN112138378B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1656518A (en) * 2002-05-21 2005-08-17 科乐美股份有限公司 Three dimensional image processing program, three dimensional image processing method, and video game device
WO2016056317A1 (en) * 2014-10-08 2016-04-14 ソニー株式会社 Information processor and information-processing method
CN106815883A (en) * 2016-12-07 2017-06-09 珠海金山网络游戏科技有限公司 The hair treating method and system of a kind of game role
WO2018176185A1 (en) * 2017-03-27 2018-10-04 中国科学院深圳先进技术研究院 Texture synthesis method, and device for same
CN108701372A (en) * 2017-05-19 2018-10-23 华为技术有限公司 A kind of image processing method and device
CN110298327A (en) * 2019-07-03 2019-10-01 北京字节跳动网络技术有限公司 A kind of visual effect processing method and processing device, storage medium and terminal
CN110458922A (en) * 2019-08-14 2019-11-15 深圳市商汤科技有限公司 Method for rendering graph and Related product

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114399425A (en) * 2021-12-23 2022-04-26 北京字跳网络技术有限公司 Image processing method, video processing method, device, equipment and medium
CN114399425B (en) * 2021-12-23 2024-08-06 北京字跳网络技术有限公司 Image processing method, video processing method, device, equipment and medium
CN115423684A (en) * 2022-08-23 2022-12-02 成都智元汇信息技术股份有限公司 Method and device for locally amplifying packed picture in rows by using RGB array and display

Also Published As

Publication number Publication date
CN112138378B (en) 2024-08-13

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant