CN115382203A - Internet game image processing system - Google Patents


Info

Publication number
CN115382203A
Authority
CN
China
Prior art keywords
game
scene
module
int
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210666322.4A
Other languages
Chinese (zh)
Inventor
陈曙
龚勋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Jedi Technology Co ltd
Original Assignee
Hangzhou Jedi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Jedi Technology Co ltd filed Critical Hangzhou Jedi Technology Co ltd
Priority to CN202210666322.4A
Publication of CN115382203A
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an internet game image processing system comprising: a game environment perception module that computes data from the simulated time, simulated lighting effects and weather of a game scene; a model generation module that automatically calculates and repairs image, exposure and color as the generated game character model moves through the scene; and an image processing module that acquires simulation data describing the character model's movement in the scene and, combined with the computation of the game environment perception module, restores the rendered three-dimensional game image to optimal quality. The system addresses image-quality defects such as color and shadow errors caused by the computational load of dynamic scenes, preserves image quality and the legibility of character input during play, and improves the gaming experience.

Description

Internet game image processing system
Technical Field
The invention relates to the technical field of image processing, in particular to an internet game image processing system.
Background
With the rise of electronic sports in recent years and the resulting boom in online gaming, public perception of games has shifted away from traditional attitudes. The continued influx of capital into the game industry has accelerated its growth, particularly for three-dimensional games, and has raised players' expectations for image quality and the visual impact it delivers.
Publication no. CN108601976B, published 2021-06-15, discloses an image processing system for a game in which a character and other objects corresponding to a user move through a virtual world to generate various event processes. The system comprises: a game progress processing unit that generates the event processes and advances the game; a real map storage unit that stores real map information including real-world geographic information; a virtual map information generation unit that, as the game progresses, generates virtual map information including coordinate information of the objects on virtual geographic information corresponding to the real map information; a position information acquisition unit that selects a coordinate position in the real world; a real display data generation unit that generates real display data indicating that coordinate position on the real map information; a virtual display data generation unit that renders the character on the virtual map information at the corresponding coordinate position; and a display control unit that displays both sets of display data, displays one selected set, or displays one set with part of it superimposed on the other. The virtual display data generation unit performs a fade-out process as follows: a plurality of low-resolution maps, whose scale is reduced in steps and whose coverage is enlarged, are arranged concentrically around a large-scale detailed map centered on a predetermined coordinate position, and one image gradually transitions into the other at the boundary between the detailed map and the low-resolution maps and at the boundaries between the low-resolution maps themselves.
In the prior art, including the above patent, the computational load of large three-dimensional games is enormous, so development teams focus on smooth operation and scenario design, while automatic computation of light and color within the game scene is comparatively neglected. When a picture runs at a high frame rate, shadow artifacts and poor color reproduction can result.
Disclosure of Invention
It is an object of the present invention to provide an internet game image processing system for solving the above-mentioned problems.
In order to achieve the above purpose, the invention provides the following technical scheme: an internet game image processing system comprising:
the game environment perception module is used for calculating data according to the simulation time of a game scene, the simulation light effect and the weather;
the model generation module is used for automatically calculating and repairing images, exposure and colors according to the movement of the generated game character model in the scene;
and the image processing module is used for acquiring simulation data information of the movement of the game character model in the scene and restoring the presented game three-dimensional image to the optimal image quality by combining the calculation amount of the game environment perception module.
Preferably, the game environment perception module identifies a static scene to determine whether it contains an illumination angle, an illumination intensity and cast shadows, specifically:
a plurality of boundary lines contained in the static scene are determined, and each pair of adjacent boundary lines is taken as a candidate sub-region; the XYZ value of every pixel in each candidate sub-region is acquired; if the XYZ values of a sub-region change, the positions of those changes are extracted, and all candidate sub-regions are traversed to determine the change position of each; if the width values corresponding to the change positions of all sub-regions vary linearly, the scene is determined to contain a sun shadow, otherwise it is not.
Preferably, the color produced by the XYZ model is expressed as C = X(X) + Y(Y) + Z(Z), where (X), (Y), (Z) are the primary color quantities of the XYZ color model and X, Y, Z are the tristimulus scaling coefficients. The RGB-to-XYZ transformation is:
[(X)]   [2.7689 1.7517 1.1302] [R]
[(Y)] = [1.0000 4.5907 0.0601] [G]
[(Z)]   [0.0000 0.0565 5.5943] [B]
Preferably, the game environment perception module includes dynamic scene recognition, which processes the shadow and light color difference data of the game character currently moving through the scene, specifically:
if the plurality of width values change linearly, an initial linear function of the width values, f(x) = k2*x + b0, is computed, and the linear function of the width value at the next instant t is f_t(x) = k2*x + b0 + k1*t, where k2 is the slope of the initial linear function, k1 is the coefficient of variation, t is the game time, and b0 is the offset of the initial linear function.
Preferably, the dynamic scene recognition includes a color difference rendering module that marks each pixel covered by a distinguishing scene in the exposed game image as 1, a distinguishing scene marking module that marks each pixel outside the distinguishing scene as 0, and a distinguishing scene drawing module that re-renders the regions of the exposed game image whose pixels are marked 0.
Preferably, the color difference rendering module uses a graphics initialization function to set the display to VGA high-resolution graphics mode.
Preferably, the color difference rendering module can set the fill mode of the floodfill function using the fill-style function, whose prototype is void far setfillstyle(int pattern, int color);
Preferably, the dynamic scene recognition further comprises character output in scene mode; in graphics mode a dedicated character output function is preferably used:
1. a character output function, void far outtext(char *str); which outputs at the current position;
2. a function that sets the output font, size and direction, void far settextstyle(int font, int direction, int charsize); the fonts are defined in graphics.h, the default being the hardware-determined 8x8 dot-matrix font; direction 0 denotes horizontal output and direction 1 denotes vertical output; charsize takes values from 0 to 10;
3. a character output positioning function, void far outtextxy(int x, int y, char *str);
an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the internet game image processing system of the above aspect when executing the program.
A computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the internet game image processing system of the above-described aspect.
In the above technical solution, the internet game image processing system provided by the invention has the following beneficial effects: the system automatically calculates, processes and optimizes scene parameters, namely the color and shadow of the character model in both dynamic and static scenes, and automatically optimizes character input in those scenes. Image quality and the legibility of character input are maintained during play, and the gaming experience is improved.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An internet game image processing system, comprising:
the game environment perception module is used for calculating data according to game scene simulation time, simulation light effect and weather;
the model generation module is used for automatically calculating and repairing images, exposure and colors according to the movement of the generated game character model in the scene;
and the image processing module is used for acquiring simulation data information of the movement of the game character model in the scene and restoring the presented game three-dimensional image to the optimal image quality by combining the calculation amount of the game environment perception module.
Specifically, in the above embodiment, the game environment perception module identifies a static scene to determine whether it contains an illumination angle, an illumination intensity and cast shadows, specifically:
a plurality of boundary lines contained in the static scene are determined, and each pair of adjacent boundary lines is taken as a candidate region; the XYZ value of every pixel in each candidate region is acquired; if the XYZ values of a region change, the positions of those changes are extracted, and all candidate regions are traversed to determine the change position of each; if the width values corresponding to the change positions of all regions vary linearly, the scene is determined to contain a sun shadow, otherwise it is not.
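The linearity test described above can be sketched in C; the helper name is_linear and the tolerance parameter are illustrative assumptions, not taken from the patent:

```c
#include <math.h>

/* Returns 1 if the width values measured at the change positions vary
 * linearly (constant first difference within a tolerance), else 0. */
int is_linear(const double *width, int n, double tol)
{
    if (n < 3)
        return 1; /* two points are trivially linear */
    double d0 = width[1] - width[0];
    for (int i = 2; i < n; i++) {
        double d = width[i] - width[i - 1];
        if (fabs(d - d0) > tol)
            return 0;
    }
    return 1;
}
```

A constant first difference between successive width values is the discrete equivalent of a linear change; the tolerance absorbs pixel quantization noise.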
Further, XYZ production color is expressed as: c = X (X) + Y (Y) + Z (Z), where (X), (Y), (Z) are the primary color quantities of the XYZ color model, X, Y, Z are the tristimulus scaling coefficients, and the transformation relationship is:
[(X)]=[2.7689 1.7517 1.1302][R]
[(Y)]=[1.0000 4.5907 0.0601][G]
[(X)]=[0.0000 0.0565 5.5943][B]。
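For illustration, the per-pixel transformation above (the coefficients match the classical CIE RGB to XYZ transform) can be applied as follows; the function name rgb_to_xyz is an assumption for this sketch:

```c
/* CIE RGB -> XYZ transform using the matrix from the description. */
static const double RGB2XYZ[3][3] = {
    {2.7689, 1.7517, 1.1302},
    {1.0000, 4.5907, 0.0601},
    {0.0000, 0.0565, 5.5943},
};

void rgb_to_xyz(double r, double g, double b, double *x, double *y, double *z)
{
    *x = RGB2XYZ[0][0] * r + RGB2XYZ[0][1] * g + RGB2XYZ[0][2] * b;
    *y = RGB2XYZ[1][0] * r + RGB2XYZ[1][1] * g + RGB2XYZ[1][2] * b;
    *z = RGB2XYZ[2][0] * r + RGB2XYZ[2][1] * g + RGB2XYZ[2][2] * b;
}
```

Because each row of this matrix sums to 5.6508, equal-energy white (R = G = B = 1) maps to X = Y = Z, which is a quick sanity check for an implementation.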
Furthermore, the game environment perception module includes dynamic scene recognition, which processes the shadow and light color difference data of the game character currently moving through the scene, specifically:
if the plurality of width values change linearly, an initial linear function of the width values, f(x) = k2*x + b0, is computed, and the linear function of the width value at the next instant t is f_t(x) = k2*x + b0 + k1*t, where k2 is the slope of the initial linear function, k1 is the coefficient of variation, t is the game time, and b0 is the offset of the initial linear function.
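The width-value model above amounts to a single affine evaluation; a minimal sketch follows (the function name and any parameter values are hypothetical):

```c
/* Shadow-width model from the description: an initial linear profile
 * f(x) = k2*x + b0, shifted over game time t by a variation
 * coefficient k1, giving f_t(x) = k2*x + b0 + k1*t. */
double shadow_width_at(double x, double t, double k2, double k1, double b0)
{
    return k2 * x + b0 + k1 * t;
}
```

At t = 0 this reduces to the initial linear function, so the same routine serves both the static estimate and the dynamic update.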
Still further, the dynamic scene recognition includes a color difference rendering module that marks each pixel covered by a distinguishing scene in the exposed game image as 1, a distinguishing scene marking module that marks each pixel outside the distinguishing scene as 0, and a distinguishing scene drawing module that re-renders the regions of the exposed game image whose pixels are marked 0.
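A minimal sketch of the 0/1 masking scheme described above; the fixed 4x4 dimensions and the helper names are assumptions for illustration:

```c
#define W 4
#define H 4

/* Mark pixels: 1 if covered by the distinguishing scene, 0 otherwise. */
void build_mask(const int scene[H][W], int mask[H][W])
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            mask[y][x] = scene[y][x] ? 1 : 0;
}

/* Count the pixels marked 0, i.e. the area the drawing module re-renders. */
int rerender_count(const int mask[H][W])
{
    int n = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (mask[y][x] == 0)
                n++;
    return n;
}
```

In a real renderer the mask would be a full-resolution buffer and the re-render pass would redraw only the 0-marked regions rather than count them.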
In the above embodiment, the system automatically calculates, processes and optimizes scene parameters, namely the color and shadow of the character model in both dynamic and static scenes, and automatically optimizes character input in those scenes. Image quality and the legibility of character input are maintained during play, and the gaming experience is improved.
As a further embodiment of the present invention, the color difference rendering module uses a graphics initialization function to set the display to VGA high-resolution graphics mode:
#include <stdio.h>
#include <conio.h>
#include <graphics.h>
void main()
{
    int graphdriver = VGA;
    int graphmode = VGAHI;
    initgraph(&graphdriver, &graphmode, "\\TC");
    bar3d(200, 200, 400, 400, 50, 1);
    getch();
    closegraph();
}
The display can also be detected with void far detectgraph(int far *graphdriver, int far *graphmode), which automatically detects the display adapter and completes its configuration.
As a further embodiment provided by the present invention, the color difference rendering module can set the fill mode of the floodfill function using the fill-style function, whose prototype is void far setfillstyle(int pattern, int color);
As a further embodiment of the present invention, the dynamic scene recognition further comprises character output in scene mode; in graphics mode a dedicated character output function is preferably used:
1. a character output function, void far outtext(char *str); which outputs at the current position;
2. a function that sets the output font, size and direction, void far settextstyle(int font, int direction, int charsize); the fonts are defined in graphics.h, the default being the hardware-determined 8x8 dot-matrix font; direction 0 denotes horizontal output and direction 1 denotes vertical output; charsize takes values from 0 to 10;
3. a character output positioning function, void far outtextxy(int x, int y, char *str);
as will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principle and the implementation mode of the invention are explained by applying specific embodiments in the invention, and the description of the embodiments is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed, and in summary, the content of the present specification should not be construed as a limitation to the present invention.
An embodiment of the present application further provides a specific implementation manner of an electronic device, which is capable of implementing all steps in the method in the foregoing embodiment, where the electronic device specifically includes the following contents:
a processor (processor), a memory (memory), a communication Interface (Communications Interface), and a bus;
the processor, the memory and the communication interface complete mutual communication through the bus;
the processor is configured to call the computer program in the memory, and when the processor executes the computer program, the processor implements all the steps of the method in the above embodiments.
Embodiments of the present application further provide a computer-readable storage medium capable of implementing all the steps of the method in the above embodiments, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements all the steps of the method in the above embodiments.
All the embodiments in this specification are described in a progressive manner; identical or similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the others. In particular, the hardware-plus-program embodiments are substantially similar to the method embodiments, so their description is brief; for relevant points, refer to the description of the method embodiments.

Although the embodiments or flowcharts describe method steps, more or fewer steps may be included based on conventional or non-inventive means. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only one. An actual apparatus or end product may execute the steps sequentially or in parallel (for example on parallel processors, in multi-threaded environments, or even in distributed data processing environments) according to the methods shown in the embodiments or figures.

The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, the presence of additional identical or equivalent elements in such a process, method, article, or apparatus is not excluded.

For convenience of description, the above devices are described as divided into various modules by function.
Of course, in implementing the embodiments of the present description, the functions of each module may be implemented in one or more pieces of software and/or hardware, or a module implementing one function may be implemented by a combination of multiple sub-modules or sub-units. The above-described apparatus embodiments are merely illustrative; for example, the division into units is only a logical division, and other divisions are possible in practice: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices or units, and may be electrical, mechanical or other forms.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. All the embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment. In the description of the specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the embodiments of the specification.
In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent. The above description is only an example of the embodiments of the present disclosure, and is not intended to limit the embodiments of the present disclosure. Various modifications and alterations to the embodiments described herein will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the embodiments of the present specification should be included in the scope of the claims of the embodiments of the present specification.

Claims (10)

1. An internet game image processing system, comprising:
the game environment perception module is used for calculating data according to the simulation time of a game scene, the simulation light effect and the weather;
the model generation module is used for automatically calculating and repairing images, exposure and colors according to the movement of the generated game character model in the scene;
and the image processing module is used for acquiring simulation data information of the movement of the game character model in the scene and restoring the presented game three-dimensional image to the optimal image quality by combining the calculation amount of the game environment perception module.
2. The system of claim 1, wherein the game environment perception module is further configured to identify a static scene to determine whether it contains an illumination angle, an illumination intensity and cast shadows, specifically:
determining a plurality of boundary lines contained in the static scene, taking each pair of adjacent boundary lines as a candidate region, and acquiring the XYZ value of every pixel in each candidate region; if the XYZ values of a candidate region change, extracting the positions of those changes; traversing all candidate regions to determine the change position of each; and if the width values corresponding to the change positions of all candidate regions vary linearly, determining that a sun shadow is present, otherwise determining that it is not.
3. The system of claim 2, wherein the color produced by the XYZ model is expressed as C = X(X) + Y(Y) + Z(Z), where (X), (Y), (Z) are the primary color quantities of the XYZ color model and X, Y, Z are the tristimulus scaling coefficients, and the transformation relationship is:
[(X)]   [2.7689 1.7517 1.1302] [R]
[(Y)] = [1.0000 4.5907 0.0601] [G]
[(Z)]   [0.0000 0.0565 5.5943] [B]
4. The system of claim 1, wherein the game environment perception module comprises dynamic scene recognition for processing the shadow and light color difference data of the game character currently moving through the scene, specifically:
if the plurality of width values change linearly, computing an initial linear function of the width values, f(x) = k2*x + b0, and determining the linear function of the width value at the next instant t as f_t(x) = k2*x + b0 + k1*t, where k2 is the slope of the initial linear function, k1 is the coefficient of variation, t is the game time, and b0 is the offset of the initial linear function.
5. The system of claim 4, wherein the dynamic scene recognition comprises a color difference rendering module that marks each pixel covered by the distinguishing scene in the exposed game image as 1, a distinguishing scene marking module that marks each pixel outside the distinguishing scene as 0, and a distinguishing scene drawing module that re-renders the regions of the exposed game image whose pixels are marked 0.
6. The system of claim 5, wherein the color difference rendering module uses the graphics initialization function to set the display to the VGA high-resolution graphics mode:
#include <stdio.h>
#include <conio.h>
#include <graphics.h>
void main()
{
    int graphdriver = VGA;
    int graphmode = VGAHI;
    initgraph(&graphdriver, &graphmode, "\\TC");
    bar3d(200, 200, 400, 400, 50, 1);
    getch();
    closegraph();
}
The display detection function, i.e. void far detectgraph(int far *graphdriver, int far *graphmode), automatically detects the display and completes the display configuration.
7. The system of claim 5, wherein the color difference rendering module can use the fill pattern function to set the fill mode of the floodfill() function; the function prototype is void far setfillstyle(int pattern, int color);
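For context on the pattern argument: in the BGI library a user-defined fill pattern (consumed by setfillpattern(), a companion of setfillstyle()) is 8 bytes, one per row of an 8 x 8 tile, with bit 7 as the leftmost pixel. A hypothetical helper for reading one pixel of such a pattern, not taken from the patent:

```c
/* Reads pixel (x, y) of an 8x8 BGI-style fill pattern. Each of the
 * 8 bytes encodes one row; bit 7 is the leftmost pixel. Coordinates
 * wrap, since the pattern tiles the filled area. */
int pattern_bit(const unsigned char pattern[8], int x, int y)
{
    return (pattern[y & 7] >> (7 - (x & 7))) & 1;
}
```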
8. An internet game image processing system according to claim 5, wherein the dynamic scene recognition module further comprises character output in graphics mode, preferably using the dedicated character output functions of graphics mode:
1. the character output function, i.e. void far outtext(char far *str), which outputs the string at the current position;
2. the function for setting the output font, size and direction, i.e. void far settextstyle(int font, int direction, int charsize);
the fonts are defined in graphics.h; the default is the 8 x 8 dot-matrix font determined by the hardware; direction takes the value 0 or 1, where 0 denotes horizontal output and 1 denotes vertical output;
3. the character output positioning function, i.e. void far outtextxy(int x, int y, char far *str);
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the internet game image processing system of any one of claims 1 to 8 when executing the program.
10. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the internet game image processing system according to any one of claims 1 to 8.
CN202210666322.4A 2022-06-13 2022-06-13 Internet game image processing system Pending CN115382203A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210666322.4A CN115382203A (en) 2022-06-13 2022-06-13 Internet game image processing system


Publications (1)

Publication Number Publication Date
CN115382203A true CN115382203A (en) 2022-11-25

Family

ID=84116246



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination