CN113893533B - Display control method and device in game - Google Patents


Info

Publication number
CN113893533B
CN113893533B (application CN202111164019.6A)
Authority
CN
China
Prior art keywords
color information
virtual scene
time point
target
sampling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111164019.6A
Other languages
Chinese (zh)
Other versions
CN113893533A (en)
Inventor
陈彦宏
成桀桑
李冰
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111164019.6A
Publication of CN113893533A
Application granted
Publication of CN113893533B

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention provides a display control method and device in a game. The method includes: acquiring a current time point during a game session; looking up original color information of a target virtual scene corresponding to the current time point; determining target color information of the target virtual scene according to the original color information and a virtual scene map corresponding to a preset time point; and rendering the target virtual scene according to the target color information to obtain a target virtual scene adapted to the current time point. Because the color feature data for each time point can be extracted offline in advance, the target color information for the current time point can be restored directly from the stored data while the game runs, which greatly reduces runtime computation and resource overhead and yields good results and performance across a wide range of device platforms.

Description

Display control method and device in game
Technical Field
The present invention relates to the field of game technologies, and in particular, to a method and an apparatus for controlling display in a game.
Background
A 24-hour day-night cycle of environmental effects is now standard in mainstream games: the colors of the sky and surrounding environment change adaptively with time, producing different rendering effects in the game picture. Changing the day-night colors of the sky and ambient lighting can already be handled by existing techniques, but correctly reflecting those environmental changes on rendered objects within the scene remains a difficulty that game developers are still working to solve. A rendered object must obtain information about its surroundings through an environment map (CubeMap) in order to perform lighting calculations and achieve the corresponding rendering effect. Very few schemes adapt the environment map across a 24-hour day-night cycle, and their results are poor. A common technique designates one or two pre-generated environment maps from which rendered objects obtain environmental information and a corresponding ambient lighting effect; but with only one or two maps as data sources, such a scheme cannot accurately express the environmental information at every time point across the 24 hours of a day.
In the prior art, in order to adapt different environment maps to different time points and achieve a well-blended rendering effect, some console-platform games capture the environment in the scene at regular intervals in real time while running, generating the corresponding environment maps and thereby adapting to environmental changes in real time.
At present there is no suitable real-time adaptation scheme for mobile games. Most schemes prepare different environment maps in advance for different locations rather than generating them by real-time capture, so the adaptation effect is not ideal; and when many points in the scene require adaptation, the maps occupy considerable space in the game package and in memory, while mobile platforms in particular have very strict requirements on power consumption, memory, and computational overhead. Rendered objects in a mobile game need to be rendered from an environment map in order to show the influence of the surroundings on the object, and to adapt to 24-hour day-night environmental changes the prior art mainly uses the following approaches:
First: a single environment map is specified in advance and a uniform color transformation is applied to it as time and the day-night cycle change; the additional load is minimal. However, because only one environment map is used, the map cannot continuously and accurately reflect the surroundings in all directions as time, the environment, and the scene change. Rendered objects fail to blend with their surroundings and the game picture may even jump, so the real-time adaptation effect is poor.
Second: several environment maps are specified in advance and one is selected according to time and the day-night cycle. However, selecting different maps as time changes causes the game picture to jump at each switch, and when many locations in the scene require adaptation, the multiple maps significantly increase runtime memory usage and game package size.
Third: environment maps are generated by capturing the scene in real time while the game runs, adapting to day-night, time, and environmental changes, but at a high additional load. Real-time capture does obtain up-to-date information about the surroundings, but it places heavy demands on the device's computing power and memory, and mobile platforms, with limited computing power and strict power-consumption limits, cannot use it normally.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
In view of the problems that real-time environment adaptation in mobile game scenes gives poor results, and that such adaptation imposes additional load that can prevent normal use, the embodiments of the present invention provide a display control method in a game and a corresponding display control apparatus that overcome, or at least partially solve, these problems.
The embodiment of the invention discloses a display control method in a game, which comprises the following steps:
acquiring a current time point in a game process;
searching original color information of a target virtual scene corresponding to the current time point;
determining target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point;
and rendering the target virtual scene according to the target color information to obtain a target virtual scene adapted to the current time point.
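The four steps above can be sketched end to end as follows. This is a minimal illustration rather than the patented implementation: the color table, the time points, and the per-channel scaling in target_color are all assumptions, since the disclosure leaves the exact formula to later sections.

```python
# Illustrative sketch of the claimed flow; data values and formula are assumptions.
COLOR_TABLE = {  # offline-extracted color data, keyed by in-game hour
    12.0: (0.9, 0.85, 0.8),   # midday: bright, warm
    20.0: (0.3, 0.25, 0.4),   # evening: dark, bluish
}

def current_time_point(game_clock_hours, interval=None):
    # Step 1: read the in-game clock, optionally snapped to a preset interval.
    if interval:
        return round(game_clock_hours / interval) * interval
    return game_clock_hours

def original_color(time_point):
    # Step 2: look up the offline-extracted color data for this time point.
    return COLOR_TABLE[time_point]

def target_color(original, reference):
    # Step 3 (simplified): derive the target color from the looked-up data
    # and the single stored reference-map color via a per-channel ratio.
    return tuple(o / r for o, r in zip(original, reference))
```

Step 4 would pass the resulting color to the renderer; that part is engine-specific and omitted here.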
Optionally, the acquiring the current time point in the game process includes:
Acquiring the current time point in the game process in real time; or,
acquiring the current time point in the game process at a preset time interval.
Optionally, before the step of obtaining the current time point in the game process, the method further includes:
acquiring a plurality of virtual scene maps acquired respectively based on a plurality of time points;
and determining original color information corresponding to the multiple virtual scene maps.
Optionally, the virtual scene map acquired at each time point includes: sub-scene maps shot by a virtual camera from a plurality of preset directions, wherein the determining original color information corresponding to the multiple virtual scene maps includes the following steps:
Respectively extracting color information of all pixel points in the sub-scene map corresponding to each direction;
determining a color mean value corresponding to the sub-scene map according to the color information of all the pixel points in the sub-scene map;
and determining the color mean value of the sub-scene map in each direction as the original color information corresponding to the virtual scene map.
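A minimal sketch of the per-face averaging described above, assuming pixels are plain (R, G, B) tuples; the face names and function names are illustrative, not part of the disclosure:

```python
def face_mean_color(pixels):
    """Mean (R, G, B) over all pixel points of one sub-scene map (one face)."""
    n = len(pixels)
    return tuple(sum(p[ch] for p in pixels) / n for ch in range(3))

def cubemap_original_color(faces):
    """faces: dict mapping a face name (e.g. 'X+') to its pixel list.
    The per-face means together form the stored original color information."""
    return {name: face_mean_color(px) for name, px in faces.items()}
```

Storing six mean colors per time point in place of six full textures is the source of the data reduction discussed later in the description.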
Optionally, the determining, according to the color information of all the pixel points, a color average value corresponding to the sub-scene map includes:
removing abnormal values of color information of all pixel points in the sub-scene map to obtain filtered color information;
and determining a color mean value corresponding to the sub-scene map according to the filtered color information.
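The disclosure does not fix which outlier-removal rule is used; one common choice, shown here as an assumption, is to drop samples beyond k standard deviations from the raw mean before averaging a channel:

```python
import statistics

def filtered_mean(values, k=2.0):
    """Average one color channel after discarding abnormal values
    (samples more than k standard deviations from the raw mean).
    The k-sigma rule is an assumption; the patent only says
    'removing abnormal values'."""
    mu = statistics.fmean(values)
    sigma = statistics.pstdev(values)
    kept = [v for v in values if abs(v - mu) <= k * sigma] or list(values)
    return statistics.fmean(kept)
```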
Optionally, after the step of determining the original color information of the plurality of virtual scene maps, the method further includes:
And performing interpolation processing on the original color information of the multiple virtual scene maps to obtain color information for an arbitrary time point.
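Interpolation between the stored time points can be sketched as per-channel linear interpolation; the patent does not name the interpolation method, so linear blending between the two nearest captured time points is an assumption:

```python
def interpolate_color(t, samples):
    """samples: list of (time_point, (R, G, B)) sorted by time_point.
    Returns color data for an arbitrary time t by blending the two
    nearest captured time points (clamped at the ends)."""
    if t <= samples[0][0]:
        return samples[0][1]
    if t >= samples[-1][0]:
        return samples[-1][1]
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple((1 - w) * a + w * b for a, b in zip(c0, c1))
```

In a full 24-hour cycle the table would wrap around midnight; the clamp-at-the-ends behavior here is a simplification.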
Optionally, the method further comprises:
And determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
Optionally, the determining, according to the original color information and the virtual scene map corresponding to the preset time point, the target color information of the target virtual scene corresponding to the current time point includes:
acquiring a sampling vector, and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene;
Calculating to obtain a color shift ratio according to the weight value and the original color information;
Sampling from the virtual scene map corresponding to the preset time point according to the sampling vector to obtain reference color information;
and calculating to obtain target color information according to the color deviation ratio and the reference color information.
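One plausible reading of these four sub-steps, sketched in Python. How the axis weights and the stored per-axis means combine into the color shift ratio is not spelled out in the claim, so the blend below is an assumption, and sampled_ref_rgb stands in for the color read from the reference map along the sampling vector:

```python
def axis_weights(v):
    # Normalized |X|, |Y|, |Z| contributions of the sampling (reflection) vector.
    ax = [abs(c) for c in v]
    s = sum(ax) or 1.0
    return [a / s for a in ax]

def color_shift_ratio(v, current_means, reference_means):
    # current_means / reference_means: per-axis mean colors (the stored
    # original color information for the current and preset time points).
    # Blend per-axis channel ratios by the axis weights -- an assumed formula.
    w = axis_weights(v)
    ratio = [0.0, 0.0, 0.0]
    for axis in range(3):
        for ch in range(3):
            ratio[ch] += w[axis] * current_means[axis][ch] / reference_means[axis][ch]
    return ratio

def target_color(v, current_means, reference_means, sampled_ref_rgb, k=1.0):
    # Final step per the claim: color offset ratio x sampled reference color
    # x preset parameter k (k tunes the display effect).
    r = color_shift_ratio(v, current_means, reference_means)
    return tuple(ri * ci * k for ri, ci in zip(r, sampled_ref_rgb))
```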
Optionally, the virtual scene map corresponding to the preset time point includes sub scene maps corresponding to a plurality of preset directions one by one, and the sampling according to the sampling vector obtains reference color information from the virtual scene map corresponding to the preset time point, including:
And determining a target map from the sub-scene maps corresponding to the preset directions according to the direction of the sampling vector, and sampling from the target map according to the magnitude of the sampling vector to obtain the reference color information.
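Choosing the target map from the sampling vector's direction amounts to picking the face whose axis dominates the vector, signed by that component's direction. A minimal sketch (tie-breaking and the subsequent within-face texture lookup are omitted):

```python
def select_face(v):
    """Return the cubemap face name ('X+', 'X-', 'Y+', ...) for sampling
    vector v = (x, y, z): the axis with the largest absolute component,
    signed by that component's direction."""
    axis = max(range(3), key=lambda i: abs(v[i]))
    return "XYZ"[axis] + ("+" if v[axis] >= 0 else "-")
```

The reference color is then read from the selected face at texture coordinates derived from the other two components; that step is standard cubemap sampling and is left out here.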
Optionally, the calculating to obtain the target color information according to the color shift ratio and the reference color information includes:
Multiplying the color offset ratio by the reference color information and a preset parameter to obtain the target color information, wherein the preset parameter is used to adjust the display effect of the target virtual scene.
The embodiment of the invention also discloses a display control device in the game, which comprises the following steps:
the current time point acquisition module is used for acquiring the current time point in the game process;
the original color information searching module is used for searching the original color information of the target virtual scene corresponding to the current time point;
The target color information determining module is used for determining target color information of a target virtual scene corresponding to the current time point according to the original color information and the virtual scene map corresponding to the preset time point;
and the rendering module is used for rendering the target virtual scene according to the target color information so as to obtain the target virtual scene adapted to the current time point.
The embodiment of the invention also discloses an electronic device, which comprises:
A processor and a storage medium storing machine-readable instructions executable by the processor, the processor executing the machine-readable instructions when the electronic device is running to perform a method according to any one of the embodiments of the invention.
The embodiment of the invention also discloses a computer readable storage medium, wherein the storage medium is stored with a computer program, and the computer program is executed by a processor to execute the method according to any one of the embodiments of the invention.
The embodiment of the invention has the following advantages:
The embodiment of the invention provides a display control method in a game in which the color feature data of virtual scenes corresponding to different time points is extracted offline in advance and stored. While the game is running, the current time point in the game's virtual world is acquired in real time or at a preset interval, and the original color information of the target virtual scene for that time point is obtained from the stored data. In addition, a virtual scene map corresponding to a preset time point is stored as a reference for the calculation, instead of storing a virtual scene map for every time point; the target color information of the target virtual scene for the current time point can be restored from the original color information together with this reference map, and the target virtual scene is then rendered according to the target color information so that a scene matched to the current time point is presented on the graphical user interface. Because the extraction of per-time-point color feature data can be processed offline in advance, the stored data is read directly at runtime to restore the target color information, greatly reducing the computation and resource overhead of running the game. The process completes automatically without extra manual work, the runtime performance cost is very small, and real-time adaptation to day-night environmental change is achieved at minimal performance expense.
Drawings
To illustrate the technical solutions of the present invention more clearly, the drawings needed in the description are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention, and other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of steps of a method for controlling display in a game according to an embodiment of the present invention;
FIG. 2A is a schematic diagram of a game scenario provided by an embodiment of the present invention;
FIG. 2B is a schematic diagram of a virtual scene map according to an embodiment of the present invention;
FIG. 2C is a schematic diagram of six faces of a virtual scene map provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a sampling vector according to an embodiment of the present invention;
FIG. 4 is a block diagram showing a display control apparatus in a game according to an embodiment of the present invention;
FIG. 5 is a block diagram of an electronic device of the present invention;
FIG. 6 is a block diagram of a computer-readable storage medium of the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order to solve the problems that real-time environment adaptation in mobile game scenes gives poor results and that the adaptation imposes additional load that can prevent normal use, the embodiment of the invention provides a display control method in a game: color feature data of the virtual scenes corresponding to different time points is extracted offline in advance and stored; while the game runs, the current time point in the game is acquired in real time or at a preset interval, and the original color information of the target virtual scene corresponding to the current time point is obtained from the stored data. In addition, a virtual scene map corresponding to a preset time point is stored as a reference for the calculation, instead of storing a virtual scene map for every time point; the target color information of the target virtual scene for the current time point can be restored from the original color information together with this reference map, and the target virtual scene is then rendered according to the target color information so that a scene matched to the current time point is presented on the graphical user interface.
Because the extraction of per-time-point color feature data can be processed offline in advance, the stored data is read directly at runtime to restore the target color information of the target virtual scene for the current time point, greatly reducing the computation and resource overhead of running the game. The process completes automatically without extra manual work, the runtime performance cost is very small, and real-time adaptation to day-night environmental change is achieved at minimal performance expense.
The display control method in the game in one embodiment of the present invention may be run on a terminal device or a server. The terminal device may be a local terminal device. When the in-game display control method runs on the server, the in-game display control method can be realized and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an alternative embodiment, various cloud applications may be run under the cloud interaction system, for example a cloud game. Taking a cloud game as an example, cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: storage and execution of the in-game display control method are completed on the cloud game server, and the client device is used only to receive and send data and to present the game picture. The client device may be a display device with data-transmission capability close to the user side, such as a mobile terminal device, a television, a computer, or a palmtop computer, while the device that actually performs the in-game display control method is the cloud game server. When playing, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to the instructions, encodes and compresses data such as the game picture, and returns it to the client device through the network, where it is finally decoded and the game picture is output.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores a game program and is used to present a game screen. The local terminal device is used for interacting with the player through the graphical user interface, namely, conventionally downloading and installing the game program through the electronic device and running. The manner in which the local terminal device provides the graphical user interface to the player may include a variety of ways, for example, may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including game visuals, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a display control method in a game according to an embodiment of the present invention may specifically include the following steps:
step 101, obtaining a current time point in the game process;
the game comprises a virtual world, wherein the virtual world comprises a game scene and game characters, a player can control the game characters to play in the virtual world, and in some games, the virtual world is combined with the real scene in the real world, so that the virtual world and the real world interact, the sense of the game of the player is improved, and different game experiences are brought to the player. Specifically, 24 hours of day-night alternation in the real world can be simulated when the game is running, an independent time system different from the real world is arranged in the game, and the virtual scene has different expression effects in the game at different time points, for example, the virtual scene is displayed to be brighter when the illumination in the daytime is sufficient, and the virtual scene is displayed to be darker when the illumination in the evening is insufficient.
In a specific implementation, the game engine may provide a data access interface through which the current time point in the game can be obtained at runtime, so that a target virtual scene adapted to the current time point is displayed in the graphical user interface. It should be noted that the current time point may be any time and is not limited to whole hours; for example, it may be 2:00, 3:00, or 3:20, which is not limited by the embodiment of the present invention.
Step 102, searching original color information of a target virtual scene corresponding to the current time point;
The original color information refers to the original color feature data of the game environment and may be a color value expressed as (R, G, B) or (R, G, B, A), where R is the red channel, G the green channel, B the blue channel, and A the transparency. Because virtual scenes displayed at different time points in the game share the same transparency, the A channel need not participate in the calculation.
In a specific implementation, color feature data of virtual scenes corresponding to different time points can be extracted offline in advance, and stored, and when the game is running, original color information of a target virtual scene corresponding to the current time point is searched for from the stored data.
Step 103, determining target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point;
The virtual scene map corresponding to the preset time point serves as the reference for rendering the virtual scene. It is captured and stored offline in advance, and during game running it is read directly so that the target color information of the target virtual scene for the current time point can be calculated and the scene rendered from it; in this way there is no need to store a virtual scene map for every time point.
Specifically, each position in the game scene is taken in turn as a sampling point and a sampling vector p(X, Y, Z) is determined for it, where the sampling vector is the reflection vector corresponding to the direction vector along which the virtual camera observes the sampling point in the virtual scene. A reference color value is obtained by sampling the virtual scene map of the preset time point along the sampling vector; at the same time, the weight values of the sampling vector along the X, Y, and Z axes are calculated, a color offset ratio for the sampling point is computed from those weights and the original color information, and the color information of the sampling point is finally restored from the color offset ratio and the reference color value. The process repeats until all positions in the game scene have been traversed, yielding the target color information of the target virtual scene corresponding to the current time point.
And step 104, rendering the target virtual scene according to the target color information to obtain a target virtual scene adapted to the current time point.
In the embodiment of the invention, after the target color information of the target virtual scene corresponding to the current time point is restored, the target virtual scene can be rendered according to that information to obtain a target virtual scene adapted to the current time point, so that the color of the displayed scene changes continuously as the game progresses, presenting the effect of the environment's color following the 24-hour day-night alternation.
In a preferred embodiment of the present invention, the step 101 may specifically include the following sub-steps:
Acquiring a current time point in a game process in real time; or, acquiring the current time point in the game process according to the preset time interval.
In the embodiment of the invention, the current time point in the game can be acquired in real time or at a preset time interval; after it is acquired, the subsequent steps 102-104 are executed to render the target virtual scene adapted to it, presenting a continuously changing environment color across the 24-hour day-night cycle. The preset time interval is a preconfigured period, for example 5 minutes, and may be set according to the display-effect requirements, which is not limited by the embodiment of the present invention.
In a preferred embodiment of the present invention, before said step 101, said method may further comprise the steps of:
Acquiring a plurality of virtual scene maps acquired respectively based on a plurality of time points; original color information of the plurality of virtual scene maps is determined.
The plurality of time points may be set in advance; for example, 24 hours may be divided into 24 time points at 1-hour intervals, or into 48 time points at half-hour intervals. In a specific implementation, the division of time points may be set according to the game-effect requirements, which is not limited by the embodiment of the present invention.
Specifically, the game scene is switched to each set time point and the virtual scene map is captured there, so that a virtual scene map is collected for every time point. After the virtual scene maps are acquired, the original color information of each may be determined.
In a preferred embodiment of the present invention, the virtual scene map acquired at each time point includes: sub-scene maps shot by a virtual camera from a plurality of preset directions, wherein determining the original color information of the plurality of virtual scene maps includes the following steps:
Respectively extracting color information of all pixel points in the sub-scene map in each direction; determining a color mean value corresponding to the sub-scene map according to the color information of all the pixel points; and determining the color mean value of the sub-scene map in each direction as the original color information corresponding to the virtual scene map.
In the embodiment of the present invention, the virtual scene map acquired at each time point includes sub-scene maps shot from a plurality of preset directions by the virtual camera, where the preset directions refer to preset sampling directions; for example, the preset directions may be the six directions of front, back, left, right, up, and down. Fig. 2A is a schematic diagram of a virtual scene, and fig. 2B is a schematic diagram of a virtual scene map obtained by sampling the virtual scene in fig. 2A, where the virtual scene map includes sub-scene maps corresponding to the six directions one by one. Fig. 2C is a schematic diagram of the six faces of the virtual scene map, including X-, X+, Y-, Y+, Z-, Z+, respectively corresponding to the six directions of front, back, left, right, up, and down.
When the original color information of the virtual scene map needs to be determined, the color information of all pixel points in the sub-scene map corresponding to each direction may be extracted respectively, the color mean value corresponding to each sub-scene map determined according to the color information of all its pixel points, and the color mean values of the sub-scene maps in each direction determined as the original color information corresponding to the virtual scene map. The number of pixel points included in a sub-scene map may be set according to the game effect, which is not limited in the embodiment of the present invention. Assuming that the sub-scene map in each direction includes 128×128 pixel points, 16384 (128×128) color sample values would be extracted per direction; by calculating the mean of the pixel colors and storing it as the color information of the sub-scene map in that direction, the stored data volume can be greatly reduced.
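The per-face averaging described above can be sketched as follows. This is a minimal illustration, assuming each sub-scene map is available as an H×W×3 RGB array; the function name is illustrative, not from the patent.

```python
import numpy as np

def face_mean_color(face_pixels):
    """Average the RGB values of one sub-scene map (one cubemap face).

    face_pixels: an (H, W, 3) array of RGB samples, e.g. 128 x 128 x 3.
    Returns a single (r, g, b) mean so that each face is stored as one
    color value instead of H * W samples.
    """
    face_pixels = np.asarray(face_pixels, dtype=np.float64)
    return face_pixels.reshape(-1, 3).mean(axis=0)

# A 128x128 face yields 16384 samples per channel, reduced to one mean value.
face = np.zeros((128, 128, 3))
face[:, :, 0] = 0.8  # red channel
mean = face_mean_color(face)
```

Storing six such means per time point replaces six 128×128 images with six color triples, which is the data reduction the embodiment relies on.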
In a preferred embodiment of the present invention, the determining, according to the color information of all the pixel points, a color average value corresponding to the sub-scene map includes:
Performing outlier rejection processing on the color information of all pixel points in the sub-scene map to obtain filtered color information; and determining a color mean value corresponding to the sub-scene map according to the filtered color information.
Specifically, in order to avoid over-bright or over-dark color points in any direction affecting the otherwise normal data samples, the sample data needs to be cleaned: outlier rejection processing is performed on the color information of all the pixel points in the sub-scene map to obtain filtered color information, and the color mean value corresponding to the sub-scene map is determined according to the filtered color information. For example, assuming that 128×128 sample data exist in one direction, outlier rejection is performed on those 128×128 sample data. As an example, the data filtering may be performed based on the 3-sigma criterion: assuming that a set of measured data contains only random errors, the standard deviation is calculated and an interval determined according to a certain probability; any error exceeding the interval is considered not a random error but a gross error, and the data containing it should be removed.
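A minimal sketch of the 3-sigma cleaning step, assuming the face's pixels are flattened into an (N, 3) array; samples whose deviation from the per-channel mean exceeds three standard deviations on any channel are discarded before averaging.

```python
import numpy as np

def three_sigma_mean(samples):
    """Mean color of one face after 3-sigma outlier rejection.

    samples: an (N, 3) array of RGB values for one sub-scene map.
    Samples deviating from the per-channel mean by more than 3 standard
    deviations on any channel are treated as gross errors and removed.
    """
    samples = np.asarray(samples, dtype=np.float64)
    mean = samples.mean(axis=0)
    std = samples.std(axis=0)
    keep = np.all(np.abs(samples - mean) <= 3 * std, axis=1)
    return samples[keep].mean(axis=0)
```

With 128×128 samples, a single over-bright pixel inflates the standard deviation only slightly, so it falls outside the 3-sigma interval and is rejected, leaving the mean of the normal samples.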
In a preferred embodiment of the present invention, after the step of determining the original color information of the plurality of virtual scene maps, the method further comprises:
And carrying out interpolation processing on the original color information of the multiple virtual scene maps to obtain the color information of any time point.
In a specific implementation, the color information at any time point between two adjacent sampled time points can be obtained by performing interpolation processing, for example linear or curve interpolation, on the original color information of the virtual scene maps corresponding to those two time points.
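The linear variant of this interpolation can be sketched as below, under the assumption of 24 hourly keyframes (the embodiment does not fix the sampling density or the interpolation curve); the color for a direction at an arbitrary time is blended from the two adjacent hours, wrapping around midnight.

```python
def interpolate_color(t, colors_by_hour):
    """Linearly interpolate per-direction color info between the two
    adjacent sampled time points, wrapping around midnight.

    t:              time in hours, e.g. 13.5
    colors_by_hour: list of 24 (r, g, b) tuples, one per sampled hour
    """
    t = t % 24.0
    i0 = int(t)
    i1 = (i0 + 1) % 24  # wrap hour 23 -> hour 0
    frac = t - i0
    c0, c1 = colors_by_hour[i0], colors_by_hour[i1]
    return tuple((1.0 - frac) * a + frac * b for a, b in zip(c0, c1))
```

A smoother result can be obtained by replacing the linear blend with a spline through the same keyframes; the wrap-around handling stays the same.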
In a preferred embodiment of the present invention, the method may further comprise the steps of:
And determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
Specifically, any one time point is determined as the preset time point, and the virtual scene map corresponding to the preset time point is determined from the plurality of photographed virtual scene maps. The virtual scene map corresponding to the preset time point can serve as the reference for rendering the virtual scene: only this one map needs to be stored, and the other time points are rendered from it during rendering. This avoids the need to store a corresponding virtual scene map for every time point, and can reduce memory consumption.
In a preferred embodiment of the present invention, the step 103 includes:
Acquiring a sampling vector, and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene; calculating to obtain a color shift ratio according to the weight value and the original color information; sampling from a virtual scene map corresponding to a preset time point according to the sampling vector to obtain reference color information; and calculating to obtain target color information according to the color deviation ratio and the reference color information.
Specifically, each position may be determined in turn from the target virtual scene as a sampling point, and a sampling vector determined based on the sampling point, where the sampling vector refers to the reflection vector corresponding to the direction vector in which the virtual camera observes the sampling point in the target virtual scene. After a sampling vector is obtained, the weight values of the sampling vector in the X, Y, and Z axis directions of the three-dimensional coordinate system are determined, the color offset ratio is determined according to the weight values and the original color information, reference color information is obtained by sampling from the virtual scene map corresponding to the preset time point according to the sampling vector, and the target color information is calculated according to the color offset ratio and the reference color information.
In a preferred embodiment of the present invention, the virtual scene map corresponding to the preset time point includes sub scene maps corresponding to a plurality of preset directions one by one, and the sampling according to the sampling vector to obtain reference color information from the virtual scene map corresponding to the preset time point includes:
And determining a target map from sub-scene maps corresponding to the preset directions according to the directions of the sampling vectors, and sampling from the target map according to the sizes of the sampling vectors to obtain reference color information.
Specifically, the target map can be determined from the sub-scene maps corresponding to the plurality of preset directions according to the sign (positive or negative) of each of the x, y, and z components of the sampling vector, and the reference color information is then obtained by sampling from the target map according to the magnitudes of the x, y, and z components of the sampling vector.
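This face selection resembles a standard cubemap lookup, which can be sketched as follows. The (u, v) derivation shown (minor components divided by the dominant magnitude) is one common convention and is an assumption here — the exact face layout and orientation are engine-specific and not fixed by the embodiment.

```python
def sample_reference_color(vec, faces):
    """Select the target sub-scene map by the dominant component of the
    sampling vector, then sample it using the remaining components.

    vec:   sampling vector (x, y, z), assumed non-zero
    faces: dict mapping 'X+', 'X-', 'Y+', 'Y-', 'Z+', 'Z-' to a
           callable face(u, v) -> (r, g, b)
    """
    x, y, z = vec
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:            # X axis dominates
        face = 'X+' if x > 0 else 'X-'
        u, v = y / ax, z / ax
    elif ay >= az:                       # Y axis dominates
        face = 'Y+' if y > 0 else 'Y-'
        u, v = x / ay, z / ay
    else:                                # Z axis dominates
        face = 'Z+' if z > 0 else 'Z-'
        u, v = x / az, y / az
    return faces[face](u, v)
```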
In a preferred embodiment of the present invention, the calculating, according to the color shift ratio and the reference color information, the target color information includes:
Multiplying the color offset ratio with reference color information and preset parameters to obtain target color information, wherein the preset parameters are used for adjusting the display effect of the target virtual scene.
The preset parameters are preset coefficients, and can be set according to the game display effect, which is not limited by the embodiment of the invention. In the embodiment of the invention, when calculating the target color information, the color shift ratio can be multiplied by the reference color information and the preset parameter to obtain the target color information.
The process of calculating the target color information described above will be described in detail as follows:
A position is determined in turn from the target virtual scene as a sampling point, and the sampling vector is obtained based on the reflection direction of the observation direction of the sampling point (the direction from the virtual camera to the sampling point), recorded as CaptureVector(x, y, z), as shown in fig. 3. According to the sign and magnitude of each of the x, y, z components of the sampling vector, reference color information C(r, g, b) can be obtained by sampling the sub-scene maps (X+, X-, Y+, Y-, Z+, Z-) of the virtual scene map corresponding to the preset time point; then, according to the current time point, the color information values Di(r, g, b) of the current six directions are obtained, where i = X+, X-, Y+, Y-, Z+, Z-.
First, the sampling vector is normalized (so that x*x + y*y + z*z = 1), expressed as NormalizedVector = normalize(CaptureVector). Then, the weight values of the sampling vector in the X, Y, and Z axis directions are calculated as the component-wise square of the normalized vector, expressed as nSquared = NormalizedVector * NormalizedVector.
The color information of each face at the current time point is expressed as follows:
ColorX = CaptureVector.x > 0 ? Dx+ : Dx-;
ColorY = CaptureVector.y > 0 ? Dy+ : Dy-;
ColorZ = CaptureVector.z > 0 ? Dz+ : Dz-.
In the calculation of the color information of each face, it is judged whether x is greater than 0: if x is greater than 0, Dx+ is taken as ColorX, and if x is less than 0, Dx- is taken as ColorX. Similarly, if y is greater than 0, Dy+ is taken as ColorY, otherwise Dy- is; and if z is greater than 0, Dz+ is taken as ColorZ, otherwise Dz- is.
Further, the color shift ratio TintColor is calculated by the following formula:
TintColor=nSquared.x*ColorX+nSquared.y*ColorY+nSquared.z*ColorZ。
The target color value is then calculated according to the color offset ratio and the reference color information: Color = C * TintColor * Scale, where Scale is a preset coefficient used for additional effect adjustment.
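The whole computation above can be put together in one sketch: normalize the sampling vector, square it component-wise to get the per-axis weights (nSquared), select the per-axis time-of-day colors Di by component sign, blend them into TintColor, and multiply component-wise with the reference color and Scale. The function and argument names are illustrative.

```python
import math

def compute_target_color(capture_vector, ref_color, D, scale=1.0):
    """Sketch of Color = C * TintColor * Scale, per the formulas above.

    capture_vector: sampling vector CaptureVector(x, y, z), non-zero
    ref_color:      reference color C(r, g, b) sampled from the map
                    corresponding to the preset time point
    D:              per-direction colors for the current time point,
                    keyed 'X+', 'X-', 'Y+', 'Y-', 'Z+', 'Z-'
    scale:          preset coefficient for additional effect adjustment
    """
    x, y, z = capture_vector
    length = math.sqrt(x * x + y * y + z * z)
    nx, ny, nz = x / length, y / length, z / length
    wx, wy, wz = nx * nx, ny * ny, nz * nz          # nSquared, sums to 1
    color_x = D['X+'] if x > 0 else D['X-']         # sign-based selection
    color_y = D['Y+'] if y > 0 else D['Y-']
    color_z = D['Z+'] if z > 0 else D['Z-']
    tint = tuple(wx * cx + wy * cy + wz * cz        # TintColor
                 for cx, cy, cz in zip(color_x, color_y, color_z))
    return tuple(c * t * scale for c, t in zip(ref_color, tint))
```

Because the weights sum to 1, a sampling vector aligned with one axis reproduces that axis's time-of-day color exactly, and intermediate directions blend the adjacent face colors smoothly.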
In the embodiment of the invention, the current time point in the game process is acquired, the original color information of the target virtual scene corresponding to the current time point is searched, the target color information of the target virtual scene corresponding to the current time point is determined according to the original color information and the virtual scene map corresponding to the preset time point, and the target virtual scene is rendered according to the target color information to obtain a target virtual scene adapted to the current time point. Therefore, when the game engine runs, only a single preset environmental illumination map (the virtual scene map corresponding to the preset time point) needs to be tinted differently in cooperation with a pre-generated parameterized curve, and a rendered object can reproduce an ambient lighting effect similar to 24-hour day-and-night illumination from the virtual scene map tinted at different time points, so that the ambient lighting information at each moment is expressed accurately and continuously. This processing ensures that the tinting of the virtual scene map transitions continuously without abnormal jumps, greatly saves runtime computation and memory occupation, and accurately feeds back the low-frequency environmental color information in the plurality of preset directions (such as front, back, left, right, up, and down).
Because most of the steps require only offline processing, the amount of computation and resource cost is greatly reduced: at runtime, the environment information of the current time point can be restored continuously and accurately based only on the designated standard environment map and the offline-generated original color information of the target virtual scene corresponding to each time point. The computation and memory overhead is small, the best effect is achieved with minimal extra load, good performance and visual quality are obtained on various device platforms, and the process can be completed automatically without extra manual participation.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to fig. 4, a block diagram of a display control device in a game according to an embodiment of the present invention is shown, which may specifically include the following modules:
a current time point obtaining module 401, configured to obtain a current time point in a game process;
An original color information searching module 402, configured to search original color information of a target virtual scene corresponding to the current time point;
A target color information determining module 403, configured to determine target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point;
And the rendering module 404 is configured to render the target virtual scene according to the target color information, so as to obtain a target virtual scene adapted to the current time point.
In a preferred embodiment of the present invention, the current time point obtaining module 401 includes:
The first current time point acquisition sub-module is used for acquiring the current time point in the game process in real time; or,
The second current time point obtaining sub-module is used for obtaining the current time point in the game process according to the preset time interval.
In a preferred embodiment of the invention, the device further comprises:
The environment map acquisition module is used for acquiring a plurality of virtual scene maps acquired based on a plurality of time points respectively;
And the color information determining module is used for determining the original color information of the multiple virtual scene maps.
In a preferred embodiment of the present invention, the virtual scene map acquired at each time point includes: the sub-scene map shot from a plurality of preset directions through the virtual camera, and the color information determining module comprises:
The color information extraction sub-module is used for respectively extracting the color information of all pixel points in the sub-scene map corresponding to each direction;
The color mean value calculation sub-module is used for determining the color mean value corresponding to the sub-scene map according to the color information of all the pixel points;
And the color information determining submodule is used for determining the color average value of the sub-scene map in each direction as the original color information corresponding to the virtual scene map.
In a preferred embodiment of the present invention, the color mean calculation sub-module includes:
the data filtering unit is used for carrying out outlier rejection processing on the color information of all the pixel points in the sub-scene map so as to obtain filtered color information;
And the color mean value calculation unit is used for determining the color mean value corresponding to the sub-scene map according to the filtered color information.
In a preferred embodiment of the present invention, further comprising:
And the interpolation processing sub-module is used for carrying out interpolation processing on the original color information of the multiple virtual scene maps to obtain the color information of any time point.
In a preferred embodiment of the present invention, further comprising:
And the mapping determining module is used for determining virtual scene mapping corresponding to a preset time point from the plurality of virtual scene mapping.
In a preferred embodiment of the present invention, the target color information calculating module 403 includes:
the weight value calculation sub-module is used for obtaining a sampling vector and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene;
The color deviation ratio calculating sub-module is used for calculating a color deviation ratio according to the weight value and the original color information;
the reference color information sampling sub-module is used for sampling from the virtual scene map corresponding to the preset time point according to the sampling vector to obtain reference color information;
and the target color information calculation sub-module is used for calculating and obtaining target color information according to the color deviation ratio and the reference color information.
In a preferred embodiment of the present invention, the virtual scene map corresponding to the preset time point includes sub scene maps corresponding to a plurality of preset directions one by one, and the reference color information sampling submodule includes:
And the reference color information sampling unit is used for determining a target mapping from the sub-scene mapping corresponding to the preset directions according to the directions of the sampling vectors, and sampling the target mapping according to the sizes of the sampling vectors to obtain the reference color information.
In a preferred embodiment of the present invention, the target color information calculation sub-module includes:
And the target color information calculating unit is used for multiplying the color offset ratio with the reference color information and a preset parameter to obtain target color information, wherein the preset parameter is used for adjusting the display effect of the target virtual scene.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
The embodiment of the invention also provides an electronic device, as shown in fig. 5, including:
A processor 501 and a storage medium 502, said storage medium 502 storing machine readable instructions executable by said processor 501, said processor 501 executing said machine readable instructions to perform a method according to any one of the embodiments of the present invention when the electronic device is running. The specific implementation manner and the technical effect are similar, and are not repeated here.
An embodiment of the present invention further provides a computer readable storage medium, as shown in fig. 6, on which a computer program 601 is stored, the computer program 601 performing the method according to any one of the embodiments of the present invention when being executed by a processor. The specific implementation manner and the technical effect are similar, and are not repeated here.
In this specification, each embodiment is described in a progressive manner, each embodiment focusing on its differences from the others; for identical and similar parts between the embodiments, reference may be made to one another.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or terminal device that comprises the element.
The display control method in a game and the display control device in a game provided by the present invention have been described in detail above. Specific examples are applied herein to illustrate the principles and embodiments of the present invention, and the above description of the examples is only intended to help understand the method and core idea of the present invention. Meanwhile, those skilled in the art may make variations to the specific embodiments and application scope according to the ideas of the present invention; in view of the above, the content of this description should not be construed as limiting the present invention.

Claims (13)

1. A display control method in a game, comprising:
acquiring a current time point in a game process;
Searching original color information of a target virtual scene corresponding to the current time point from color feature data of the virtual scene corresponding to different time points; wherein the color characteristic data are pre-extracted and stored;
Determining target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point; the target color information is determined based on a color offset ratio and a reference color value, wherein the color offset ratio is calculated based on the original color information and weight values of sampling vectors in different directions, the reference color value is obtained by sampling a virtual scene map corresponding to the preset time point based on the sampling vector, and the sampling vector is a reflection vector corresponding to a direction vector of a virtual camera for observing the target virtual scene;
and rendering the target virtual scene according to the target color information to obtain a target virtual scene adapted to the current time point.
2. The method of claim 1, wherein the obtaining the current point in time in the game session comprises:
Acquiring a current time point in a game process in real time; or,
And acquiring the current time point in the game process according to the preset time interval.
3. The method of claim 1, further comprising, prior to the step of obtaining a current point in time in the gaming session:
acquiring a plurality of virtual scene maps acquired respectively based on a plurality of time points;
original color information of the plurality of virtual scene maps is determined.
4. The method of claim 3, wherein the virtual scene map collected at each time point comprises: sub-scene maps shot from a plurality of preset directions through a virtual camera, wherein the determining of the original color information of the plurality of virtual scene maps comprises the following steps:
Respectively extracting color information of all pixel points in the sub-scene map corresponding to each direction;
determining a color mean value corresponding to the sub-scene map according to the color information of all the pixel points;
And determining the color mean value of the sub-scene map in each direction as the original color information corresponding to the virtual scene map.
5. The method of claim 4, wherein the determining the color mean corresponding to the sub-scene map according to the color information of all the pixels comprises:
Performing outlier rejection processing on the color information of all pixel points in the sub-scene map to obtain filtered color information;
and determining a color mean value corresponding to the sub-scene map according to the filtered color information.
6. The method of claim 3, further comprising, after the step of determining the original color information for the plurality of virtual scene maps:
And carrying out interpolation processing on the original color information of the multiple virtual scene maps to obtain the color information of any time point.
7. A method according to claim 3, further comprising:
And determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
8. The method according to claim 1, wherein the determining the target color information of the target virtual scene corresponding to the current time point according to the original color information and the virtual scene map corresponding to the preset time point includes:
acquiring a sampling vector, and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene;
Calculating to obtain a color shift ratio according to the weight value and the original color information;
Sampling from the virtual scene map corresponding to the preset time point according to the sampling vector to obtain reference color information;
and calculating to obtain target color information according to the color deviation ratio and the reference color information.
9. The method according to claim 8, wherein the virtual scene map corresponding to the preset time point includes sub-scene maps corresponding to a plurality of preset directions one by one, and the sampling from the virtual scene map corresponding to the preset time point according to the sampling vector to obtain the reference color information includes:
And determining a target map from sub-scene maps corresponding to the preset directions according to the directions of the sampling vectors, and sampling the target map according to the sizes of the sampling vectors to obtain the reference color information.
10. The method of claim 8, wherein the calculating the target color information based on the color shift ratio and the reference color information comprises:
And multiplying the color offset ratio with reference color information and preset parameters to obtain the target color information, wherein the preset parameters are used for adjusting the display effect of the target virtual scene.
11. A display control apparatus in a game, comprising:
a current time point acquisition module, configured to acquire a current time point during the game;
an original color information searching module, configured to search, from color characteristic data of the virtual scene corresponding to different time points, original color information of a target virtual scene corresponding to the current time point, wherein the color characteristic data are extracted and stored in advance;
a target color information determining module, configured to determine target color information of the target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point, wherein the target color information is determined based on a color offset ratio and a reference color value, the color offset ratio is calculated based on the original color information and weight values of sampling vectors in different directions, the reference color value is obtained by sampling the virtual scene map corresponding to the preset time point based on the sampling vector, and the sampling vector is a reflection vector corresponding to a direction vector along which a virtual camera observes the target virtual scene; and
a rendering module, configured to render the target virtual scene according to the target color information, so as to obtain a target virtual scene adapted to the current time point.
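Claim 11 states that the sampling vector is the reflection vector of the camera's viewing direction. This is the standard mirror-reflection formula R = D - 2(D·N)N about a surface normal N; the sketch below shows that formula only, with assumed function and parameter names, and does not claim to be the patent's implementation.

```python
import numpy as np

def reflect(view_dir, normal):
    """Reflect the camera's viewing direction D about a unit normal N:
    R = D - 2 (D . N) N."""
    d = np.asarray(view_dir, dtype=np.float64)
    n = np.asarray(normal, dtype=np.float64)
    return d - 2.0 * np.dot(d, n) * n
```

For example, a view direction of `[1, -1, 0]` reflected about the up vector `[0, 1, 0]` yields `[1, 1, 0]`, the familiar mirror bounce off a horizontal surface.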
12. An electronic device, comprising:
a processor; and a storage medium storing machine-readable instructions executable by the processor, wherein when the electronic device runs, the processor executes the machine-readable instructions to perform the method according to any one of claims 1-10.
13. A computer-readable storage medium, wherein the storage medium stores a computer program which, when executed by a processor, performs the method according to any one of claims 1-10.
CN202111164019.6A 2021-09-30 2021-09-30 Display control method and device in game Active CN113893533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111164019.6A CN113893533B (en) 2021-09-30 2021-09-30 Display control method and device in game


Publications (2)

Publication Number Publication Date
CN113893533A CN113893533A (en) 2022-01-07
CN113893533B true CN113893533B (en) 2024-07-09

Family

ID=79190085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111164019.6A Active CN113893533B (en) 2021-09-30 2021-09-30 Display control method and device in game

Country Status (1)

Country Link
CN (1) CN113893533B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106355637A (en) * 2016-08-30 2017-01-25 北京像素软件科技股份有限公司 Game scene environment rendering method
CN108038897A (en) * 2017-12-06 2018-05-15 北京像素软件科技股份有限公司 Shadow map generation method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7285047B2 (en) * 2003-10-17 2007-10-23 Hewlett-Packard Development Company, L.P. Method and system for real-time rendering within a gaming environment
CN108579082A (en) * 2018-04-27 2018-09-28 网易(杭州)网络有限公司 Method, apparatus and terminal for displaying shadows in a game
CN111467805B (en) * 2020-05-11 2023-04-07 网易(杭州)网络有限公司 Method and device for realizing dynamic change of virtual scene, medium and electronic equipment
CN112169324A (en) * 2020-09-22 2021-01-05 完美世界(北京)软件科技发展有限公司 Rendering method, device and equipment of game scene
CN112402974B (en) * 2020-11-23 2024-09-06 成都完美时空网络技术有限公司 Game scene display method and device, storage medium and electronic equipment
CN113368496B (en) * 2021-05-14 2023-08-01 广州三七互娱科技有限公司 Weather rendering method and device for game scene and electronic equipment



Similar Documents

Publication Publication Date Title
CN112348969B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
CN109413480B (en) Picture processing method, device, terminal and storage medium
US9164723B2 (en) Virtual lens-rendering for augmented reality lens
CN108830923B (en) Image rendering method and device and storage medium
JP2024523865A (en) Screensaver interaction method, device, electronic device, and storage medium
CN111667420B (en) Image processing method and device
CN111144491B (en) Image processing method, device and electronic system
JP2022515798A (en) Lighting rendering methods, equipment, electronic equipment and computer programs
CN112221131B (en) Visual angle switching method and device and computer readable storage medium
CN112652046B (en) Game picture generation method, device, equipment and storage medium
CN108876935A (en) A kind of method and device quickly carrying out house threedimensional model splicing in mobile terminal
CN112446939A (en) Three-dimensional model dynamic rendering method and device, electronic equipment and storage medium
CN108063915A (en) A kind of image-pickup method and system
CN108230434B (en) Image texture processing method and device, storage medium and electronic device
CN113554726A (en) Image reconstruction method and device based on pulse array, storage medium and terminal
CN111696034A (en) Image processing method and device and electronic equipment
CN111583378A (en) Virtual asset processing method and device, electronic equipment and storage medium
CN109218817A (en) A kind of method and apparatus showing virtual present prompting message
CN113893533B (en) Display control method and device in game
CN113516751B (en) Method and device for displaying cloud in game and electronic terminal
CN111231826B (en) Control method, device and system for vehicle model steering lamp in panoramic image and storage medium
CN111292234A (en) Panoramic image generation method and device
CN115228083A (en) Resource rendering method and device
CN109379577B (en) Video generation method, device and equipment of virtual viewpoint
CN113222178A (en) Model training method, user interface generation method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant