CN113893533A - Display control method and device in game - Google Patents

Display control method and device in game

Info

Publication number
CN113893533A
CN113893533A (application CN202111164019.6A)
Authority
CN
China
Prior art keywords
color information
virtual scene
time point
target
game
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111164019.6A
Other languages
Chinese (zh)
Other versions
CN113893533B (en)
Inventor
陈彦宏
成桀桑
李冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111164019.6A priority Critical patent/CN113893533B/en
Priority claimed from CN202111164019.6A external-priority patent/CN113893533B/en
Publication of CN113893533A publication Critical patent/CN113893533A/en
Application granted granted Critical
Publication of CN113893533B publication Critical patent/CN113893533B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/66: Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention provides a display control method and a display control device in a game. The method includes: acquiring a current time point in a game process; searching for the original color information of the target virtual scene corresponding to the current time point; determining the target color information of the target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point; and rendering the target virtual scene according to the target color information to obtain a target virtual scene adapted to the current time point. Because the color feature data for each time point can be extracted offline in advance, the game only needs to read the stored data at runtime to restore the target color information for the current time point. This greatly reduces the runtime computation and resource overhead and achieves good effect and performance on various device platforms.

Description

Display control method and device in game
Technical Field
The present invention relates to the field of game technology, and in particular, to a display control method and a display control device in a game.
Background
A 24-hour day-night environment cycle is now standard in mainstream games: the colors of the sky and the surrounding environment change adaptively with time, so the game picture presents a continuously varying rendering effect. The day-night color change of the sky and environment itself can be handled with existing techniques. For objects rendered within the environment, however, correctly reflecting the changing surroundings has always been a difficulty that game developers must solve, because a rendered object obtains information about its surroundings through an environment map (cube map) for lighting calculation. The common technique designates one or two pre-generated environment maps from which the rendered object acquires surrounding-environment information to achieve the corresponding ambient lighting effect. However, this solution cannot accurately express the environment information at every time point across 24 hours, because only one or two environment maps are available as data sources.
In the prior art, to adapt different environment maps to different time points and achieve a well-blended rendering effect, some console-platform games photograph the in-scene environment at regular intervals at runtime and generate corresponding environment maps, thereby adapting to environment changes in real time.
At present, there is no suitable real-time adaptation scheme for mobile games. Most schemes prepare different environment maps in advance for different scene locations rather than capturing and generating them in real time, so the adaptation effect is not ideal; and when a scene has many locations requiring adaptation, the game package and memory footprint grow large. Mobile platforms in particular impose very strict requirements on power consumption, memory, and computational overhead. Rendered objects in a mobile game must be rendered based on an environment map to show the influence of the surroundings on the object, and to adapt to the 24-hour day-night environment change, the prior art mainly uses the following approaches:
the first method comprises the following steps: an environment map is pre-assigned, uniform color conversion is carried out on the basis of the environment map according to time and day-night conversion, and the overhead of extra load is minimum. However, in this scheme, only one environment map is designated, and when time, surrounding environment continuously change and a scene changes, information of each direction of the surrounding environment cannot be continuously and accurately reflected, so that a rendered object cannot be fused with the surrounding environment, even a game picture jumps, and the effect of real-time adaptation to the environment change is poor.
The second approach: pre-designate several environment maps and select one of them according to time and day-night transitions. However, selecting a different environment map for rendering as time changes produces a visible jump in the game picture at each switch, and when a scene has many locations requiring adaptation, the multiple environment maps significantly increase the runtime memory footprint and the size of the game package.
The third approach: capture the environment map in the scene in real time while the game runs, adapting to day-night, time, and environment changes; the extra load overhead is high. Although capturing the game scene in real time obtains up-to-date surrounding-environment information, it places high demands on device computing power and memory, and a mobile platform, with limited computing power and strict power-consumption limits, cannot use it normally.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
In view of the above problems, namely that a mobile game scene adapts poorly to environment changes in real time and that adaptation causes extra load overhead resulting in abnormal use, embodiments of the present invention provide a display control method in a game and a display control apparatus in a game that overcome, or at least partially solve, these problems.
The embodiment of the invention discloses a display control method in a game, which comprises the following steps:
acquiring a current time point in a game process;
searching the original color information of the target virtual scene corresponding to the current time point;
determining target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point;
and rendering the target virtual scene according to the target color information to obtain the target virtual scene adaptive to the current time point.
Optionally, the acquiring a current time point in the game process includes:
acquiring a current time point in the game process in real time; or,
and acquiring the current time point in the game process according to a preset time interval.
Optionally, before the step of acquiring the current time point in the game process, the method further includes:
acquiring a plurality of virtual scene maps acquired respectively based on a plurality of time points;
and determining original color information corresponding to the plurality of virtual scene maps.
Optionally, the virtual scene map collected at each time point includes sub-scene maps shot by a virtual camera from a plurality of preset directions, and the determining of the original color information corresponding to the plurality of virtual scene maps includes:
respectively extracting color information of all pixel points in the sub-scene map corresponding to each direction;
determining a color mean value corresponding to the sub-scene map according to the color information of all pixel points in the sub-scene map;
and determining the color average value of the sub-scene maps in each direction as the original color information corresponding to the virtual scene map.
Optionally, the determining of the color mean value corresponding to the sub-scene map according to the color information of all the pixel points includes:
removing abnormal values from the color information of all pixel points in the sub-scene map to obtain filtered color information;
and determining the color mean value corresponding to the sub-scene map according to the filtered color information.
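The outlier filtering above is underspecified in the text. A minimal Python sketch, assuming a k-sigma criterion for what counts as an "abnormal value" (the patent does not name the criterion), might look like:

```python
from statistics import mean, pstdev

def filtered_face_mean(pixels, k=2.0):
    """Mean RGB of one sub-scene map (cube-map face) after removing outliers.

    pixels: list of (r, g, b) tuples. The k-sigma filter is an assumed
    outlier criterion; the patent only says abnormal values are removed
    before the mean is taken.
    """
    channels = list(zip(*pixels))          # [(r...), (g...), (b...)]
    result = []
    for ch in channels:
        mu, sigma = mean(ch), pstdev(ch)
        # keep values within k standard deviations; fall back to all if empty
        kept = [v for v in ch if abs(v - mu) <= k * sigma] or list(ch)
        result.append(mean(kept))
    return tuple(result)
```

With nine dark pixels and one bright outlier, the outlier is discarded and the mean stays at the dark value rather than being skewed upward.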
Optionally, after the step of determining the original color information of the plurality of virtual scene maps, the method further includes:
performing interpolation processing on the original color information of the plurality of virtual scene maps, so that color information can be obtained for any time point.
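The interpolation step can be sketched as follows. Linear interpolation between the two nearest baked time points is an assumption; the patent does not fix the interpolation scheme:

```python
def interpolate_color(t, keyframes):
    """Interpolate stored per-time-point colors to an arbitrary time t.

    keyframes: list of (hour, (r, g, b)) pairs covering the day.
    Linear interpolation between adjacent keyframes is an assumed scheme.
    """
    keyframes = sorted(keyframes)
    for (t0, c0), (t1, c1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple(a + (b - a) * w for a, b in zip(c0, c1))
    return keyframes[-1][1]  # clamp outside the sampled range
```

For a day wrapping from 23:00 back to 0:00, a real implementation would also interpolate across midnight; that wrap-around is omitted here for brevity.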
Optionally, the method further comprises:
and determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
Optionally, the determining, according to the original color information and the virtual scene map corresponding to the preset time point, the target color information of the target virtual scene corresponding to the current time point includes:
acquiring a sampling vector, and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene;
calculating a color offset ratio from the weight values and the original color information;
sampling the virtual scene map corresponding to the preset time point according to the sampling vector to obtain reference color information;
and calculating the target color information from the color offset ratio and the reference color information.
Optionally, the virtual scene map corresponding to the preset time point includes sub-scene maps corresponding one-to-one to a plurality of preset directions, and the sampling of reference color information from the virtual scene map corresponding to the preset time point according to the sampling vector includes:
determining a target map from the sub-scene maps of the plurality of preset directions according to the direction of the sampling vector, and sampling from the target map according to the magnitude of the sampling vector to obtain the reference color information.
Optionally, the calculating of the target color information according to the color offset ratio and the reference color information includes:
multiplying the color offset ratio by the reference color information and a preset parameter to obtain the target color information, wherein the preset parameter is used to adjust the display effect of the target virtual scene.
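The sub-steps above (axis weights, color offset ratio, cube-map sampling, final multiplication) can be sketched in Python. The axis-weight blend, the exact offset-ratio formula, and the fixed (0.5, 0.5) texel coordinate are assumptions; the patent states only that the weights and original color information yield a color offset ratio that scales the sampled reference color:

```python
def restore_color(sample_vec, cur_means, ref_means, ref_cubemap, param=1.0):
    """Sketch of the runtime color restoration for one sampling point.

    cur_means / ref_means: per-face mean colors {'x+': (r, g, b), ...}
    for the current time point and the stored reference time point.
    ref_cubemap: callable (face, u, v) -> (r, g, b), a stand-in for a
    texture fetch from the reference virtual scene map.
    """
    x, y, z = sample_vec
    ax, ay, az = abs(x), abs(y), abs(z)
    total = (ax + ay + az) or 1.0
    wx, wy, wz = ax / total, ay / total, az / total   # axis weight values

    def blend(means):  # blend the face means hit by each axis of the vector
        fx = means['x+'] if x >= 0 else means['x-']
        fy = means['y+'] if y >= 0 else means['y-']
        fz = means['z+'] if z >= 0 else means['z-']
        return [wx * a + wy * b + wz * c for a, b, c in zip(fx, fy, fz)]

    cur, ref = blend(cur_means), blend(ref_means)
    ratio = [c / r if r else 1.0 for c, r in zip(cur, ref)]  # color offset ratio

    # pick the sub-scene map hit by the dominant axis and sample it
    face = max((ax, 'x'), (ay, 'y'), (az, 'z'))[1]
    face += '+' if {'x': x, 'y': y, 'z': z}[face] >= 0 else '-'
    sampled = ref_cubemap(face, 0.5, 0.5)  # uv derivation elided
    return tuple(k * s * param for k, s in zip(ratio, sampled))
```

In a real renderer this logic would live in a fragment shader, with `ref_cubemap` replaced by a hardware cube-map fetch along the reflection vector.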
The embodiment of the invention also discloses a display control device in the game, which comprises:
the current time point acquisition module is used for acquiring a current time point in the game process;
the original color information searching module is used for searching the original color information of the target virtual scene corresponding to the current time point;
the target color information determining module is used for determining the target color information of the target virtual scene corresponding to the current time point according to the original color information and the virtual scene map corresponding to the preset time point;
and the rendering module is used for rendering the target virtual scene according to the target color information so as to obtain the target virtual scene adaptive to the current time point.
The embodiment of the invention also discloses an electronic device, which comprises:
a processor and a storage medium storing machine-readable instructions executable by the processor, the processor executing the machine-readable instructions to perform a method according to any one of the embodiments of the invention when the electronic device is operated.
The embodiment of the invention also discloses a computer readable storage medium, wherein a computer program is stored on the storage medium, and when the computer program is executed by a processor, the method of any one of the embodiments of the invention is executed.
The embodiment of the invention has the following advantages:
the embodiment of the invention provides a display control method in a game, which extracts color characteristic data of virtual scenes corresponding to different time points in advance in an off-line manner, stores the color characteristic data, acquires the current time point in a game virtual world in real time or according to a preset time interval in the running process of the game, and acquires original color information of a target virtual scene corresponding to the current time point from the stored data. Meanwhile, when the data are stored, the virtual scene map corresponding to the preset time point is stored as a calculation reference instead of storing the virtual scene map of each time point, the target color information of the target virtual scene corresponding to the current time point can be restored through the original color information and the virtual scene map corresponding to the preset time point, and then the target virtual scene is rendered according to the target color information, so that the target virtual scene adaptive to the current time point is presented on the graphical user interface. Because the process of extracting the color characteristic data corresponding to each time point can be offline processed in advance, the data can be directly acquired during the game running to restore the target color information of the target virtual scene corresponding to the current time point, the calculation amount and the resource expenditure during the running are greatly reduced, better effect and performance are realized on various equipment platforms, the process can be automatically completed, extra manual participation is not needed, the performance expenditure during the running is very low, and the best effect of adapting to the day and night change of the environment in real time is achieved through the most economical performance expenditure.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed to be used in the description of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a flow chart illustrating the steps of a method for controlling the display of a game according to an embodiment of the present invention;
FIG. 2A is a schematic diagram of a game scenario provided by an embodiment of the invention;
FIG. 2B is a schematic diagram of a virtual scene map according to an embodiment of the present invention;
fig. 2C is a schematic diagram of six surfaces of a virtual scene map according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a sample vector provided by an embodiment of the present invention;
fig. 4 is a block diagram of a display control apparatus in a game according to an embodiment of the present invention;
FIG. 5 is a block diagram of an electronic device of the present invention;
fig. 6 is a block diagram of a computer-readable storage medium of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
To solve the problems that a mobile game scene adapts poorly to environment changes in real time and that adaptation causes extra load overhead preventing normal use, an embodiment of the present invention provides the following display control method in a game. Color feature data of the virtual scenes corresponding to different time points is extracted offline in advance and stored. Instead of storing a virtual scene map for every time point, only the virtual scene map corresponding to a preset time point is stored as a calculation reference; the target color information of the target virtual scene corresponding to the current time point can be restored from the original color information and this reference map, and the target virtual scene is then rendered according to the target color information so that a target virtual scene adapted to the current time point is presented on the graphical user interface. Because the extraction of the color feature data for each time point can be processed offline in advance, the stored data can be read directly at runtime to restore the target color information for the current time point. This greatly reduces runtime computation and resource overhead, achieves good effect and performance on various device platforms, can be completed automatically without extra manual work, and attains the best real-time adaptation to day-night environment change at minimal performance cost.
The in-game display control method in one embodiment of the invention can be operated on a terminal device or a server. The terminal device may be a local terminal device. When the in-game display control method is executed on a server, the in-game display control method may be implemented and executed based on a cloud interactive system, where the cloud interactive system includes the server and a client device.
In an optional embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In the cloud-game running mode, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the in-game display control method are completed on the cloud game server, while the client device receives and sends data and presents the game picture. The client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer; however, the terminal device that performs the in-game display control method is the cloud game server. When playing, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the instructions, encodes and compresses data such as game pictures, and returns the data to the client device over the network, where it is finally decoded and the game picture is output.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game screen. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in various ways, for example by rendering it on the display screen of the terminal or through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes a game screen, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a display control method in a game according to an embodiment of the present invention is shown, which may specifically include the following steps:
step 101, acquiring a current time point in the game;
the game comprises a game scene and game characters in a virtual world, a player can control the game characters to play in the virtual world, and in some games, the virtual world is combined with real scenes in a real world, so that the virtual world and the real world are interacted, the sense of the game of the player is improved, and different game experiences are brought to the player. Specifically, during the running of the game, 24-hour day and night alternation in the real world can be simulated, the game is distinguished from an independent time system of the real world, and the virtual scene has different expression effects in the game at different time points, for example, the virtual scene is displayed to be brighter when the illumination is sufficient in the daytime, and the virtual scene is displayed to be dimmer when the illumination is insufficient at night.
In a specific implementation, the game engine may provide a data access interface through which the current time point in the game process is acquired at runtime, so that the target virtual scene adapted to the current time point can be displayed in the graphical user interface. It should be noted that the current time point may be any time and is not limited to a whole hour; for example, it may be 2:00, 3:00, or 3:20, which is not limited by the embodiment of the present invention.
Step 102, searching original color information of a target virtual scene corresponding to the current time point;
the original color information refers to the original color feature data of the game environment, and the original color information may be a color value and is represented as (R, G, B) or (R, G, B, a), where R represents the color of the red channel, G represents the color of the green channel, B represents the color of the blue channel, and a represents the transparency, and the value of the a channel may not participate in the calculation because the transparency of the virtual scene displayed at different time points in the game is the same.
In specific implementation, color feature data of virtual scenes corresponding to different time points can be extracted offline in advance, the color feature data are stored, and when a game runs, original color information of a target virtual scene corresponding to the current time point is searched from the stored data.
Step 103, determining target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point;
the method includes the steps that a time point is preset at a preset time point, a virtual scene map corresponding to the preset time point is used as a reference for rendering a virtual scene, the virtual scene map corresponding to the preset time point is shot offline in advance and stored, the virtual scene map corresponding to the preset time point is directly obtained during game running to calculate and obtain target color information of a target virtual scene corresponding to the current time point, and the target virtual scene is rendered according to the target color information.
Specifically, each position in the game scene is taken in turn as a sampling point, and a sampling vector p(X, Y, Z) is determined for it; the sampling vector is the reflection vector corresponding to the direction vector along which the virtual camera observes the target virtual scene at that point. A reference color value is obtained by sampling the virtual scene map corresponding to the preset time point along the sampling vector. Meanwhile, the weight values of the sampling vector in the X-, Y-, and Z-axis directions are calculated, a color offset ratio for the sampling point is computed from these weights and the original color information, and finally the color information of the sampling point is restored from the color offset ratio and the reference color value. This process is repeated until all positions in the game scene have been traversed, yielding the target color information of the target virtual scene corresponding to the current time point.
Step 104, rendering the target virtual scene according to the target color information to obtain the target virtual scene adapted to the current time point.
In the embodiment of the invention, after the target color information of the target virtual scene corresponding to the current time point is restored, the target virtual scene can be rendered according to the target color information to obtain the target virtual scene adapted to the current time point. The color of the displayed target virtual scene thus changes gradually as the game progresses, presenting the effect that the environment color changes continuously as day and night alternate over 24 hours.
In a preferred embodiment of the present invention, the step 101 may specifically include the following sub-steps:
acquiring a current time point in a game process in real time; or, acquiring the current time point in the game process according to a preset time interval.
In the embodiment of the present invention, the current time point in the game process may be acquired either in real time or at a preset time interval. After the current time point is acquired, subsequent steps 102-104 are executed to render the target virtual scene adapted to it, presenting the effect that the environment color changes continuously as day and night alternate over 24 hours. The preset time interval is a preset period, such as 5 minutes, and may be set according to the display-effect requirement; the embodiment of the present invention is not limited in this respect.
In a preferred embodiment of the present invention, before the step 101, the method may further include the steps of:
acquiring a plurality of virtual scene maps acquired respectively based on a plurality of time points; determining original color information of the plurality of virtual scene maps.
The plurality of time points may be a plurality of time points set in advance, for example, 24 hours may be divided into 24 time points with every 1 hour as a unit, and 24 hours may be divided into 48 time points with every half hour as a unit. In a specific implementation, the time point division may be set according to the game effect requirement, and the embodiment of the present invention is not limited thereto.
Specifically, the game scene is switched to each set time point and the virtual scene map is shot there, so as to collect the virtual scene map corresponding to each time point. After the virtual scene maps are collected, the original color information of each virtual scene map can be further determined.
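The offline baking loop described above might be organized as follows. `capture_face` stands in for the engine-specific call that renders one cube-map face at a given in-game hour; that call and the face naming are hypothetical, since the capture API depends on the engine:

```python
FACES = ('x+', 'x-', 'y+', 'y-', 'z+', 'z-')

def bake_time_points(capture_face, hours, faces=FACES):
    """Offline baking sketch: for each time point, shoot the six faces
    and reduce each face to its mean color.

    capture_face(hour, face) -> list of (r, g, b) pixels  (hypothetical
    engine hook). Returns {hour: {face: mean_color}}.
    """
    baked = {}
    for h in hours:
        baked[h] = {}
        for f in faces:
            pixels = capture_face(h, f)
            n = len(pixels)
            # per-channel mean of the face's pixels
            baked[h][f] = tuple(sum(c) / n for c in zip(*pixels))
    return baked
```

The resulting dictionary is the per-time-point color feature data that is stored and looked up at runtime; only the reference time point's full map needs to be kept as a texture.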
In a preferred embodiment of the present invention, the virtual scene map collected at each time point includes sub-scene maps shot by the virtual camera from a plurality of preset directions, and determining the original color information of the plurality of virtual scene maps includes:
respectively extracting color information of all pixel points in the sub-scene mapping in each direction; determining a color mean value corresponding to the sub-scene mapping according to the color information of all the pixel points; and determining the color average value of the sub-scene maps in each direction as the original color information corresponding to the virtual scene map.
In the embodiment of the present invention, the virtual scene map collected at each time point includes sub-scene maps captured by a virtual camera from a plurality of preset directions, where the preset directions refer to preset sampling directions; for example, the preset directions may be the front, back, left, right, top and bottom directions, respectively. Fig. 2A is a schematic diagram of a virtual scene, and fig. 2B is a schematic diagram of a virtual scene map obtained by sampling the virtual scene in fig. 2A, where the virtual scene map includes sub-scene maps corresponding to the six directions one to one. As shown in fig. 2C, the six faces of the virtual scene map are X-, X+, Y-, Y+, Z- and Z+, which correspond to the front, back, left, right, top and bottom directions, respectively.
When the original color information of the virtual scene map needs to be determined, the color information of all pixel points in the sub-scene map corresponding to each direction can be extracted respectively, the color mean value of each sub-scene map is determined from that color information, and the color mean values of the sub-scene maps in each direction are determined as the original color information corresponding to the virtual scene map. The number of pixel points in a sub-scene map may be set according to the game effect requirement, which is not limited in the embodiment of the present invention. Assuming the sub-scene map in each direction contains 128 × 128 pixel points, 16,384 (128 × 128) color sample values can be extracted for each direction; storing only the computed color mean value of these pixel points as the color information of that direction's sub-scene map greatly reduces the amount of stored data.
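The per-direction averaging described above can be sketched as follows. This is an illustrative example only: the function name and pixel layout are assumptions, not the embodiment's actual implementation.

```python
def face_color_mean(pixels):
    """Reduce one sub-scene map face to a single mean color.

    pixels: non-empty list of (r, g, b) tuples sampled from one face,
    e.g. all 128 x 128 texels of that face.
    """
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)
```

Run offline for each of the six faces at each captured time point, this collapses 16,384 samples per face into one stored color triple.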
In a preferred embodiment of the present invention, the determining the color mean value corresponding to the sub-scene map according to the color information of all the pixel points includes:
performing abnormal value elimination processing on the color information of all pixel points in the sub-scene map to obtain filtered color information; and determining a color mean value corresponding to the sub-scene map according to the filtered color information.
Specifically, in order to prevent over-bright or over-dark colors from skewing the otherwise normal data samples in each direction, the sample data must be cleaned: abnormal values are removed from the color information of all pixel points in the sub-scene map to obtain filtered color information, and the color mean value corresponding to the sub-scene map is determined from the filtered color information. For example, assuming there are 128 × 128 sample data in one direction, outlier rejection is performed on those 128 × 128 samples. As an example, data filtering may be performed based on the 3-sigma criterion: the standard deviation of the detected data is computed, an interval is determined at a given probability, and any error exceeding that interval is considered a gross error rather than a random one, so the data containing it is removed.
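A minimal sketch of this cleaning step, assuming a 3-sigma filter applied to scalar samples (one color channel at a time); the function name and data shape are illustrative, not taken from the embodiment:

```python
import statistics

def three_sigma_filter(values):
    """Drop samples more than 3 standard deviations from the mean (3-sigma rule)."""
    mean = statistics.fmean(values)
    sigma = statistics.pstdev(values)
    if sigma == 0:
        # All samples identical: nothing to reject.
        return list(values)
    return [v for v in values if abs(v - mean) <= 3 * sigma]
```

Applied before averaging, a single over-bright texel among thousands of normal ones no longer shifts the stored face color.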
In a preferred embodiment of the present invention, after the step of determining the original color information of the plurality of virtual scene maps, the method further includes:
and carrying out interpolation processing on the original color information of the plurality of virtual scene maps to obtain the color information of any time point.
In a specific implementation, the color information of the plurality of virtual scene maps is interpolated; for example, the color information between every two adjacent time points may be interpolated linearly or along a curve, so as to obtain color information for any time point.
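A minimal sketch of such interpolation, assuming piecewise-linear blending between adjacent time points (function names and the keyframe layout are illustrative assumptions):

```python
def color_at_time(keyframes, t):
    """Linearly interpolate a color at time t.

    keyframes: list of (time, (r, g, b)) pairs sorted by time, e.g. the
    per-face mean colors captured at the preset time points.
    """
    for (t0, c0), (t1, c1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return tuple(a + (b - a) * f for a, b in zip(c0, c1))
    # t outside the keyframe range: fall back to the last keyframe
    # (midnight wrap-around handling omitted for brevity).
    return keyframes[-1][1]
```

Curve (e.g. spline) interpolation would replace the linear blend factor `f` but keep the same lookup structure.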
In a preferred embodiment of the present invention, the method may further comprise the steps of:
and determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
Specifically, a time point is randomly determined as the preset time point, and the virtual scene map corresponding to the preset time point is determined from the plurality of captured virtual scene maps. This virtual scene map can serve as the reference for rendering the virtual scene: only the virtual scene map corresponding to the preset time point needs to be stored, and the other time points are rendered against it. This avoids storing a virtual scene map for every time point and reduces memory consumption.
In a preferred embodiment of the present invention, the step 103 includes:
acquiring a sampling vector, and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene; calculating to obtain a color offset ratio according to the weight value and the original color information; sampling from a virtual scene map corresponding to a preset time point according to the sampling vector to obtain reference color information; and calculating to obtain target color information according to the color deviation ratio and the reference color information.
Specifically, each position may be determined in turn from the target virtual scene as a sampling point, and a sampling vector is determined based on the sampling point, where the sampling vector refers to the reflection vector corresponding to the direction vector along which the virtual camera observes the sampling point in the target virtual scene. After the sampling vector is obtained, the weight values of the sampling vector in the X, Y and Z axis directions of a three-dimensional coordinate system are determined; a color offset ratio is then determined according to the weight values and the color information; reference color information is obtained by sampling from the virtual scene map corresponding to the preset time point according to the sampling vector; and the target color information is calculated from the color offset ratio and the reference color information.
In a preferred embodiment of the present invention, the virtual scene map corresponding to the preset time point includes sub-scene maps corresponding to a plurality of preset directions, and the obtaining of the reference color information by sampling from the virtual scene map corresponding to the preset time point according to the sampling vector includes:
and determining a target map from the sub-scene maps corresponding to the preset directions according to the direction of the sampling vector, and sampling from the target map according to the size of the sampling vector to obtain reference color information.
Specifically, the target map may be determined from the sub-scene maps corresponding to the plurality of preset directions according to the sign (positive or negative) of each component x, y and z of the sampling vector, and the reference color information is then obtained by sampling from the target map according to the magnitude of each component x, y and z of the sampling vector.
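In a conventional cube map, the face to sample is selected by the component of largest magnitude together with its sign. The sketch below shows that convention as one plausible realization of this step; it is an assumption for illustration, not necessarily the embodiment's exact rule:

```python
def select_face(v):
    """Pick the cube-map face hit by direction v = (x, y, z).

    Returns one of 'X+', 'X-', 'Y+', 'Y-', 'Z+', 'Z-': the axis with the
    largest absolute component, signed by that component.
    """
    x, y, z = v
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return 'X+' if x > 0 else 'X-'
    if ay >= az:
        return 'Y+' if y > 0 else 'Y-'
    return 'Z+' if z > 0 else 'Z-'
```

The remaining two components, divided by the dominant one, would then give the texture coordinates within the selected face.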
In a preferred embodiment of the present invention, the calculating the target color information according to the color shift ratio and the reference color information includes:
and multiplying the color deviation ratio with reference color information and preset parameters to obtain target color information, wherein the preset parameters are used for adjusting the display effect of the target virtual scene.
The preset parameter is a preset coefficient, and may be set according to a game display effect requirement, which is not limited in the embodiment of the present invention. In the embodiment of the present invention, when calculating the target color information, the color shift ratio may be multiplied by the reference color information and the preset parameter to obtain the target color information.
The following describes the above-described process of calculating the target color information in detail:
A position is determined in turn from the target virtual scene as a sampling point, and the sampling vector is obtained based on the reflection direction of the observation direction of the sampling point (the direction from the virtual camera to the sampling point); it is denoted CaptureVector(x, y, z). As shown in FIG. 3, reference color information C(r, g, b) can be obtained by sampling the sub-scene maps (X+, X-, Y+, Y-, Z+, Z-) of the virtual scene map corresponding to the preset time point according to the sign and magnitude of each of the x, y and z components of the sampling vector. The color information values Di(r, g, b) in the current six directions are then obtained according to the current time point, where i = X+, X-, Y+, Y-, Z+, Z-.
First, the sampling vector is normalized to unit length (so that x² + y² + z² = 1), and the result is denoted NormalizedVector. Then, the weight values of the sampling vector in the X, Y and Z axis directions are computed by squaring the normalized components, expressed as nSquared = NormalizedVector * NormalizedVector (component-wise), so that the three weight values sum to 1.
The color information of each face at the current time point is represented as follows:
ColorX = CaptureVector.x > 0 ? Dx+ : Dx-;
ColorY = CaptureVector.y > 0 ? Dy+ : Dy-;
ColorZ = CaptureVector.z > 0 ? Dz+ : Dz-.
In the calculation of the color information of each face, it is judged whether x is greater than 0: if x is greater than 0, Dx+ is taken as ColorX, and if x is less than 0, Dx- is taken as ColorX. Likewise, if y is greater than 0, Dy+ is taken as ColorY, otherwise Dy- is taken; and if z is greater than 0, Dz+ is taken as ColorZ, otherwise Dz- is taken.
Further, the color shift ratio TintColor is calculated by the following formula:
TintColor = nSquared.x * ColorX + nSquared.y * ColorY + nSquared.z * ColorZ.
The target Color value is then calculated from the color offset ratio and the reference color information: Color = C * TintColor * Scale, where Scale is a preset coefficient used to adjust the offset for other effects.
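The full calculation above — sign-based face color selection, squared-component weighting, the TintColor blend and the final Color = C * TintColor * Scale product — can be sketched in Python roughly as follows. The function name, the dictionary of per-face colors and the data layout are assumptions for illustration, not the embodiment's actual shader code:

```python
import math

def target_color(capture_vector, D, C, scale=1.0):
    """Compute the tinted target color for one sampling vector.

    capture_vector: non-zero sampling/reflection vector (x, y, z)
    D: dict mapping 'X+','X-','Y+','Y-','Z+','Z-' to the per-face (r, g, b)
       mean colors for the current time point (the Di values)
    C: reference color (r, g, b) sampled from the preset-time-point cube map
    scale: preset adjustment coefficient (Scale)
    """
    x, y, z = capture_vector
    length = math.sqrt(x * x + y * y + z * z)
    nx, ny, nz = x / length, y / length, z / length   # NormalizedVector
    wx, wy, wz = nx * nx, ny * ny, nz * nz            # nSquared; sums to 1
    color_x = D['X+'] if x > 0 else D['X-']
    color_y = D['Y+'] if y > 0 else D['Y-']
    color_z = D['Z+'] if z > 0 else D['Z-']
    # TintColor = nSquared.x*ColorX + nSquared.y*ColorY + nSquared.z*ColorZ
    tint = tuple(wx * cx + wy * cy + wz * cz
                 for cx, cy, cz in zip(color_x, color_y, color_z))
    # Color = C * TintColor * Scale, component-wise
    return tuple(c * t * scale for c, t in zip(C, tint))
```

Because the weights are the squared components of a unit vector, the tint transitions continuously as the sampling vector rotates between faces, matching the "no abnormal jump" property claimed below.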
In the embodiment of the invention, the current time point in the game process is obtained, the original color information of the target virtual scene corresponding to the current time point is looked up, the target color information of the target virtual scene corresponding to the current time point is determined according to the original color information and the virtual scene map corresponding to the preset time point, and the target virtual scene is rendered according to the target color information to obtain a target virtual scene adapted to the current time point. When the game engine runs, it therefore only needs to tint a single preset environment illumination map (the virtual scene map corresponding to the preset time point) using the pre-generated parameterized curve, and the rendered objects can reproduce an ambient lighting effect approximating 24-hour day-and-night illumination, so that continuously time-varying ambient lighting information is expressed accurately. This processing ensures that the tint of the virtual scene map transitions continuously without abnormal jumps, greatly reduces runtime computation and memory occupation, and accurately reflects the low-frequency ambient color information in the plurality of preset directions (such as front, back, left, right, top and bottom).
The method has the advantage that, because most steps are processed offline, the computation and resource overhead at runtime is greatly reduced: at runtime, only the designated standard environment map and the offline-generated original color information of the target virtual scene at each time point are needed to continuously and accurately restore the environment information of the current time point. The computation and memory overhead is low, the best effect is achieved under minimal extra load, performance and visual quality hold up well across device platforms, and the process can be completed automatically without extra manual involvement.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 4, a block diagram of a structure of a display control device in a game according to an embodiment of the present invention is shown, which may specifically include the following modules:
a current time point obtaining module 401, configured to obtain a current time point in a game process;
an original color information searching module 402, configured to search original color information of a target virtual scene corresponding to the current time point;
a target color information determining module 403, configured to determine, according to the original color information and a virtual scene map corresponding to a preset time point, target color information of a target virtual scene corresponding to the current time point;
a rendering module 404, configured to render the target virtual scene according to the target color information, so as to obtain a target virtual scene adapted to the current time point.
In a preferred embodiment of the present invention, the current time point obtaining module 401 includes:
the first current time point acquisition submodule is used for acquiring a current time point in the game process in real time; or
and the second current time point acquisition submodule is used for acquiring the current time point in the game process according to the preset time interval.
In a preferred embodiment of the present invention, the apparatus further comprises:
the environment map acquisition module is used for acquiring a plurality of virtual scene maps acquired based on a plurality of time points respectively;
and the color information determining module is used for determining the original color information of the virtual scene maps.
In a preferred embodiment of the present invention, the virtual scene map collected at each time point includes: the sub-scene maps shot from a plurality of preset directions by a virtual camera, the color information determination module, comprising:
the color information extraction submodule is used for respectively extracting the color information of all pixel points in the sub scene mapping corresponding to each direction;
the color mean value calculation submodule is used for determining the color mean value corresponding to the sub-scene mapping according to the color information of all the pixel points;
and the color information determining submodule is used for determining the color average value of the sub-scene maps in each direction as the original color information corresponding to the virtual scene map.
In a preferred embodiment of the present invention, the color mean calculation sub-module includes:
the data filtering unit is used for removing abnormal values of the color information of all the pixel points in the sub-scene mapping to obtain filtered color information;
and the color mean value calculating unit is used for determining the color mean value corresponding to the sub-scene map according to the filtered color information.
In a preferred embodiment of the present invention, the apparatus further comprises:
and the interpolation processing submodule is used for carrying out interpolation processing on the original color information of the multiple virtual scene maps to obtain the color information of any time point.
In a preferred embodiment of the present invention, the apparatus further comprises:
and the map determining module is used for determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
In a preferred embodiment of the present invention, the target color information calculation module 403 includes:
the weight value calculation submodule is used for acquiring a sampling vector and determining the weight values of the sampling vector in the X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene;
the color offset ratio calculation submodule is used for calculating to obtain a color offset ratio according to the weight value and the original color information;
the reference color information sampling submodule is used for sampling from the virtual scene map corresponding to the preset time point according to the sampling vector to obtain reference color information;
and the target color information calculation submodule is used for calculating to obtain target color information according to the color offset ratio and the reference color information.
In a preferred embodiment of the present invention, the virtual scene map corresponding to the preset time point includes sub-scene maps respectively corresponding to a plurality of preset directions, and the reference color information sampling sub-module includes:
and the reference color information sampling unit is used for determining a target map from the sub-scene maps corresponding to the preset directions according to the direction of the sampling vector, and sampling from the target map according to the size of the sampling vector to obtain reference color information.
In a preferred embodiment of the present invention, the target color information calculation sub-module includes:
and the target color information calculating unit is used for multiplying the color deviation ratio with reference color information and preset parameters to obtain target color information, wherein the preset parameters are used for adjusting the display effect of the target virtual scene.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, as shown in fig. 5, including:
a processor 501 and a storage medium 502, wherein the storage medium 502 stores machine-readable instructions executable by the processor 501, and when the electronic device runs, the processor 501 executes the machine-readable instructions to perform the method according to any one of the embodiments of the present invention. The specific implementation and technical effects are similar, and are not described herein again.
An embodiment of the present invention further provides a computer-readable storage medium, as shown in fig. 6, where the storage medium stores a computer program 601, and the computer program 601 is executed by a processor to perform the method according to any one of the embodiments of the present invention. The specific implementation and technical effects are similar, and are not described herein again.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The display control method in the game and the display control device in the game provided by the invention are described in detail, and the principle and the implementation mode of the invention are explained by applying specific examples, and the description of the examples is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (13)

1. A display control method in a game, comprising:
acquiring a current time point in a game process;
searching the original color information of the target virtual scene corresponding to the current time point;
determining target color information of a target virtual scene corresponding to the current time point according to the original color information and a virtual scene map corresponding to a preset time point;
and rendering the target virtual scene according to the target color information to obtain the target virtual scene adaptive to the current time point.
2. The method of claim 1, wherein the obtaining a current point in time in the game process comprises:
acquiring a current time point in a game process in real time; or
and acquiring the current time point in the game process according to a preset time interval.
3. The method of claim 1, further comprising, prior to the step of obtaining a current point in time in the game play, the steps of:
acquiring a plurality of virtual scene maps acquired respectively based on a plurality of time points;
determining original color information of the plurality of virtual scene maps.
4. The method of claim 3, wherein the virtual scene map collected at each time point includes sub-scene maps shot by a virtual camera from a plurality of preset directions, and the determining original color information of the plurality of virtual scene maps comprises:
respectively extracting color information of all pixel points in the sub-scene mapping corresponding to each direction;
determining a color mean value corresponding to the sub-scene mapping according to the color information of all the pixel points;
and determining the color average value of the sub-scene maps in each direction as the original color information corresponding to the virtual scene map.
5. The method of claim 4, wherein the determining the color mean corresponding to the sub-scene map according to the color information of all the pixel points comprises:
performing abnormal value elimination processing on the color information of all pixel points in the sub-scene map to obtain filtered color information;
and determining a color mean value corresponding to the sub-scene map according to the filtered color information.
6. The method of claim 3, further comprising, after the step of determining the raw color information of the plurality of virtual scene maps:
and carrying out interpolation processing on the original color information of the plurality of virtual scene maps to obtain the color information of any time point.
7. The method of claim 3, further comprising:
and determining the virtual scene map corresponding to the preset time point from the plurality of virtual scene maps.
8. The method according to claim 1, wherein the determining the target color information of the target virtual scene corresponding to the current time point according to the original color information and the virtual scene map corresponding to a preset time point comprises:
acquiring a sampling vector, and determining weight values of the sampling vector in X, Y and Z axis directions of a three-dimensional coordinate system; the sampling vector is a reflection vector corresponding to a direction vector of the virtual camera for observing the target virtual scene;
calculating to obtain a color offset ratio according to the weight value and the original color information;
sampling from the virtual scene map corresponding to the preset time point according to the sampling vector to obtain reference color information;
and calculating to obtain target color information according to the color deviation ratio and the reference color information.
9. The method according to claim 8, wherein the virtual scene map corresponding to the predetermined time point includes sub-scene maps corresponding to a plurality of predetermined directions, respectively, and the sampling from the virtual scene map corresponding to the predetermined time point according to the sampling vector to obtain reference color information includes:
and determining a target map from the sub-scene maps corresponding to the preset directions according to the direction of the sampling vector, and sampling from the target map according to the size of the sampling vector to obtain reference color information.
10. The method according to claim 8, wherein calculating target color information based on the color shift ratio and the reference color information comprises:
and multiplying the color deviation ratio with reference color information and preset parameters to obtain target color information, wherein the preset parameters are used for adjusting the display effect of the target virtual scene.
11. A display control apparatus in a game, comprising:
the current time point acquisition module is used for acquiring a current time point in the game process;
the original color information searching module is used for searching the original color information of the target virtual scene corresponding to the current time point;
the target color information determining module is used for determining the target color information of the target virtual scene corresponding to the current time point according to the original color information and the virtual scene map corresponding to the preset time point;
and the rendering module is used for rendering the target virtual scene according to the target color information so as to obtain the target virtual scene adaptive to the current time point.
12. An electronic device, comprising:
a processor and a storage medium storing machine-readable instructions executable by the processor, the processor executing the machine-readable instructions to perform the method of any one of claims 1-10 when the electronic device is run.
13. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the method according to any one of claims 1-10.
CN202111164019.6A 2021-09-30 Display control method and device in game Active CN113893533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111164019.6A CN113893533B (en) 2021-09-30 Display control method and device in game

Publications (2)

Publication Number Publication Date
CN113893533A true CN113893533A (en) 2022-01-07
CN113893533B CN113893533B (en) 2024-07-09

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050085296A1 (en) * 2003-10-17 2005-04-21 Gelb Daniel G. Method and system for real-time rendering within a gaming environment
CN106355637A (en) * 2016-08-30 2017-01-25 北京像素软件科技股份有限公司 Game scene environment rendering method
CN108038897A (en) * 2017-12-06 2018-05-15 北京像素软件科技股份有限公司 Shadow map generation method and device
CN108579082A (en) * 2018-04-27 2018-09-28 网易(杭州)网络有限公司 The method, apparatus and terminal of shadow are shown in game
CN111467805A (en) * 2020-05-11 2020-07-31 网易(杭州)网络有限公司 Method and device for realizing dynamic change of virtual scene, medium and electronic equipment
CN112169324A (en) * 2020-09-22 2021-01-05 完美世界(北京)软件科技发展有限公司 Rendering method, device and equipment of game scene
CN112402974A (en) * 2020-11-23 2021-02-26 成都完美时空网络技术有限公司 Game scene display method and device, storage medium and electronic equipment
CN113368496A (en) * 2021-05-14 2021-09-10 广州三七互娱科技有限公司 Weather rendering method and device for game scene and electronic equipment

Similar Documents

Publication Publication Date Title
CN108295467B (en) Image presentation method and device, storage medium, processor and terminal
RU2727101C1 (en) Image processing device, method and storage medium
JPH10143678A Illumination and shading simulation of a computer graphics/image generator
CN109829868B (en) Lightweight deep learning model image defogging method, electronic equipment and medium
CN108830923B (en) Image rendering method and device and storage medium
CN112446939A (en) Three-dimensional model dynamic rendering method and device, electronic equipment and storage medium
CN108965847A Processing method and processing device for panoramic video data
CN108063915A Image acquisition method and system
CN111161685B (en) Virtual reality display equipment and control method thereof
CN111583378A (en) Virtual asset processing method and device, electronic equipment and storage medium
CN109218817A Method and apparatus for displaying a virtual gift prompt message
CN105898343B Live video streaming method and apparatus, and terminal live video streaming method and apparatus
CN111191542A (en) Abnormal action recognition method, device, medium and electronic equipment in virtual scene
CN113893533B (en) Display control method and device in game
CN112231020B (en) Model switching method and device, electronic equipment and storage medium
CN113893533A (en) Display control method and device in game
CN113230659A (en) Game display control method and device
CN113457134A (en) Game scene processing method, device, equipment and storage medium
CN116761017A (en) High availability method and system for video real-time rendering
CN111696034A (en) Image processing method and device and electronic equipment
CN107577808B (en) Method, device, server and medium for sorting multi-level list pages
CN103514593B (en) Image processing method and device
CN112691378B (en) Image processing method, apparatus and readable medium
CN115228083A (en) Resource rendering method and device
CN111514586B (en) Motion blur implementation method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant