CN111408131B - Information processing method and device in game, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111408131B
CN111408131B (Application CN202010305756.2A)
Authority
CN
China
Prior art keywords
light source
target
running track
game
target light
Prior art date
Legal status
Active
Application number
CN202010305756.2A
Other languages
Chinese (zh)
Other versions
CN111408131A (en
Inventor
谢耿
林才钊
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010305756.2A
Publication of CN111408131A
Application granted
Publication of CN111408131B

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The disclosure provides an information processing method and apparatus in a game, an electronic device, and a storage medium, in the technical field of games. The method comprises the following steps: acquiring a first trajectory of a target light source in a game scene during the daytime and a second trajectory of the same light source at night; adjusting the second trajectory so that it is located above the horizon; obtaining, in response to an adjustment operation on rendering parameters of the game scene, target parameters for forming a non-realistic game scene picture; and rendering the game scene according to the first trajectory and the target parameters, or according to the adjusted second trajectory and the target parameters, to obtain the non-realistic game scene picture. The method and apparatus improve the flexibility and convenience of scene picture generation and achieve a non-realistic game scene effect.

Description

Information processing method and device in game, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of game technology, and more particularly, to an information processing method in a game, an information processing apparatus in a game, an electronic device, and a computer-readable storage medium.
Background
With the development of game applications, the requirements on the scenes they present have gradually increased. In the related art, day-night processing in games mostly relies on physically based calculations that pursue a realistic picture. Under this approach, scene pictures cannot be finely adjusted in a flexible and convenient way: the method has certain limitations, the operation is relatively complex and inefficient, and a non-realistic day-night state cannot be achieved.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include data that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to an information processing method and apparatus in a game, an electronic device, and a storage medium, so as to overcome, at least to some extent, the problem that a scene picture cannot be flexibly adjusted due to the limitations and drawbacks of the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to one aspect of the present disclosure, there is provided an information processing method in a game, including: acquiring a first trajectory of a target light source in a game scene during the daytime and a second trajectory of the same light source at night; adjusting the second trajectory so that it is located above the horizon; obtaining, in response to an adjustment operation on rendering parameters of the game scene, target parameters for forming a non-realistic game scene picture; and rendering the game scene according to the first trajectory and the target parameters, or according to the adjusted second trajectory and the target parameters, to obtain the non-realistic game scene picture.
In an exemplary embodiment of the present disclosure, adjusting the second trajectory so that it is located above the horizon includes: adjusting the second trajectory to coincide with the first trajectory.
In an exemplary embodiment of the present disclosure, the method further comprises: setting, according to the first trajectory of the target light source or the adjusted second trajectory, the projection cast by a target object in the game scene under the action of the target light source.
In an exemplary embodiment of the present disclosure, setting the projection cast by the target object under the action of the target light source according to the first trajectory or the adjusted second trajectory includes: when the target light source enters a target time interval, displaying the projection in a fade-out-then-fade-in manner.
In an exemplary embodiment of the present disclosure, the maximum length of the projection is equal to a preset value.
In an exemplary embodiment of the present disclosure, the rendering parameters include at least one of: illumination information of the target light source, a fog-effect color, and a background color.
In an exemplary embodiment of the present disclosure, the illumination information of the target light source includes at least one of: the height of the target light source, the angle of the target light source, the illumination color of the target light source, the illumination brightness of the target light source and the ambient light color formed by the target light source.
In an exemplary embodiment of the present disclosure, the method further comprises: adjusting, in response to an adjustment operation on a light-receiving parameter of a target object in the game scene, the light-receiving parameter of the target object; and rendering the target object in the game scene according to the adjusted light-receiving parameter.
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring a sky texture matched with the current time; and rendering the sky in the game scene according to the sky texture.
In an exemplary embodiment of the present disclosure, the method further comprises: switching the non-realistic game scene picture to a preset baked scene picture in response to a switching operation directed at a baked-lighting effect.
According to one aspect of the present disclosure, there is provided an information processing apparatus in a game, including: a trajectory acquisition module, configured to acquire a first trajectory of a target light source in a game scene during the daytime and a second trajectory of the same light source at night; a trajectory adjustment module, configured to adjust the second trajectory so that it is located above the horizon; a parameter adjustment module, configured to obtain, in response to an adjustment operation on rendering parameters of the game scene, target parameters for forming a non-realistic game scene picture; and a picture forming module, configured to render the game scene according to the first trajectory and the target parameters, or according to the adjusted second trajectory and the target parameters, to obtain the non-realistic game scene picture.
According to one aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the information processing method in the game of any one of the above via execution of the executable instructions.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method in a game of any one of the above.
The information processing method and apparatus in a game, the electronic device, and the computer-readable storage medium provided in the exemplary embodiments of the present disclosure have the following advantages. On the one hand, adjusting the second trajectory of the target light source at night makes the game scene more consistent with reality and more accurate. On that basis, because the rendering parameters of the game scene can be adjusted to obtain target parameters for forming a non-realistic picture, and the picture is formed from the first trajectory and the target parameters or from the adjusted second trajectory and the target parameters, the related-art restriction that a day-night system can only be built to match real conditions is avoided. This improves convenience, allows the scene picture to be flexibly and finely adjusted so as to be differentiated, realizes a non-realistic picture on top of a physically plausible basis, produces a non-realistic day-night effect, and enriches the diversity of the displayed pictures. On the other hand, because the target parameters are obtained precisely through the adjustment operation on the rendering parameters, the limitations of the related art are avoided, the number of operation steps is reduced, and operating efficiency is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 schematically illustrates an information processing method in a game in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates the trajectory of a target light source in a game in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates the modified trajectory of a target light source in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates the trajectory used when processing a projection in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a projection in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates trajectories obtained by adjusting the height and angle of a target light source in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a system interface in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a scene picture corresponding to the rendering parameters in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a scene picture corresponding to a baked-lighting effect in an exemplary embodiment of the present disclosure;
FIG. 10 schematically illustrates a scene picture corresponding to a sky texture in an exemplary embodiment of the present disclosure;
FIG. 11 schematically illustrates a scene picture corresponding to a target object in an exemplary embodiment of the present disclosure;
FIG. 12 schematically illustrates a block diagram of an information processing apparatus in a game in an exemplary embodiment of the present disclosure;
FIG. 13 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
This exemplary embodiment first provides an information processing method in a game, which can be applied to various game scenes. The method is described in detail below with reference to the accompanying drawings.
Referring to FIG. 1, in step S110, a first trajectory of a target light source in a game scene during the daytime and a second trajectory of the same light source at night are acquired.
In this exemplary embodiment, the target light source may be a virtual sun in the game, simulated from the real sun. The trajectory reflects how the virtual sun moves and may be determined by simulating the path of the real sun. The first trajectory is the target light source's path during the daytime; the second trajectory is its path at night.
The trajectory may be determined as follows: define a complete circular cycle for the illumination direction of the target light source, and simulate the light source's motion in the game by cyclically varying the illumination angle. The motion can be represented by the angle of the parallel light: angles between 0 and 180 degrees correspond to daytime, and angles between 0 and negative 180 degrees correspond to night. If the game scene imposes a definite orientation, the target light source can be driven according to real-world behavior (the physical law of rising in the east and setting in the west). Parallel light is a light type in game engines used to simulate sunlight and is one of the main light sources for scene illumination; it carries only direction information, no position. The first and second trajectories of the target light source obtained in this way are shown in FIG. 2.
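As an illustration only (this sketch is not part of the patent; the 6:00/18:00 boundaries and the function name are assumptions), the angle cycle described above can be written as a mapping from the 24-hour game clock to a parallel-light angle:

```python
def parallel_light_angle(hour: float) -> float:
    """Map a 24-hour game clock to a parallel-light angle in degrees.

    Daytime (assumed 6:00-18:00) sweeps 0..180 above the horizon
    (the first trajectory); night sweeps 0..-180 below it (the
    second trajectory).
    """
    hour = hour % 24.0
    if 6.0 <= hour < 18.0:
        # daytime: 6:00 -> 0 deg, 12:00 -> 90 deg, 18:00 -> 180 deg
        return (hour - 6.0) / 12.0 * 180.0
    # night: 18:00 -> 0 deg, 0:00 -> -90 deg, 6:00 -> -180 deg
    night_hour = (hour - 18.0) % 24.0   # 0..12 hours since sunset
    return -(night_hour / 12.0 * 180.0)
```

Negative angles thus mark the night-time portion of the cycle that step S120 later corrects.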
After the two trajectories are acquired, a realistic day-night system can first be constructed. A day-night system is a system, implemented in the game engine, that simulates the alternation of day and night in the real world. It is mainly used to simulate lighting changes over the course of a day and gives players a more immersive game experience. During game development, simulating realistic day-night alternation usually requires building a complete system, tailored to the game itself, on top of the engine.
In step S120, the second trajectory is adjusted so that it is located above the horizon.
In the embodiments of the present disclosure, in a night scene spanning 0 to negative 180 degrees, the illumination direction points from below to above, which produces the incorrect lighting and shadows shown in diagram A of FIG. 3. In the related art, a model can be placed below the ground to block the light, but such an opaque patch cannot produce a projection effect at night, so the result is inaccurate. To solve this problem, in the embodiments of the present disclosure the second trajectory of the target light source is adjusted so that correct light-and-shadow changes are obtained at night.
Adjusting the second trajectory of the target light source can be understood as adjusting it to coincide with the first trajectory, so that the trajectory originally below the horizon lies above it. Specifically, the adjustment may include mirroring the second trajectory and controlling the target light source to run in the same direction as along the first trajectory. Mirroring means reflecting the second trajectory so that it coincides with the first, so that by day and by night the light source moves in the same direction along the same trajectory. The target light source therefore always runs along the first trajectory between 0 and 180 degrees, correct light-and-shadow changes are obtained, the night-projection problem is solved, and projection accuracy is improved. The resulting first and second trajectories are shown in diagram B of FIG. 3.
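A minimal sketch of the mirroring step (illustrative only; the function name is an assumption): negative night-time angles are reflected back above the horizon, so the light always runs on the 0-180 degree daytime trajectory:

```python
def mirror_night_angle(angle: float) -> float:
    """Mirror an angle on the second (night) trajectory, which is
    negative and below the horizon, onto the first (day) trajectory.
    Daytime angles pass through unchanged, so day and night move in
    the same direction along the same trajectory."""
    return abs(angle)
```

For example, the night angle -135 degrees maps to 135 degrees above the horizon.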
After the second trajectory has been adjusted, the projection cast by a target object in the game scene under the action of the target light source can be set according to the first trajectory or the adjusted second trajectory, so that a realistic day-night system is constructed along the projection dimension.
The target object may be any virtual character or virtual object in the game scene, such as a virtual building. Under the action of the target light source, the target object casts a projection: the light source's trajectory and illumination direction map the object to a shadow that represents it. After the first trajectory or the adjusted second trajectory is determined, the projection effect of the target object can be processed to construct a realistic day-night scene. Setting the projection of the target object comprises: adjusting, over a number of time intervals, how the projection generated by the target light source is displayed. A time interval is a condition that determines whether the projection should be adjusted and can be chosen according to the actual situation. In the embodiments of the present disclosure, a target time interval is an interval in which the projection can be displayed smoothly across the day-night switch; it may comprise several intervals in which the projection changes in the same way, for example 3:00-6:00 and 18:00-21:00. When the target light source enters a target time interval, the projection is displayed in a fade-out-then-fade-in manner.
Fading out means the projection gradually shortens until it disappears completely: its length shrinks from the maximum (a preset value) down to 0. The preset value can be chosen according to the actual situation and is not limited here. Fading in means the projection gradually lengthens from 0 until it is fully displayed, i.e. until its length reaches the preset maximum. When the target light source is outside the target time intervals, the projection of the target object is always displayed.
FIG. 4 schematically shows the trajectory used for projection processing. Referring to FIG. 4: from 0:00 to 3:00 the night projection is shown; from 3:00 to 6:00 the projection fades out and then fades in until it appears completely (it first shortens until it disappears, then gradually reappears); from 6:00 to 12:00 the daytime projection is shown, its length shortening from the preset value to its minimum at 12:00; from 12:00 to 18:00 the projection lengthens back to the preset value; from 18:00 to 21:00 it again fades out and then fades in until it appears completely; and from 21:00 to 24:00 the night projection is shown.
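The fade schedule above can be sketched as a visibility factor over the 24-hour clock (illustrative only; the interval boundaries follow FIG. 4, and the function name is an assumption):

```python
def projection_alpha(hour: float) -> float:
    """Visibility of the projection: 1.0 fully shown, 0.0 hidden.

    Inside the target time intervals (3:00-6:00 and 18:00-21:00)
    the projection fades out toward the interval's midpoint and
    then fades back in; outside them it is always displayed.
    """
    hour = hour % 24.0
    for start, end in ((3.0, 6.0), (18.0, 21.0)):
        if start <= hour < end:
            mid = (start + end) / 2.0    # moment of full fade-out
            half = (end - start) / 2.0
            return abs(hour - mid) / half
    return 1.0
```

At 4:30 and 19:30 the projection vanishes entirely, then reappears gradually, giving the smooth transition the patent describes.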
In the embodiments of the present disclosure, dividing time into intervals and adjusting how the projection is displayed when the target light source enters a target time interval allows a realistic day-night system to be formed from the adjusted projection. Abrupt changes at the day-night transition are avoided, so there is no unnatural jump from a projection that has been lengthening to one that cannot immediately shorten. By combining gradual shortening (fade-out) and gradual lengthening (fade-in) within the target time intervals, the projection transitions naturally and smoothly across the day-night switch, improving the display effect as well as the accuracy and plausibility of the projection. The display states are shown in FIG. 5, where diagram A is the projection before adjustment and diagram B the projection after adjustment.
The constructed day-night system therefore matches reality more closely and is more reasonable and accurate: the projection display is more accurate and realistic, the unnatural day-night switch is avoided, and day and night transition smoothly into each other.
With continued reference to FIG. 1, in step S130, target parameters for forming a non-realistic game scene picture are obtained in response to an adjustment operation on rendering parameters of the game scene.
In the disclosed embodiments, the rendering parameters may be parameters of various types, derived from different objects, that are used to form the game scene picture. The different objects may include the target light source and the scene itself. The rendering parameters may include illumination information of the target light source as well as scene parameters such as the fog-effect color and the background color. The illumination information of the target light source comprises at least one of the height of the light source, its angle, its illumination color, its illumination brightness, and the ambient light color it forms. The height of the light source reproduces light-and-shadow effects at different latitudes and longitudes and mainly affects the projection's length; its angle likewise reproduces latitude-dependent effects and mainly affects the projection's direction; its illumination color simulates the color of sunlight in the scene; its illumination brightness simulates the brightness of that sunlight; the ambient light color describes the color contribution of indirect ambient illumination on objects; the fog-effect color largely determines the overall tone of the scene; and the background color determines the scene's background.
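For illustration, the rendering parameters listed above could be grouped per time point roughly as follows (a sketch only; the field names are assumptions, not terms from the patent):

```python
from dataclasses import dataclass
from typing import Tuple

Color = Tuple[float, float, float]  # RGB in [0, 1]

@dataclass
class TimePointParams:
    hour: float              # time point this keyframe applies to
    light_height: float      # mainly affects projection length
    light_angle: float       # mainly affects projection direction
    light_color: Color       # simulated sun illumination color
    light_intensity: float   # simulated sun illumination brightness
    ambient_color: Color     # indirect ambient illumination color
    fog_color: Color         # largely sets the scene's overall tone
    background_color: Color  # scene background color
```

A day-night timeline would then be a list of such keyframes, one per configured time point.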
Each of the above rendering parameters may be put in one-to-one correspondence with a time parameter, that is, each rendering parameter may be attached to a time parameter, so that scene parameters and time parameters jointly control the game scene picture at the moment the time parameter represents. A time parameter may be a newly added time point, a modified time point, a removed time point, and so on.
A target parameter is a parameter used, after the rendering parameters have been adjusted, to render the game scene into a non-realistic picture. A target parameter corresponds in type to a rendering parameter, but its specific value may differ, being adjusted according to the actual requirements. Adjusting the rendering parameters allows the target parameters to be determined accurately, improving precision.
In step S140, the game scene is rendered according to the first trajectory of the target light source and the target parameters, or according to the adjusted second trajectory and the target parameters, to obtain a non-realistic game scene picture.
In the embodiments of the present disclosure, the non-realistic game scene picture is a picture, displayed on the terminal, that is not fully consistent with the realistic scene picture; it is formed after the user performs the adjustment operation on the rendering parameters on top of the realistic picture.
Based on the above steps, during the daytime the scene can be rendered from the combination of the first trajectory and the target parameters to obtain the non-realistic picture, and at night it can be rendered from the adjusted second trajectory and the target parameters.
When the rendering parameter is illumination information of the target light source, the length and direction of the target object's projection can be adjusted according to the light source's position parameters, namely its height and angle, and the first and second trajectories can likewise be adjusted according to that height and angle. In some scenes, to reproduce the real latitude and longitude of the location, the incidence angle and height of the target light source can be determined more precisely. Adjusting different parameters changes the light source's trajectory and the projection's length. Referring to FIG. 6, the trajectories differ completely at different angles and heights. Adjusting the height and angle of the target light source therefore allows the trajectory and the projection's display state to be determined in a targeted, customizable way.
When the rendering parameters are color parameters such as the illumination color, illumination brightness, and ambient light color of the target light source, the scheme builds on these color parameters, because a non-realistic effect is distinguished from a realistic one primarily through color, while the realistic effect is grounded in the illumination attributes. If the day-night change is thought of as a change from black to white, giving it the concept of color yields a color change across the day. The main rendering parameters affecting the effect can be attached to time parameters corresponding to specific time points; setting colors at those time points and interpolating between them programmatically produces the day-night color change and hence the non-realistic game scene picture. Specifically, the target parameters can be obtained in response to the adjustment operation on the color parameters, and the non-realistic picture is formed by combining the target parameters with interpolated parameters. For example, take the two time points 0:00 and 3:00: if the color parameter (the illumination color of the target light source) at 0:00 is set to yellow and that at 3:00 to red, the colors at 1:00 and 2:00 can be computed by program interpolation as oranges on the transition from yellow to red, i.e. the interpolated parameter lies between the colors at the end points. In the embodiments of the present disclosure, combining the color parameters and the interpolated parameters according to the time parameters forms the non-realistic picture along the color dimension.
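The interpolation just described (yellow at 0:00, red at 3:00, orange in between) can be sketched as linear interpolation between time-keyed colors; the function names are assumptions for illustration:

```python
def lerp_color(c0, c1, t):
    """Linear interpolation between two RGB colors, t in [0, 1]."""
    return tuple(a + (b - a) * t for a, b in zip(c0, c1))

def color_at(keyframes: dict, hour: float):
    """Interpolate a color from time-keyed colors.

    keyframes: {hour: (r, g, b)} covering the queried time; as the
    text notes, the 0-point and 24-point entries should match so
    day and night join seamlessly.
    """
    hours = sorted(keyframes)
    hour = hour % 24.0
    for h0, h1 in zip(hours, hours[1:]):
        if h0 <= hour <= h1:
            t = (hour - h0) / (h1 - h0)
            return lerp_color(keyframes[h0], keyframes[h1], t)
    raise ValueError("hour is outside the keyframed range")
```

With yellow (1, 1, 0) at 0:00 and red (1, 0, 0) at 3:00, querying 1:30 yields the midpoint orange (1, 0.5, 0).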
Referring to fig. 7, the time parameter may be a newly added time point or a modified time point, and may specifically be any time point within 24 hours, where a newly added time point automatically generates the corresponding attributes for artists to adjust. For example, the system allows time points to be added freely, unnecessary time points to be removed at any time, and any time point of interest to be entered at any time for preview. In the embodiment of the disclosure, a complete attribute control system can be built in the engine, and finer performance effects are achieved by adjusting each attribute. Its basic content mainly comprises: a system switch, effect preview, a time axis (the master axis controlling the whole timeline, which can be dragged to check the effect at any time point at any time), time acceleration (to speed up the preview), time points, addition or deletion of time points, and other detail parameters. These attributes constitute a controllable system that can be adjusted according to one's own needs to make the scene picture configurable, and the non-real game scene picture is then formed according to the configuration result (the configured parameters).
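The add/remove/preview operations listed for the control system can be sketched as a small timeline container. The class and method names are hypothetical, and the nearest-keyframe preview stands in for the engine's interpolated preview, which the source does not specify.

```python
class DayNightTimeline:
    """Minimal sketch of the attribute control system of fig. 7:
    time points can be added, removed, and previewed at any hour."""

    def __init__(self):
        self._points = {}  # hour -> dict of attributes set at that time point

    def add_point(self, hour: float, **attributes):
        self._points[hour % 24.0] = dict(attributes)

    def remove_point(self, hour: float):
        self._points.pop(hour % 24.0, None)

    def preview(self, hour: float):
        """Return the nearest time point's attributes (a stand-in for
        the engine's interpolated preview); distances wrap at 24 hours."""
        if not self._points:
            return {}
        nearest = min(self._points,
                      key=lambda h: min(abs(h - hour), 24.0 - abs(h - hour)))
        return self._points[nearest]
```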
The rendering parameters may also include the fog effect color and the background color. On this basis, the target parameters can be obtained by adjusting the fog effect color and the background color, and the game scene is rendered according to the target parameters to generate a non-real game scene picture.
It should be noted that the starting point of each day must coincide with the ending point of the previous day, that is, the rendering parameters at 0:00 must be consistent with the rendering parameters at 24:00, so that day and night are joined seamlessly. The non-real game scene picture obtained by adjusting the rendering parameters during day-night switching can be as shown in fig. 8.
In the embodiment of the disclosure, the game scene may be rendered according to the first running track of the target light source and the target parameters, or according to the adjusted second running track and the target parameters, so as to obtain a non-real game scene picture. Control can thus be performed in the two dimensions of rendering parameters and running tracks, so that, on the basis of a real day-night system, adjustment operations can be realized and a non-real effect can be further achieved, avoiding the limitations in the related art and improving flexibility and diversity.
In addition, the non-real game scene picture may also be generated in the following manner: in response to a switching operation for the baking effect, the non-real game scene picture is switched to a preset baked scene picture. The baking effect may specifically be configured through baking parameters of the scene, which are parameters for providing auxiliary illumination and may specifically be represented by a shader coefficient. At night, real-time light and the baking effect may need to coexist: lights must be lit in the scene, yet no real-time light other than the single parallel light can illuminate the scene. The night effect therefore needs to be baked in advance, a shader-coefficient parameter is added to the day-night system, the real-time effect and the baked effect are switched by adjusting the shader coefficient, and the non-real game scene is thereby switched to the preset baked scene. The preset baked scene picture here refers to the non-real game scene picture with lights lit at night. The non-real game scene picture corresponding to the baking parameters may be as shown in fig. 9, where graph A in fig. 9 is the unbaked real-time effect during the day, and graph B in fig. 9 is the display picture of the baked night effect.
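Switching between the real-time effect and the baked effect via a single shader coefficient can be sketched as a blend per color channel. The per-channel lerp and the function name are assumptions; the source only states that the two effects are switched by adjusting the shader coefficient.

```python
def blend_lighting(realtime_rgb, baked_rgb, bake_coeff: float):
    """Blend the real-time lighting result with the pre-baked result
    using one shader coefficient, clamped to [0, 1].

    bake_coeff = 0.0 -> pure real-time lighting (daytime),
    bake_coeff = 1.0 -> pure baked lighting (lit night scene).
    """
    k = max(0.0, min(1.0, bake_coeff))
    return tuple(r * (1.0 - k) + b * k for r, b in zip(realtime_rgb, baked_rgb))
```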
In addition, the non-real game scene picture may also be generated in the following manner: in response to an adjustment operation on the light-receiving parameters of a target object in the game scene, the light-receiving parameters of the target object are adjusted; the target object in the game scene is then rendered according to the light-receiving parameters of the target object. A light-receiving parameter refers to the degree to which the target object receives the target light source, and may specifically comprise a bright-part parameter and a dark-part parameter of the target object. By adjusting the light-receiving parameters, the light received by the target object under the action of the target light source is determined, so that the target object can be rendered accordingly.
A non-real game scene picture of a target object is shown schematically in fig. 10. The light-receiving parameters of the target object may include a bright part and a dark part. The bright part refers to the portion of the target object that receives light; referring to fig. 10, when the target object is a house, the bright-part parameter may be the intensity of the influence of the ambient light in the scene on the roof. Ambient light refers to the secondary illumination generated by the scene environment built in the engine; it has an important influence on how objects in the scene receive light and is an important component of scene illumination. The dark part refers to the unlit portions and the shadowed portions; referring to fig. 10, the dark-part parameter may be the intensity of the influence of the ambient light on the projection and on the unlit walls of the house. The shader of the target object is modified, and the two light-receiving parameters for the bright part and the dark part are added to control the degree of influence of the ambient light on the target object as a whole, which increases the color contrast between the bright and dark parts of the target object and expresses it with more uniform colors, realizing the non-real effect. Here, a shader refers to a small program that runs on the graphics processor to perform shading (calculating illumination, brightness, color, and so on in an image) at a particular stage of the graphics rendering pipeline. The adjusted light-receiving parameters are obtained by adjusting the light-receiving parameters of the target object, and the target object in the game scene can then be rendered according to the adjusted parameters, so that the non-real game scene picture is realized.
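The two light-receiving parameters added to the object's shader can be sketched as separate scale factors on the ambient contribution for lit and shadowed surfaces. The additive ambient model and the function name are assumptions made for illustration; the source only specifies that bright-part and dark-part parameters control the degree of ambient influence.

```python
def shade_with_ambient(base_rgb, ambient_rgb, lit: bool,
                       bright_part: float, dark_part: float):
    """Scale the ambient-light contribution separately for the lit
    ("bright part") and unlit or shadowed ("dark part") portions of an
    object, then add it to the base color, clamping each channel to 1.0."""
    k = bright_part if lit else dark_part
    return tuple(min(1.0, b + a * k) for b, a in zip(base_rgb, ambient_rgb))
```

Choosing a large bright-part factor and a small dark-part factor widens the contrast between lit and shadowed surfaces, which is the non-real look the passage describes.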
Further, in order to improve the display effect, a sky texture matching the current time can be obtained, and the sky in the game scene is rendered according to that sky texture. Because some game scenes are operated in a level view or a fully open view, in addition to the illumination change, different sky texture effects are switched in combination with the light change so as to synchronize the environment with the light. Specifically, the sky texture of the game scene can be combined with the time parameter: different sky textures are attached to different time points in the day-night system, and when the timeline reaches a certain time point, switching of the sky texture is triggered, so that time is matched with the sky effect and the non-real game scene picture including the sky texture is displayed. Furthermore, a special-effect marker of the target light source can be made and bound to the calculated first running track or the adjusted second running track of the target light source, so that the marker moves as time changes, rising in the east and setting in the west like the sun and moon in reality, thereby simulating a more complete day-night effect. A display of sky textures may be as shown in fig. 11.
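Triggering a sky-texture switch when the timeline reaches a time point can be sketched as a step lookup over (hour, texture) pairs. The texture names and the function signature are hypothetical; only the time-point switching behavior comes from the source.

```python
import bisect

def sky_texture_at(switch_points, hour: float) -> str:
    """Pick the sky texture for the current hour from (hour, texture)
    switch points: the texture set at the most recent time point stays
    active until the next one; hours wrap at 24."""
    hours = sorted(h for h, _ in switch_points)
    table = dict(switch_points)
    i = bisect.bisect_right(hours, hour % 24.0) - 1
    # i == -1 wraps to the last switch point of the previous day
    return table[hours[i]]
```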
The system is based on the premise that parallel light (namely, sunlight) is used as the real-time light, matched with corresponding object shaders; through the interaction between the two, the non-real day-night change effect is achieved. Because the method is not implemented entirely according to actual physical laws, on the premise of following the basic physical laws, adjustment operations still need to be made according to the actual situation so as to achieve a natural combination of the real and the non-real. Since the effect is not debugged entirely on the basis of the real situation but leans more toward the creator's subjective intent, the main factors influencing the day-night variation are extracted to form a day-night variation system. In the embodiment of the disclosure, the first running track and the second running track of the target light source are determined, the second running track of the target light source is adjusted, and the display mode of the projection of the target object under the action of the target light source is adjusted, so that the display better conforms to the actual situation, a real day-night system is presented, and switching between day and night can be performed naturally and smoothly, improving the switching effect. On the basis of real day and night, and on the premise of following the actual laws of change, the game scene is rendered according to the target parameters obtained after adjusting the rendering parameters so as to determine the non-real game scene picture, thereby constructing a non-real day-night change system and improving accuracy, pertinence, and extensibility.
Further, on the basis of simulating real day-night alternation, the rendering parameters of the game scene are adjusted to obtain the target parameters, and the scene can be rendered according to the first running track and the target parameters, or according to the adjusted second running track and the target parameters, so as to construct the non-real game scene picture. This avoids the limitations of the related art, realizes differentiated display, improves the display effect, increases the diversity and operability of the scene picture, provides convenience and flexibility, and improves operation efficiency.
In the present exemplary embodiment, an in-game information processing apparatus is also provided. Referring to fig. 12, the in-game information processing apparatus 1200 may include:
a track acquisition module 1201, configured to acquire a first running track of a target light source in a game scene during the daytime and a second running track of the target light source at night;
a track adjustment module 1202, configured to adjust the second moving track so that the second moving track is above the horizon;
a parameter adjustment module 1203 configured to obtain target parameters for forming a non-real game scene picture in response to an adjustment operation of rendering parameters for the game scene;
and a picture forming module 1204, configured to render the game scene according to the first running track of the target light source and the target parameter, or according to the adjusted second running track and the target parameter, to obtain a non-real game scene picture.
In one exemplary embodiment of the present disclosure, the track adjustment module is configured to adjust the second running track to be consistent with the first running track.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: the projection generating module is used for setting the projection formed by the target object in the game scene under the action of the target light source according to the first running track of the target light source or the adjusted second running track.
In one exemplary embodiment of the present disclosure, the projection generation module includes: a projection display module, configured to set the projection formed by the target object under the action of the target light source to be displayed in a fade-in-to-fade-out manner when the target light source enters a target time interval.
In an exemplary embodiment of the present disclosure, the maximum length of the projection is equal to a preset value.
In an exemplary embodiment of the present disclosure, the rendering parameters include at least one of: illumination information of the target light source, a fog effect color, and a background color.
In an exemplary embodiment of the present disclosure, the illumination information of the target light source includes at least one of: the height of the target light source, the angle of the target light source, the illumination color of the target light source, the illumination brightness of the target light source and the ambient light color formed by the target light source.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: the light receiving adjustment module is used for responding to the adjustment operation of the light receiving parameters of the target object in the game scene and adjusting the light receiving parameters of the target object; and the target object rendering module is used for rendering the target object in the game scene according to the light receiving parameter of the target object.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: the texture determining module is used for obtaining sky textures matched with the current time; and the sky rendering module is used for rendering the sky in the game scene according to the sky texture.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: and the baking switching module is used for switching the non-real game scene picture to a preset baking scene picture in response to a switching operation aiming at the baking effect.
It should be noted that, the specific details of each module in the above information processing apparatus in the game have been described in detail in the corresponding information processing method in the game, and thus will not be described here again.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the various steps of the methods in this disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps be performed, to achieve desirable results. Additionally or alternatively, some steps may be omitted, multiple steps combined into one step to be performed, and/or one step decomposed into multiple steps to be performed, etc.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 1300 according to this embodiment of the invention is described below with reference to fig. 13. The electronic device 1300 shown in fig. 13 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 13, the electronic device 1300 is embodied in the form of a general purpose computing device. The components of the electronic device 1300 may include, but are not limited to: the at least one processing unit 1310, the at least one memory unit 1320, a bus 1330 connecting the different system components (including the memory unit 1320 and the processing unit 1310), and a display unit 1340.
Wherein the storage unit stores program code that is executable by the processing unit 1310 such that the processing unit 1310 performs steps according to various exemplary embodiments of the present invention described in the above section of the "exemplary method" of the present specification. For example, the processing unit 1310 may perform the steps as shown in fig. 1.
The storage unit 1320 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 13201 and/or cache memory 13202, and may further include Read Only Memory (ROM) 13203.
The storage unit 1320 may also include a program/utility 13204 having a set (at least one) of program modules 13205, such program modules 13205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1330 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1300 may also communicate with one or more external devices 1400 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 1300, and/or any device (e.g., router, modem, etc.) that enables the electronic device 1300 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1350. Also, the electronic device 1300 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, for example, the Internet, through a network adapter 1360. As shown, the network adapter 1360 communicates with other modules of the electronic device 1300 over the bus 1330. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 1300, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary methods" section of this specification, when said program product is run on the terminal device.
In the disclosed embodiments, a program product for implementing the above method according to an embodiment of the present invention is also described, which may employ a portable compact disc read only memory (CD-ROM) and comprise program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM or flash memory), optical fiber, portable Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the context of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present application, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (13)

1. An information processing method in a game, comprising:
acquiring a first running track of a target light source in a game scene in the daytime and a second running track of the target light source at night;
the second running track is adjusted through mirror image processing, so that the second running track below the horizon is above the horizon, and the target light sources in the day and at night are controlled to run in the same direction;
obtaining target parameters for forming a non-real game scene picture in response to an adjustment operation of rendering parameters for the game scene;
and rendering the game scene according to the first running track of the target light source and the target parameter or according to the adjusted second running track and the target parameter to obtain a non-real game scene picture.
2. The in-game information processing method according to claim 1, wherein the adjusting the second running track so that the second running track is located above a horizon comprises:
and adjusting the second running track to be consistent with the first running track.
3. The in-game information processing method according to claim 1, characterized in that the method further comprises:
and setting projection of a target object in the game scene under the action of the target light source according to the first running track of the target light source or the adjusted second running track.
4. The method for processing information in a game according to claim 3, wherein the setting a projection of a target object in the game scene formed by the target light source according to the first running track of the target light source or the adjusted second running track comprises:
when the target light source enters a target time interval, setting the projection formed by the target object under the action of the target light source to be displayed in a fade-in-to-fade-out manner.
5. The in-game information processing method according to claim 4, wherein a maximum length of the projection is equal to a preset value.
6. The in-game information processing method according to claim 1, wherein the rendering parameters include at least one of: illumination information of the target light source, a fog effect color, and a background color.
7. The in-game information processing method according to claim 6, wherein the illumination information of the target light source includes at least one of: the height of the target light source, the angle of the target light source, the illumination color of the target light source, the illumination brightness of the target light source and the color of the environment light formed by the target light source.
8. The in-game information processing method according to claim 1, characterized in that the method further comprises:
responding to the adjustment operation of the light receiving parameters of the target object in the game scene, and adjusting the light receiving parameters of the target object;
and rendering the target object in the game scene according to the light receiving parameter of the target object.
9. The in-game information processing method according to claim 1, characterized in that the method further comprises:
acquiring sky textures matched with the current time;
and rendering the sky in the game scene according to the sky texture.
10. The in-game information processing method according to claim 1, characterized in that the method further comprises:
and switching the non-real game scene picture to a preset baking scene picture in response to a switching operation for the baking effect.
11. An information processing apparatus in a game, comprising:
the track acquisition module is used for acquiring a first running track of the target light source in the game scene in the daytime and a second running track of the target light source at night;
the track adjusting module is used for adjusting the second running track through mirror image processing, so that the second running track below the horizon is above the horizon, and the target light sources in the day and at night are controlled to run in the same direction;
a parameter adjustment module for obtaining target parameters for forming a non-real game scene picture in response to an adjustment operation of rendering parameters for the game scene;
and a picture forming module, used for rendering the game scene according to the first running track of the target light source and the target parameter, or according to the adjusted second running track and the target parameter, to obtain a non-real game scene picture.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the in-game information processing method of any one of claims 1-10 via execution of the executable instructions.
13. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the information processing method in a game according to any one of claims 1-10.
CN202010305756.2A 2020-04-17 2020-04-17 Information processing method and device in game, electronic equipment and storage medium Active CN111408131B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010305756.2A CN111408131B (en) 2020-04-17 2020-04-17 Information processing method and device in game, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111408131A CN111408131A (en) 2020-07-14
CN111408131B true CN111408131B (en) 2023-09-26

Family

ID=71488565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010305756.2A Active CN111408131B (en) 2020-04-17 2020-04-17 Information processing method and device in game, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111408131B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112037292B (en) * 2020-09-01 2022-08-26 完美世界(北京)软件科技发展有限公司 Weather system generation method, device and equipment
CN112619162B (en) * 2020-12-22 2023-04-07 上海米哈游天命科技有限公司 Resource object management method and device, electronic equipment and storage medium
CN112799572B (en) * 2021-01-28 2023-04-07 维沃移动通信有限公司 Control method, control device, electronic equipment and storage medium
CN115430142A (en) * 2022-09-05 2022-12-06 北京有竹居网络技术有限公司 Game scene editing method, device, equipment and medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110152291A (en) * 2018-12-13 2019-08-23 腾讯科技(深圳)有限公司 Rendering method, device, terminal and the storage medium of game picture


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NetEase. "Alternation of Day and Night." http://mc.163.com/20191010/31527_839753.html. 2019, pp. 1-5. *

Also Published As

Publication number Publication date
CN111408131A (en) 2020-07-14

Similar Documents

Publication Publication Date Title
CN111408131B (en) Information processing method and device in game, electronic equipment and storage medium
CN111068312B (en) Game picture rendering method and device, storage medium and electronic equipment
US20170287196A1 (en) Generating photorealistic sky in computer generated animation
CN109887066B (en) Lighting effect processing method and device, electronic equipment and storage medium
US9164723B2 (en) Virtual lens-rendering for augmented reality lens
EP4250125A1 (en) Method for generating firework visual effect, electronic device, and storage medium
CN110009720B (en) Image processing method and device in AR scene, electronic equipment and storage medium
US9183654B2 (en) Live editing and integrated control of image-based lighting of 3D models
CN111756956B (en) Virtual light control method and device, medium and equipment in virtual studio
KR20210157284A (en) Method for determining ambient illumination in ar scene, apparatus and storage medium
CN112891946B (en) Game scene generation method and device, readable storage medium and electronic equipment
CN109557674B (en) Brightness adjusting method and device
CN111915712B (en) Illumination rendering method and device, computer readable medium and electronic equipment
CN114119818A (en) Rendering method, device and equipment of scene model
JP7125963B2 (en) Information processing program, information processing apparatus, and information processing method
CN112734896A (en) Environment shielding rendering method and device, storage medium and electronic equipment
Hillaire A scalable and production ready sky and atmosphere rendering technique
CN111489430B (en) Game light and shadow data processing method and device and game equipment
CN110473279A (en) A kind of weather particle renders method, apparatus, computer equipment and storage medium
CN111127607B (en) Animation generation method, device, equipment and medium
US7020434B2 (en) Animated radar signal display with fade
CN117197296A (en) Traffic road scene simulation method, electronic equipment and storage medium
CN113368496B (en) Weather rendering method and device for game scene and electronic equipment
CN112446944B (en) Method and system for simulating real environment light in AR scene
CN115526976A (en) Virtual scene rendering method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant