WO2024037153A1 - Interface display method, information providing method and system based on turn-based battle - Google Patents

Interface display method, information providing method and system based on turn-based battle

Info

Publication number
WO2024037153A1
WO2024037153A1 (PCT/CN2023/099637)
Authority
WO
WIPO (PCT)
Prior art keywords
battle
halo
server
rendering
information
Prior art date
Application number
PCT/CN2023/099637
Other languages
English (en)
French (fr)
Inventor
章辰昊
许东松
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Priority to KR1020247016180A priority Critical patent/KR20240093633A/ko
Publication of WO2024037153A1 publication Critical patent/WO2024037153A1/zh
Priority to US18/677,754 priority patent/US20240316451A1/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A63F13/355 Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69 Generating or modifying game content before or while executing the game program by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • A63F13/70 Game security or game management aspects
    • A63F13/77 Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • Embodiments of the present application relate to the field of computer technology, and in particular to an interface display method, information providing method and system based on turn-based battles.
  • A turn-based role-playing game (RPG) is a game that uses a turn-based combat strategy.
  • In such games, players play a master virtual character in a real or virtual world, and can use the master virtual character, or a pet virtual character owned by it, to engage in turn-based battles with enemy units (such as in-game NPCs (Non-Player Characters), monsters controlled by AI (Artificial Intelligence), or pet virtual characters captured by other characters).
  • In the related art, turn-based role-playing games provide two completely different maps: a world map and a battle map.
  • The master virtual character performs non-combat activities on the world map (such as exploring, capturing pet virtual characters, collecting treasure boxes, and collecting virtual props).
  • The world scene corresponding to the world map (i.e., the non-combat scene) is controlled by a scene server, which itself maintains the environment data used to generate the scene.
  • When the master virtual character performs combat activities on the battle map (such as controlling a captured pet virtual character in a turn-based battle against enemy units), the battle scene corresponding to the battle map is handled by a battle server.
  • The battle server needs to copy the environment data used to render the scene from the scene server, and then, based on the copied environment data, generate a scene carrying the halos produced by the battle activities.
  • As a result, both the scene server (for the world scene) and the battle server (for the battle scene) must each maintain a copy of the environment data.
  • Whenever the world scene changes, if the master virtual character performs combat activities on the battle map again, the scene server must copy the updated environment data to the battle server once more, so that the battle server can generate a scene with the battle-triggered halos based on the updated environment data.
  • This makes the business logic and data maintained by the scene server and the battle server redundant, and entails a huge amount of data processing and transmission, which easily degrades the efficiency of environment generation.
  • To address this, embodiments of the present application provide an interface display method, an information providing method, and a system based on turn-based battles, which reduce the amount of data exchanged between the first server and the second server and avoid the redundancy of having both servers maintain a copy of the scene data, thereby reducing server maintenance overhead and improving the client's interface display efficiency.
  • an interface display method based on turn-based battle is provided.
  • the method is executed by a client.
  • the method includes:
  • sending, when it is detected that a participant in the turn-based battle initiates a battle behavior, an initiation request for the battle behavior to a first server, the first server being used to process the battle behavior;
  • receiving battle rendering information from the first server, where the battle rendering information is used to render the battle behavior;
  • receiving halo rendering information from a second server, where the second server is used to process the world environment of the virtual world; the halo rendering information is generated by the second server based on halo information sent by the first server, the halo information represents the halo triggered by the battle behavior, and the halo affects elements in the combat scene of the turn-based battle;
  • in the process of rendering the battle behavior based on the battle rendering information, rendering the halo according to the halo rendering information.
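As a minimal sketch of the client-side flow above (not the patent's implementation; all class and field names are illustrative), the battle control side merges the battle rendering information from the first server with the halo rendering information from the second server during rendering:

```python
from dataclasses import dataclass, field

@dataclass
class BattleRenderInfo:
    action: str   # illustrative, e.g. "fire_skill"
    target: str

@dataclass
class HaloRenderInfo:
    halo_type: str   # illustrative, e.g. "flame_halo"
    plots: list      # plot ids affected by the halo

@dataclass
class Client:
    frames: list = field(default_factory=list)

    def on_battle_render_info(self, battle: BattleRenderInfo,
                              halo: "HaloRenderInfo | None") -> None:
        # Render the battle behavior; during that rendering pass, also
        # render the halo described by the second server's halo rendering info.
        self.frames.append(f"battle:{battle.action}->{battle.target}")
        if halo is not None:
            self.frames.append(f"halo:{halo.halo_type}@{halo.plots}")

client = Client()
client.on_battle_render_info(
    BattleRenderInfo("fire_skill", "enemy_pet"),
    HaloRenderInfo("flame_halo", [3, 4]),
)
```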
  • an information providing method based on turn-based battles is provided.
  • the method is executed by a server.
  • the server includes a first server and a second server.
  • the method includes:
  • the first server receives an initiation request for a battle behavior sent by the battle control process, and generates battle rendering information corresponding to the battle behavior according to the initiation request; the battle behavior is a behavior initiated by a participant in a turn-based battle, and the battle rendering information is used to render the battle behavior;
  • when the first server determines that the battle behavior triggers a halo, the first server sends a call request to the second server; the call request includes halo information used to characterize the halo, and the halo affects elements in the combat scene of the turn-based battle;
  • the second server generates halo rendering information according to the call request and sends the halo rendering information to the scene control process; the halo rendering information is forwarded by the scene control process to the battle control process and is used to render the halo;
  • after receiving, from the second server, the notification that the halo rendering information was successfully delivered, the first server sends the battle rendering information to the battle control process.
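The ordering guarantee above (the first server releases its battle rendering information only after the second server confirms the halo rendering information was delivered) can be sketched as follows; all names and message shapes are assumptions, not from the patent:

```python
class SceneServer:
    """Stand-in for the second server, which processes the world environment."""
    def __init__(self):
        self.delivered = []

    def handle_call_request(self, halo_info: dict) -> bool:
        # Generate halo rendering information from the halo information
        # and "deliver" it toward the scene control process.
        halo_render_info = {"effect": halo_info["halo"], "plots": halo_info["plots"]}
        self.delivered.append(halo_render_info)
        return True  # notification of successful delivery

class BattleServer:
    """Stand-in for the first server, which processes battle behaviors."""
    def __init__(self, scene_server: SceneServer):
        self.scene_server = scene_server
        self.sent_to_battle_process = []

    def handle_initiation_request(self, action: str, triggers_halo: bool) -> None:
        battle_render_info = {"action": action}
        if triggers_halo:
            halo_info = {"halo": "flame_halo", "plots": [3, 4]}  # illustrative
            ok = self.scene_server.handle_call_request(halo_info)
            assert ok  # wait for the delivery notification before proceeding
        self.sent_to_battle_process.append(battle_render_info)

scene = SceneServer()
battle = BattleServer(scene)
battle.handle_initiation_request("fire_skill", triggers_halo=True)
```

Holding back the battle rendering information until the halo delivery is confirmed is what lets the client render the halo within the same battle-rendering pass.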
  • an interface display device based on turn-based battle includes:
  • the battle control module is configured to send an initiation request for the battle behavior to the first server when it is detected that a participant in the turn-based battle initiates a battle behavior, and the first server is used to process the battle behavior;
  • the battle control module is also used to receive battle rendering information from the first server; wherein the battle rendering information is used to render the battle behavior;
  • a scene control module, configured to receive halo rendering information from a second server, the second server being used to process the world environment of the virtual world; the halo rendering information is generated by the second server based on the halo information sent by the first server, the halo information represents the halo triggered by the battle behavior, and the halo affects elements in the combat scene of the turn-based battle;
  • the battle control module is further configured to render the halo according to the halo rendering information in the process of rendering the battle behavior based on the battle rendering information.
  • a computer system includes a client and a server.
  • the client is used to execute the above interface display method based on turn-based battle, and the server is used to execute the above information providing method based on turn-based battle.
  • a computer device includes a processor and a memory.
  • a computer program is stored in the memory.
  • the computer program is loaded and executed by the processor to implement the above interface display method based on turn-based battle, or the above information providing method based on turn-based battle.
  • the computer device includes terminal devices and servers.
  • a computer-readable storage medium in which a computer program is stored, the computer program being loaded and executed by a processor to implement the above interface display method based on turn-based battle, or the above information providing method based on turn-based battle.
  • a computer program product or a computer program is provided, the computer program product or the computer program including computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the above interface display method based on turn-based battles, or the above information providing method based on turn-based battles.
  • When the client detects that a participant in the turn-based battle initiates a battle behavior, it sends an initiation request to the first server, which processes the battle behavior, and receives from the first server the battle rendering information used to render that behavior. In addition, the first server sends the halo information, which represents the halo triggered by the battle behavior, to the second server, which processes the world environment; the client then receives the halo rendering information generated by the second server from that halo information, and integrates the battle rendering information with the halo rendering information to display the interface.
  • In other words, the two servers no longer separately generate the environment for combat behavior and for non-combat behavior. Instead, during combat the first server sends only a small amount of halo information to the second server, so that even during combat all environment-generation data is processed by the second server. This avoids the need for the second server to copy wholesale environment-data updates to the first server, with the large data transfers that entails.
  • This not only greatly reduces the amount of data exchanged between the first server and the second server, but also, by keeping the single copy of the scene information on the second server, avoids the redundancy of both servers maintaining a copy of the scene data. This effectively reduces server maintenance overhead, and also helps improve interface display efficiency and the efficiency of human-computer interaction.
  • Figure 1 is a schematic diagram of a world scene provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a battle scene provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of the solution implementation environment provided by an embodiment of the present application.
  • Figure 4 is a flow chart of an interface display method based on turn-based combat provided by one embodiment of the present application
  • Figure 5 is a schematic diagram of a halo in a combat scene provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a halo in a world scene provided by an embodiment of the present application.
  • Figures 7 and 8 are schematic diagrams of affinity rendering animations provided by embodiments of the present application.
  • Figures 9 and 10 are schematic diagrams of environmental effects on skills provided by embodiments of the present application.
  • Figure 11 is a flow chart of an information providing method based on turn-based combat provided by one embodiment of the present application.
  • Figures 12 to 13 are flow charts of an interface display method based on turn-based combat provided by another embodiment of the present application.
  • Figure 14 is a schematic diagram of the interaction between the battle control side and the scene control side provided by an embodiment of the present application.
  • Figure 15 is a block diagram of an interface display device for turn-based combat provided by an embodiment of the present application.
  • Figure 16 is a structural block diagram of a computer system provided by an embodiment of the present application.
  • Figure 17 is a structural block diagram of a computer device provided by an embodiment of the present application.
  • Virtual world: the virtual world displayed (or provided) when an application runs on the terminal.
  • the virtual world can be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
  • the virtual world can be any one of a two-dimensional virtual world, a 2.5-dimensional virtual world, and a three-dimensional virtual world, which is not limited in this application.
  • the following embodiments illustrate that the virtual world is a three-dimensional virtual world.
  • Master virtual character refers to the movable object played by the player in the virtual world.
  • the master virtual character can be a virtual character, a virtual animal, an animation character, etc., such as a character or animal displayed in a three-dimensional virtual world.
  • the master virtual character is a three-dimensional model created based on animation skeleton technology.
  • Each master virtual character has its own shape and volume in the three-dimensional virtual world and occupies a part of the space in the three-dimensional virtual world.
  • Pet virtual character refers to a movable object controlled by artificial intelligence in the virtual world.
  • the pet virtual character can be a virtual creature, a virtual animal, a virtual monster, a virtual elf, a virtual pet, etc.
  • the world map includes multiple plots. Each plot is a polygonal plot.
  • the polygonal plot can be any one of square, rectangle, and hexagon.
  • each plot is a 50cm x 50cm square.
  • Each plot has its own surface properties. Surface properties include grass, stone, water, etc.
  • the multiple plots included in the world map can be of the same type, or a combination of multiple different types of plots.
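A minimal sketch of such a plot grid, assuming illustrative surface names and a dictionary-of-coordinates representation (neither is prescribed by the text):

```python
# World map as a grid of square plots, each carrying a surface property.
SURFACES = {"grass", "stone", "water"}

class WorldMap:
    def __init__(self, width: int, height: int, default: str = "grass"):
        assert default in SURFACES
        # One entry per plot, keyed by (x, y) plot indices.
        self.plots = {(x, y): default for x in range(width) for y in range(height)}

    def set_surface(self, x: int, y: int, surface: str) -> None:
        assert surface in SURFACES
        self.plots[(x, y)] = surface

world = WorldMap(4, 4)          # 16 plots, all grass by default
world.set_surface(1, 2, "water")  # mix in a different surface type
```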
  • Battle map: referring to Figures 1 and 2, in the virtual environment 10 (i.e., the virtual world), when the first pet virtual character 12 encounters the second pet virtual character 14 at a certain location on the world map and a battle begins, one or more plots of the world map within a certain range, centered on a reference position determined from the first pet virtual character 12, are designated as the battle map 16.
  • the reference position is the position where the first pet avatar 12 is located, or a suitable combat position closest to the first pet avatar 12 .
  • In some embodiments, the battle map 16 includes all plots within a circle centered on the reference position with a predetermined radius; in other embodiments, the battle map 16 includes all plots within a rectangle centered on the reference position with a predetermined length and width.
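The circular variant above can be sketched as a filter over the world map's plot coordinates; the coordinate scheme and radius are illustrative:

```python
import math

def battle_map_plots(center: tuple, radius: float, world_plots) -> set:
    """Select the sub-set of world-map plots that form the battle map:
    all plots within `radius` of the reference position `center`."""
    cx, cy = center
    return {
        (x, y) for (x, y) in world_plots
        if math.hypot(x - cx, y - cy) <= radius
    }

world = {(x, y) for x in range(5) for y in range(5)}
battle = battle_map_plots((2, 2), 1.0, world)  # reference position (2, 2)
```

Because the battle map is just a selection of existing world-map plots, switching scenes needs no separate map data.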
  • World scene The world scene refers to the scene corresponding to the world map.
  • the world scene is displayed in the user interface, one or more plots in the world map can be displayed in the user interface.
  • the user interface displays one or more plots where the master virtual character or the pet virtual character is currently located on the world map, as well as some interface elements related to the displayed plots, the master virtual character, the pet virtual character, and so on.
  • the elements in the embodiments of the present application can be used to compose a scene (such as a world scene), such as visual elements such as plots, master virtual characters, and pet virtual characters.
  • the battle scene refers to the scene corresponding to the battle map.
  • the battle map can be displayed in the user interface, such as displaying all or part of the land contained in the battle map.
  • the user interface displays the pet avatar or one or more plots where the pet avatar is currently located in the battle map, as well as some interface elements related to the above-mentioned displayed plots, the main control avatar, the pet avatar, etc.
  • the world scene and the battle scene can be switched, for example, from the world scene to the battle scene, or from the battle scene to the world scene.
  • the virtual camera can adopt different shooting perspectives when displaying world scenes and combat scenes.
  • the allowed user operations can also be different.
  • the user can make a perceptual distinction between the world scene and the battle scene.
  • Because the battle map used in the battle scene consists of one or more plots of the world map, switching between the world scene and the battle scene does not produce a strong sense of tearing; instead, the switch is smooth and natural.
  • Potential energy: attributes or signs that affect combat in the virtual world. Potential energy includes at least one of the grass, fire, water, stone, ice, electric, poison, light, ghost, demon, normal, martial, cute, fantasy, insect, wing, dragon, and mechanical types.
  • Halo: an abstract element capable of influencing at least one visual element (plots, master virtual characters, or pet virtual characters) within a certain range of the virtual world. This abstract element is distinct from the visual elements themselves; it is used specifically to describe the halo and to make it concrete and visible. A halo may appear continuously, appear randomly, follow the master virtual character, follow a pet virtual character, or be triggered by a skill or a prop. Halos may be invisible or visible in the virtual world; examples include a flame halo, a light halo, a healing halo, and a buff halo.
  • the battle process changes the world environment: When there is a turn-based battle between pet avatars, the skills released by the pet avatars will have an impact on the environment of the virtual world. For example, pet avatars fight in a battle scene, and the pet avatar releases a fire skill, and the grassland plots fired will be ignited.
  • The skill display effect (i.e., the halo) is rendered by the battle control process through the halo rendering interface.
  • The halo in the battle scene can be displayed persistently, while the halo in the world scene is cancelled after a display-duration threshold is reached, at which point the plot corresponding to the halo returns to its original appearance.
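This display-duration rule can be sketched as follows; the threshold value is an assumption, not taken from the text:

```python
DISPLAY_THRESHOLD = 10.0  # seconds; assumed value for illustration

def halo_visible(scene: str, elapsed: float) -> bool:
    """Whether a halo is still displayed, given the scene it belongs to
    and how long it has been shown."""
    if scene == "battle":
        return True  # battle-scene halos persist for the whole battle
    # World-scene halos expire after the threshold; the plot then reverts.
    return elapsed < DISPLAY_THRESHOLD
```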
  • The world environment changes the battle process: during a turn-based battle between pet virtual characters, the environment of the virtual world affects the pet virtual characters, for example by influencing their skill damage or their skill display effects.
  • The environment of the virtual world includes the land environment and the weather environment; together they determine a pet virtual character's liking or dislike of its surroundings.
  • A pet virtual character's attitude toward the environment falls into one of the following levels: strong affinity, weak affinity, indifference, weak resistance, and strong resistance.
  • If the pet virtual character likes both environments (land and weather), it obtains a strong affinity effect; if it likes one environment and does not hate the other, it obtains a weak affinity effect; if it likes one environment and hates the other, no effect is obtained; if it hates one environment and does not like the other, it obtains a weak resistance effect; if it hates both environments, it obtains a strong resistance effect.
  • the server or client needs to obtain the environment regularly (such as every round) and determine the impact of the environment on the pet virtual character.
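The five-level rule above can be sketched as a function of the pet's attitude toward the two environments; the both-neutral case is not spelled out in the text and is mapped to indifference here as an assumption:

```python
def affinity(land: str, weather: str) -> str:
    """Combine the pet's attitude ("like" / "neutral" / "hate") toward the
    land and weather environments into one of the five levels."""
    attitudes = (land, weather)
    likes = attitudes.count("like")
    hates = attitudes.count("hate")
    if likes == 2:
        return "strong affinity"
    if hates == 2:
        return "strong resistance"
    if likes == 1 and hates == 0:
        return "weak affinity"
    if hates == 1 and likes == 0:
        return "weak resistance"
    return "indifference"  # like + hate cancel out, or both neutral (assumed)
```

Re-evaluating this function each round matches the text's note that the environment must be obtained regularly.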
  • Position changes in turn-based battles: in traditional turn-based battles, the standing positions of the allied pet virtual character and the enemy pet virtual character on the battle map are fixed. In the embodiments of the present application, when pet virtual characters fight, neither the attacker nor the attacked party is stationary; displacement occurs.
  • When the pet virtual character is the attacker, active displacement occurs. If the first position where the pet virtual character currently stands satisfies the skill release conditions, the character releases the skill in place at the first position. If the first position does not satisfy the conditions, the character is controlled to move from the first position to a second position that does satisfy them before the skill is released, and releases the skill at the second position.
  • the above-mentioned second position can be called a legal combat point, which refers to a position that satisfies the skill release conditions of the pet virtual character.
  • the pet avatar after the skill is released, the pet avatar will move to a third position.
  • the third position may be the same as or different from the above-mentioned first position.
  • the above-mentioned third position can be called a legal standing point, which refers to the position of the pet avatar after the release of the skill is completed.
  • The above skill release conditions may be related to the pet virtual character, the skill, the environment, and other factors.
  • When the pet virtual character is the attacked party, passive displacement occurs: being hit by a skill can knock the character back. For example, if the pet virtual character is at the first position and is hit by a skill there, it may move from the first position to a fourth position, where the fourth position is determined by factors such as the attacked pet virtual character, the attacking pet virtual character, the skill received, and the environment.
  • After the pet virtual character is displaced, the environment it is in may also change, thereby affecting its battles in subsequent rounds.
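The active-displacement flow (release in place if the first position satisfies the release conditions, otherwise move to a legal combat point first) can be sketched as follows; the Manhattan-range condition and the nearest-point rule are stand-ins for the unspecified real conditions:

```python
def resolve_active_displacement(first_pos, target, skill_range, candidate_points):
    """Return (combat_point, standing_point) for the attacker."""
    def in_range(p):
        # Manhattan distance as a stand-in for the real skill release condition.
        return abs(p[0] - target[0]) + abs(p[1] - target[1]) <= skill_range

    if in_range(first_pos):
        combat_point = first_pos  # release the skill in place
    else:
        # Move to the nearest legal combat point that satisfies the condition.
        legal = [p for p in candidate_points if in_range(p)]
        combat_point = min(
            legal,
            key=lambda p: abs(p[0] - first_pos[0]) + abs(p[1] - first_pos[1]),
        )
    standing_point = combat_point  # assumed: stay at the release position
    return combat_point, standing_point

# Attacker at (0, 0), target at (5, 0), range 2: must move before releasing.
combat, standing = resolve_active_displacement((0, 0), (5, 0), 2, [(3, 0), (4, 0)])
```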
  • FIG. 3 shows a schematic diagram of a solution implementation environment provided by an embodiment of the present application.
  • the solution implementation environment can be implemented as a computer system architecture, and the implementation environment can include: a terminal device 310 and a server 320 .
  • the terminal device 310 may be an electronic device such as a mobile phone, a tablet computer, a game console, a multimedia playback device, a PC (Personal Computer), etc.
  • The terminal device 310 can install a client of an application, such as a game application, a simulation learning application, a virtual reality (VR) application, an augmented reality (AR) application, a social networking application, an interactive entertainment application, and so on.
  • a client running the turn-based RPG is installed in the terminal device 310; the server 320 is used to provide background services for clients of applications (such as game applications) in the terminal device 310 .
  • the server 320 may be a backend server of the above-mentioned application program (such as a game application program).
  • Server 320 may be one server, a server cluster composed of multiple servers, or a cloud computing service center.
  • the server 320 includes a first server 321 and a second server 322 .
  • The first server 321 runs battle behavior processing logic and affinity processing logic, for the logical processing of battle operations, skills, and the like, and for calculating a pet virtual character's environmental affinity (affinity for short).
  • the second server 322 runs environment processing logic and aura processing logic to logically process the world environment of the virtual world and logically process the aura corresponding to the battle behavior.
  • the terminal device 310 and the server 320 can communicate with each other through the network 330.
  • the network 330 may be a wired network or a wireless network.
  • the terminal device 310 obtains the battle rendering information from the first server 321, and the battle rendering information is used to render the battle behavior; in addition, the terminal device 310 obtains the halo rendering from the second server 322 Information, the halo rendering information is used to render the halo; then, the terminal device 310 displays and renders according to the battle rendering information and the halo rendering information.
  • In this turn-based RPG, players play a virtual character in a real world or a virtual world.
  • This turn-based RPG provides two types of maps: a world map and a battle map.
  • virtual characters move around the world map, such as playing, capturing pet avatars, collecting treasure boxes, collecting virtual props, etc.; in combat scenarios, virtual characters control captured pet avatars in the battle map, and Enemy units (such as NPCs in the game, AI-controlled monsters, or pet virtual characters captured by other characters, etc.) engage in turn-based battles.
  • an innovative turn-based RPG mechanism is provided.
  • This turn-based RPG combines the traditional world map and battle map into one.
  • the battle map is a sub-map dynamically determined from the world map during each battle. In this way, when switching between world scenes (or non-combat scenes) and combat scenes, there will not be a huge difference in the map content displayed in the user interface, thereby avoiding the sense of tearing that exists in related technologies.
  • In addition, this turn-based RPG allows the environment of the virtual world (weather, time, land, and so on) to affect the master virtual character, the pet virtual characters, and the combat process; conversely, the master virtual character, the pet virtual characters, and the combat process also affect the environment. The turn-based combat process is thereby organically integrated into the virtual world: the two are no longer torn apart, but form a whole.
  • the battle process of this turn-based RPG can be singles, doubles, or team battles, which are not limited in the embodiments of the present application.
  • the battle process can be as follows:
  • Figure 4 shows a flow chart of an interface display method based on turn-based combat provided by an embodiment of the present application.
  • the execution subject of each step of the method can be the terminal device 310 in the solution implementation environment shown in Figure 3.
  • the method may include the following steps (steps 401 to 404):
  • Step 401 When it is detected that a participant in the turn-based battle initiates a battle behavior, a request to initiate a battle behavior is sent to the first server.
  • a battle control process runs in the client. When detecting that a participant initiates a battle behavior, the battle control process sends a request to initiate a battle behavior to the first server.
  • the battle control process is a process used to process content associated with battle scenes.
  • the battle control process can realize the rendering of the above-mentioned battle process, such as rendering the behaviors of pet virtual characters and enemy units during the battle.
  • the turn-based battle in the embodiment of the present application is run under the turn-based RPG mechanism provided by the embodiment of the present application.
  • the turn-based battle may refer to a turn-based battle between the above-mentioned pet avatar and the enemy unit.
  • the turn-based battle may include three rounds of battle, and in each round of battle, the pet avatar and the enemy unit may take turns to attack each other once. After completing three rounds of battle, the turn-based battle ends. Alternatively, one of the pet avatar and the enemy unit is defeated, and the turn-based battle ends. This is not limited in the embodiments of the present application.
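The round structure described above (at most three rounds, each side attacking once per round, with an early end when one side is defeated) can be sketched as follows. This is a minimal illustration only; all function names and numeric values are assumptions for explanation, not part of the claimed method.

```python
# Minimal sketch of the turn-based round structure: at most three rounds,
# the pet avatar and the enemy unit take turns attacking once per round,
# and the battle ends early as soon as one side is defeated.
def run_turn_based_battle(pet_hp: int, enemy_hp: int,
                          pet_attack: int, enemy_attack: int,
                          max_rounds: int = 3) -> str:
    for _ in range(max_rounds):
        enemy_hp -= pet_attack          # pet avatar attacks first in this sketch
        if enemy_hp <= 0:
            return "pet wins"
        pet_hp -= enemy_attack          # enemy unit attacks back
        if pet_hp <= 0:
            return "enemy wins"
    return "draw"                       # three rounds completed, no side defeated
```

Whether the attack order is fixed or determined by other attributes is not specified here; the sketch simply fixes an order for clarity.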
  • the participants in the turn-based battle may refer to the above-mentioned pet avatar, or may refer to the enemy unit of the pet avatar.
  • Battle behaviors may refer to combat behaviors that participants perform in response to the player's battle control operations, such as releasing skills, normal attacks, escaping, using virtual props, defense, and other behaviors.
  • combat behavior may also refer to combat behavior to be performed by artificial intelligence-controlled participants, which is not limited in the embodiments of this application.
  • the first server is the background server of the application program, which corresponds to the battle control process.
  • the first server can process the control signals generated by the player in the battle scene by executing the battle processing logic to advance the above battle process.
  • the first server may also be used to handle initiation requests for combat actions.
  • the first server can perform logical processing on the combat behavior by executing combat behavior processing logic, and generate combat rendering information to perform rendering and display of the combat behavior.
  • the introduction of the first server is the same as in the above embodiment and will not be repeated here.
  • the initiation request of the battle behavior is used to request the execution of the rendering process of the battle behavior and to obtain the information required for the rendering of the battle behavior, that is, the battle rendering information.
  • the initiation request may include identification information of the combat behavior, such as identification information of skills, identification information of virtual props, etc.
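Based on the description above, an initiation request carrying identification information of the battle behavior might be structured as follows. All field names here are illustrative assumptions; the actual protocol fields are not specified by the embodiment.

```python
# Sketch of an initiation request for a battle behavior, carrying the
# identification information described above (field names are illustrative).
def build_initiation_request(battle_id: str, actor_id: str,
                             behavior_type: str, behavior_id: str) -> dict:
    return {
        "battle_id": battle_id,          # which turn-based battle this belongs to
        "actor_id": actor_id,            # the participant initiating the behavior
        "behavior_type": behavior_type,  # e.g. "skill" or "virtual_prop"
        "behavior_id": behavior_id,      # identification info of the skill or prop
    }

req = build_initiation_request("battle-1", "pet-901", "skill", "skill-902")
```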
  • in addition to the battle control process, the client also runs a scene control process.
  • the scene control process is a process used to process content associated with world scenes.
  • the scene control process can control the rendering of the virtual character's activities in the world scene.
  • the scene control process and the battle control process are two different processes that are independent of each other.
  • the player can choose to start a turn-based battle.
  • the battle screen display process of the turn-based battle can be as follows:
  • the scene control process obtains the basic information corresponding to the participants in the turn-based battle and the environmental information corresponding to the battle scene, and sends the basic information and environmental information to the second server.
  • the above-mentioned basic information may refer to the character information required in the battle, such as the health value, level, attributes, skills of the participants, attributes, skills of the enemy units and other information.
  • the above basic information may include the level and health value of the master virtual character, the attributes and skills of the pet avatar used by the master virtual character, the attributes and skills of enemy units, and other information.
  • the environmental information corresponding to the battle scene may refer to the environmental information corresponding to the battle map, such as land information, time information, weather information, etc.
  • the environmental affinity between the pet avatar and the battle scene can be determined based on the environmental information to determine the final gain or debuff effect of the pet avatar's corresponding skill in the first round of battle.
  • the second server is the background server of the application program, which corresponds to the scene control process.
  • the second server can process the control signals generated by the player in the world scene by executing scene processing logic to realize the advancement and rendering of the master virtual character's activities in the world scene.
  • the introduction of the second server is the same as that in the above embodiment and will not be described again here.
  • after receiving the basic information and the environment information, the second server sends the basic information and the environment information to the first server.
  • the battle control process receives the battle screen of the turn-based battle from the first server; wherein the battle screen of the turn-based battle is a screen generated by the first server based on the basic information and environmental information sent by the second server.
  • the battle screen may refer to a screen where the pet avatar and the enemy unit battle in a battle scene.
  • the first server can generate a battle screen (or battle screen rendering information) based on the basic information and environmental information from the second server by executing the battle behavior processing logic, and send it to the battle control process.
  • the battle control process displays the battle screen of the turn-based battle.
  • the battle control process can directly display the battle screen in the user interface, or render the battle screen according to the battle screen rendering information, which is not limited in the embodiments of the present application.
  • in this way, the second server can know the situation of the participants and of the battle scene more intuitively, and through the interaction between the first server and the second server, the battle control process that interacts with the first server can obtain a battle screen that better matches the current battle situation. This improves the accuracy of obtaining the battle screen, which is conducive to rendering and displaying a more realistic battle screen on the client, thereby improving the efficiency of human-computer interaction.
  • Step 402 Receive battle rendering information from the first server; wherein the battle rendering information is used to render battle behaviors.
  • the battle rendering information may refer to a series of behavior control parameters related to the battle behavior in the time dimension. Based on these behavior control parameters, the pet virtual character can be controlled to complete a set of actions or performances.
  • a battle control process runs in the client. In addition to sending a battle action initiation request to the first server, the battle control process also receives battle rendering information from the first server.
  • the battle control process controls the pet virtual character to complete a set of skill release actions based on the battle rendering information, thereby completing the rendering of the skill release.
  • Step 403 Receive halo rendering information from the second server; wherein, the halo rendering information is information generated by the second server based on the halo information sent by the first server.
  • the halo information is used to represent the halo triggered by the battle behavior.
  • the halo is an effect on elements in the combat scene of the turn-based battle.
  • Halo rendering information can refer to a series of element control parameters related to the halo in the time dimension. According to these element control parameters, some elements (such as fire elements) can be controlled to complete their impact on the environment (such as land plots). For example, when the halo is a flame halo, according to the corresponding halo rendering information, the conversion process of the fire element converting the plot from grass attribute to fire attribute can be rendered. It should be noted that the elements in the battle scene affected by the halo can be at least one visual element among the land plot, the main control avatar, and the pet avatar.
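The time-ordered element control parameters described above can be sketched as a replayable list of steps. The structure and field names are assumptions for illustration; the example reproduces the flame-halo case, where replaying the steps converts a grass plot to the fire attribute.

```python
# Sketch of halo rendering information as a time-ordered series of element
# control parameters (illustrative structure). Replaying the steps in order
# yields the plot's final attribute, e.g. a flame halo converting a grass
# plot to a fire plot.
flame_halo_rendering_info = [
    {"t": 0.0, "element": "fire", "action": "spawn_effect", "target": "plot-7"},
    {"t": 0.5, "element": "fire", "action": "set_attribute",
     "target": "plot-7", "value": "fire"},
]

def replay(steps, plots):
    """Apply element control parameters to the plot states in time order."""
    for step in sorted(steps, key=lambda s: s["t"]):
        if step["action"] == "set_attribute":
            plots[step["target"]] = step["value"]
    return plots

plots = replay(flame_halo_rendering_info, {"plot-7": "grass"})
```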
  • the second server runs halo processing logic, and by logically processing the halo represented by the halo information (such as the identification information of the halo), the halo rendering information corresponding to the halo can be obtained.
  • the scene control process running in the client receives halo rendering information from the second server.
  • after generating the halo rendering information, the second server sends the halo rendering information to the scene control process.
  • when the first server determines that the battle behavior triggers the halo, it sends the halo information to the second server.
  • when the battle behavior cannot trigger the halo, the first server only processes the battle behavior logically, determining information such as the damage caused, the buffs added and the displacement generated, so as to generate the battle rendering information.
  • the first server also maintains a relationship table between the combat behavior and the halo, and the relationship table can be queried to determine whether the combat behavior can trigger the halo.
  • the halo information of the halo can also be determined based on the relationship table.
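The relationship table described above answers two questions in a single lookup: whether a battle behavior can trigger a halo, and if so, which halo. A minimal sketch, with all table contents and identifiers assumed for illustration:

```python
# Sketch of the behavior-to-halo relationship table the first server might
# maintain (contents are illustrative, not actual game data). Behaviors
# absent from the table trigger no halo.
BEHAVIOR_HALO_TABLE = {
    "skill-fireball": "halo-flame",
    "skill-blizzard": "halo-ice",
}

def lookup_halo(behavior_id: str):
    """Return (triggers_halo, halo_id) for the given battle behavior."""
    halo_id = BEHAVIOR_HALO_TABLE.get(behavior_id)
    return (halo_id is not None), halo_id
```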
  • the scene control process running in the client will send the halo rendering information to the battle control process.
  • the scene control process can first cache the halo rendering information, and then send the halo rendering information to the battle control process after caching the halo rendering information.
  • in this way, the battle control process can first obtain the battle rendering information and then obtain the halo rendering information, so that the battle behavior is displayed first and the halo is rendered afterwards (avoiding the problem of the halo being rendered first without warning), making the connection between the display of the battle behavior and the display of the halo more natural and smooth.
  • Step 404 In the process of rendering the battle behavior based on the battle rendering information, the halo is rendered according to the halo rendering information.
  • the client implements the halo rendering process through a battle control process.
  • after the battle control process receives the halo rendering information sent by the scene control process, it renders the halo based on the halo rendering information.
  • after receiving the battle rendering information and the halo rendering information, the battle control process renders the halo according to the halo rendering information during the process of rendering the battle behavior according to the battle rendering information.
  • after the turn-based battle ends, the battle control process no longer renders the halo.
  • the battle control process and the scene control process share a halo rendering interface.
  • the battle control process calls the halo rendering interface to render the halo based on the halo rendering information.
  • the scene control process takes over from the battle control process and calls the halo rendering interface to continue rendering the halo seamlessly.
  • the scene control process renders the halo according to cached halo rendering information.
  • the scene control process calls the halo rendering interface, and renders and displays the remaining halo corresponding to the turn-based battle based on the cached halo rendering information.
  • the remaining halo refers to, among all the halos described by the halo rendering information, the halos still displayed in the virtual world (for example, during the turn-based battle, some halos are de-rendered).
  • the scene control process performs halo statistics based on the cached halo rendering information, and dynamically updates the remaining halo corresponding to the turn-based battle, so that after the turn-based battle ends, the rendering display can be seamlessly connected.
  • the scene control process caches the halo rendering information and then renders it.
  • the scene control process needs to ensure the stability of the scene display.
  • therefore, the scene control process can first cache the halo rendering information related to the halo triggered by the battle behavior; then, in case the battle control process fails to render the halo based on the halo rendering information, the scene control process implements the halo rendering using the halo rendering information cached by this process, avoiding the problem of the halo not being rendered normally.
  • the battle control side synchronizes the settlement data of the turn-based battle to the scene control side in one batch.
  • for example, the first server synchronizes settlement data such as health consumption, experience gained and virtual resource consumption in the turn-based battle to the second server in one batch.
  • the display timing and display position of the halo can be indicated through the battle rendering tag corresponding to the battle behavior.
  • the battle rendering tag may include the identification information of the turn-based battle, the identification information of the battle behavior, the identification information of the halo, the location corresponding to the halo (such as the plot, affected characters, etc.), the rendering details of the battle behavior, etc. This process can include the following:
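The fields listed above for the battle rendering tag can be gathered into a small record. All names, types and sample values are illustrative assumptions; the embodiment does not specify a concrete encoding.

```python
from dataclasses import dataclass, field

# Sketch of a battle rendering tag carrying the fields listed above:
# battle/behavior/halo identifiers, the positions affected by the halo,
# and the rendering details from which display timing is derived.
@dataclass
class BattleRenderingTag:
    battle_id: str                       # identification info of the turn-based battle
    behavior_id: str                     # identification info of the battle behavior
    halo_id: str                         # identification info of the halo
    halo_positions: list = field(default_factory=list)   # e.g. plots, affected characters
    rendering_details: dict = field(default_factory=dict)  # e.g. skill animation length

tag = BattleRenderingTag("battle-1", "skill-fireball", "halo-flame",
                         ["plot-7"], {"skill_animation_ms": 800})
```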
  • the scene control process receives the battle rendering tag from the second server; the battle rendering tag is a tag generated by the first server based on the battle behavior when the battle behavior satisfies the halo trigger condition.
  • when the first server determines that the battle behavior triggers the halo, it generates the battle rendering tag of the battle behavior, then generates a call request based on the battle rendering information and the halo information, and sends the call request to the second server through RPC (Remote Procedure Call); the second server then sends the battle rendering tag to the scene control process.
  • the scene control process sends the battle rendering tag to the battle control process.
  • after receiving the battle rendering tag and the halo rendering information from the second server, the scene control process sends the battle rendering tag and the halo rendering information together to the battle control process.
  • according to the display timing and display position indicated by the battle rendering tag, the battle control process renders the halo according to the halo rendering information during the process of rendering the battle behavior according to the battle rendering information.
  • the battle control process determines the display timing and display position of the halo based on the rendering details of the battle behavior in the battle rendering tag and the corresponding position of the halo.
  • when the display timing is reached, the halo is rendered at the display position (such as a plot). For example, after the release process of the skill that triggers the halo has been rendered, the halo is rendered, achieving a natural connection between the two renderings.
  • For example, when the initial type of the plot is the grass type, the plot is transformed into the fire type under the influence of a fire-type halo, that is, a scene of the grass on the plot burning is rendered; or, when the initial type of the plot is the water type, the plot is converted to the ice type under the influence of an ice-type halo, that is, a picture of the water on the plot being frozen is rendered; or, when the initial type of the plot is the soil type, the plot is converted to the grass type under the influence of a grass-type halo, that is, a picture of grass growing on the plot is rendered. This is not limited in the embodiments of the present application.
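The plot-type conversions in the examples above can be summarized as a small mapping from (initial plot type, halo type) to (new plot type, rendered effect). The mapping reflects only the examples given; it is not an exhaustive rule set, and unlisted combinations are simply left unchanged in this sketch.

```python
# Sketch of the plot-type conversions from the examples above:
# which halo type converts which initial plot type, and what is rendered.
PLOT_CONVERSIONS = {
    ("grass", "fire"): ("fire", "grass on the plot burns"),
    ("water", "ice"): ("ice", "water on the plot freezes"),
    ("soil", "grass"): ("grass", "grass grows on the plot"),
}

def apply_halo_to_plot(plot_type: str, halo_type: str):
    # Combinations not in the table leave the plot unchanged in this sketch.
    return PLOT_CONVERSIONS.get((plot_type, halo_type), (plot_type, "no change"))
```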
  • the scene control process will also render and display the spherical fire halo 502 at the position of the virtual character 501, and render a picture of the grass being set on fire.
  • after the rendering of the battle behavior is completed, the battle control process sends a rendering end notification of the battle behavior to the first server.
  • the rendering end notification is used to inform the first server that the rendering of the battle behavior and the halo has been completed.
  • after receiving the rendering end notification, the first server generates an acquisition request for the updated environment information and sends the acquisition request to the second server to obtain the updated environment information. After acquiring the updated environment information, the first server determines, according to the updated environment information, the affinity between a participant and the updated environment, which affects the skill strength of the participant. The acquisition request is used to request the updated environment information.
  • the acquisition request may include identification information corresponding to the combat behavior, information about the land parcel affected by the halo, etc.
  • the updated environment information refers to the environmental information of the battle scene after being affected by the halo, such as the attributes of the plot after being affected by the halo.
  • the first server can obtain updated environment information corresponding to the halo based on the halo information.
  • the battle control process receives the affinity rendering animation from the first server; where the affinity rendering animation is an animation generated by the first server based on the updated environment information sent by the second server.
  • the updated environment information refers to the environmental information of the battle scene after being affected by the halo.
  • the affinity rendering animation is used to represent the affinity between the participant and the updated environmental information.
  • the affinity affects the skill intensity of the participant.
  • the environmental information may include weather information, land area information and time information.
  • the first server can combine the environment information and the attributes of the participants to determine the affinity between a participant and the environment. For example, the first server can obtain the weather potential energy and the plot potential energy corresponding to the participant, and then determine the affinity between the participant and the environment based on the weather potential energy and the plot potential energy, combined with the attribute information of the participant.
  • the weather potential energy can be determined based on weather information and time information. For example, a clear sky at night will give the "ghost" potential energy, while a clear sky in the morning will give the "light” potential energy.
  • the plot potential energy is determined by the type of plot the participant is on. For example, a plot with the grass attribute has "grass" potential energy.
  • the participant's liking or dislike of the environment falls into the following levels: strong affinity, weak affinity, indifference, weak resistance, and strong resistance.
  • If the participant's attributes are compatible with both the weather potential energy and the plot potential energy (that is, the participant likes both the plot and the weather), a strong affinity effect is obtained, and the affinity is increased by 2.
  • If the participant's attributes are compatible with only one of the weather potential energy and the plot potential energy and do not conflict with the other (that is, the participant likes only one and does not dislike the other), a weak affinity effect is obtained, and the affinity is increased by 1.
  • If the participant's attributes are compatible with only one of the weather potential energy and the plot potential energy and conflict with the other (that is, the participant likes one and dislikes the other), no effect is obtained, and the affinity is increased by 0.
  • If the participant's attributes conflict with only one of the weather potential energy and the plot potential energy and are not compatible with the other (that is, the participant dislikes only one and does not like the other), a weak resistance effect is obtained, and the affinity is reduced by 1.
  • If the participant's attributes conflict with both the weather potential energy and the plot potential energy (that is, the participant dislikes both), a strong resistance effect is obtained, and the affinity is reduced by 2.
  • If the corresponding affinity of a participant is positive, the participant's environmental affinity is deemed triggered, and the participant's combat behavior is buffed, such as increasing the range of the skill, the attack power of the skill, the attack effect of the skill, etc. If the corresponding affinity is negative, the participant is deemed resistant to the environment, and the participant's combat behavior is debuffed, such as reducing the range, attack power and attack effect of the skill. If the corresponding affinity is 0, the participant's combat behavior is not adjusted.
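The affinity rules above reduce to a simple score: each potential energy the participant likes adds 1, each it dislikes subtracts 1, and the sign of the total decides buff, debuff, or no adjustment. A minimal sketch (all names are illustrative; `"like"`/`"dislike"`/`None` stand in for however compatibility is actually encoded):

```python
# Sketch of the affinity scoring described above: +1 per liked potential
# energy, -1 per disliked one, across weather and plot. The resulting
# score of +2/+1/0/-1/-2 matches strong affinity, weak affinity,
# indifference, weak resistance, strong resistance.
def affinity_score(weather_reaction, plot_reaction):
    score = 0
    for reaction in (weather_reaction, plot_reaction):
        if reaction == "like":
            score += 1
        elif reaction == "dislike":
            score -= 1
    return score

def skill_adjustment(score: int) -> str:
    if score > 0:
        return "buff"      # environmental affinity triggered
    if score < 0:
        return "debuff"    # resistant to the environment
    return "none"          # affinity of 0: no adjustment
```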
  • the first server uses the same method as above to obtain the affinity between the participant and the updated environment information, and then generates an affinity rendering animation based on the affinity.
  • different affinities correspond to different affinity rendering animations.
  • the affinity rendering animation corresponding to strong affinity is a big smiling face
  • the affinity rendering animation corresponding to weak affinity is a smiling face
  • the affinity rendering animation corresponding to strong conflict is an angry face.
  • the affinity rendering animation may be an icon, a dynamic icon, animation, etc., which is not limited in the embodiments of the present application.
  • the first server delivers the generated affinity rendering animation to the battle control process.
  • the battle control process displays affinity rendering animation.
  • the battle control process can display the affinity rendering animation near the participant.
  • the battle control process displays, above the pet avatar 701, an affinity rendering animation 702 corresponding to the strong affinity; the affinity rendering animation 702 is a sun with a big smile.
  • the battle control process displays the affinity rendering animation 802 corresponding to the weak affinity above the pet virtual character 802, and the affinity rendering animation 802 is a smiling sun.
  • the pet avatar 901 with the earth attribute uses the earth-attribute skill 902 (such as an active impact) on the grass to attack the opponent unit. Since the relationship between the pet avatar 901 and the environment is indifference, no adjustment is made to skill 902 (that is, an ordinary impact). When the pet avatar 901 moves to rocky terrain and uses skill 902 again to attack the opponent unit, the relationship between the pet avatar 901 and the environment changes from indifference to strong affinity, so the effect of skill 902 is enhanced: for example, the power of skill 902 is increased by 50%, and a sand-and-gravel special effect is added to skill 902 (that is, an impact with a sand-and-gravel special effect and increased power).
  • the process of displaying the affinity rendering animation is implemented by the battle control process.
  • after the rendering of the battle behavior is completed, the battle control process also receives the affinity rendering animation generated based on the updated environment information exchanged between the first server and the second server, so that the affinity rendering animation can show the environmental conditions of the battle scene after being affected by the halo in a more timely manner. This not only greatly improves the authenticity of the screen display, but also enriches the display effect of the screen, improving the fun of the game; the affinity rendering animation also helps players decide the next battle move, improving the efficiency of human-computer interaction.
  • in the technical solution provided by the embodiments of this application, two servers are no longer required to separately perform environment generation for combat behaviors and environment generation for non-combat behaviors. Instead, the first server generates a small amount of halo information during combat behaviors and sends it to the second server, so that the data related to environment generation is processed by the second server even during combat. This avoids the problem of the second server needing to copy the entire environment data update to the first server, which would result in a large amount of data transmission. It not only greatly reduces the amount of data transmitted between the first server and the second server, but also, by maintaining a single copy of scene information on the second server side, avoids the redundancy problem of both servers maintaining a copy of the scene data, thereby effectively reducing server maintenance overhead and also helping to improve interface display efficiency and human-computer interaction efficiency.
  • the second server can send halo rendering information to the client for processing content associated with the world scene.
  • the scene control process provides the halo rendering information to the battle control process in the client, realizing information interaction between the battle control side and the scene control side, so that the performance of the battle control side and the scene control side remains consistent and the integration between them is improved.
  • the battle rendering tag indicates the display position and display timing of the halo, which not only reduces the amount of data transmitted during each interaction, but also makes the connection between the rendering process of the battle behavior and the rendering process of the halo natural and smooth, improving the rendering effect of the battle. In addition, by synchronizing the halo rendering between the battle control side and the scene control side, the integration between the world scene and the battle scene is further improved; while improving the authenticity of the screen display, this also solves the problem of duplicate data processing on the battle control side and the scene control side.
  • FIG 11 shows a flow chart of an information provision method based on turn-based combat provided by an embodiment of the present application.
  • the execution subject of each step of the method can be the server 320 in the solution implementation environment shown in Figure 3.
  • the method may include the following steps (steps 1101 to 1104):
  • Step 1101 The first server receives the initiation request of the battle behavior sent by the battle control process, and generates the battle rendering information corresponding to the battle behavior according to the initiation request; the battle behavior is a behavior initiated by a participant of the turn-based battle, and the battle rendering information is used to render the battle behavior.
  • the request to initiate a battle behavior is used to request the rendering of the battle behavior and to obtain the information required for rendering it, that is, the battle rendering information.
  • the initiation request may include identification information of the combat behavior, such as identification information of skills, identification information of virtual props, etc.
  • the first server can obtain data related to the combat behavior based on the identification information of the combat behavior.
  • the first server can perform logical processing on the battle behavior and generate battle rendering information by executing the battle behavior processing logic.
  • the introduction of the first server is the same as in the above embodiment and will not be repeated here.
  • the turn-based battle in the embodiment of the present application is run under the turn-based RPG mechanism provided by the embodiment of the present application.
  • the turn-based battle may refer to a turn-based battle between the above-mentioned pet avatar and the enemy unit.
  • the participants in the turn-based battle may refer to the above-mentioned pet avatar, or may refer to the enemy unit of the pet avatar. For example, when the main control virtual character encounters an enemy unit, the player can choose to start a turn-based battle.
  • the process of providing the battle screen of the turn-based battle can be as follows:
  • the second server receives the basic information corresponding to the participants in the turn-based battle and the environmental information corresponding to the battle scene sent by the scene control process.
  • the scene control process pulls the basic information corresponding to the participants and the environmental information corresponding to the battle scene, and uploads them to the second server.
  • the second server sends basic information and environment information to the first server.
  • after receiving the basic information and the environment information, the second server sends the basic information and the environment information to the first server.
  • the first server generates a battle screen for the turn-based battle based on the basic information and environmental information, and sends the battle screen for the turn-based battle to the battle control process.
  • the battle screen may refer to a screen where the pet avatar and the enemy unit battle in a battle scene.
  • the battle control process may display the battle screen of the turn-based battle in the user interface.
  • Step 1102 When the first server determines that the battle behavior triggers the halo, it sends a call request to the second server; the call request includes halo information used to characterize the halo, and the halo affects elements in the combat scene of the turn-based battle.
  • after receiving the request to initiate a battle behavior, the first server also detects whether the battle behavior triggers the halo. When it determines that the battle behavior triggers the halo, the first server generates a call request based on the halo information and sends the call request to the second server through RPC. When it determines that the battle behavior does not trigger the halo, the first server does not generate a call request and only generates the battle rendering information.
  • Step 1103 The second server generates halo rendering information according to the call request and sends the halo rendering information to the scene control process.
  • the halo rendering information is sent by the scene control process to the battle control process.
  • the halo rendering information is used to render the halo.
  • after receiving the call request, the second server determines the halo according to the halo information in the call request, performs logical processing on the halo, and generates the halo rendering information.
  • the second server sends the halo rendering information to the scene control process to forward the halo rendering information to the battle control process through the scene control process.
  • after receiving the call request, the second server first determines the plot corresponding to the halo, and then determines whether a historical halo already exists on that plot. If the plot corresponding to the halo already has a historical halo, the second server generates the halo rendering information based on the relationship between the historical halo and the halo: in the case of an overlay relationship between the historical halo and the halo, the halo rendering information is used to cancel the influence of the historical halo on the plot and add the influence of the halo on the plot; in the case of a mutually exclusive relationship between the historical halo and the halo, the halo rendering information is used to maintain the influence of the historical halo on the plot.
  • for example, adding a water attribute halo to a plot that has a fire attribute halo triggers the overlay logic, that is, the fire attribute halo and its influence on the plot are cancelled, and the water attribute halo and its influence are added.
  • conversely, adding a fire attribute halo to a plot that has a water attribute halo triggers the mutually exclusive logic, that is, the influence of the historical halo on the plot continues to be maintained.
  • in addition, the halo rendering information is generated only if the type of the plot supports the halo (for example, a plot with grass attributes supports a halo with fire attributes); when the plot does not support the halo (for example, a plot with water attributes does not support a fire attribute halo), the halo rendering information is not generated.
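The overlay, mutual-exclusion, and plot-support rules above can be sketched as follows. The attribute tables and the return shapes are illustrative assumptions inferred from the fire/water/grass examples in the text, not a specification from the patent.

```python
# Hypothetical rule tables inferred from the examples above:
# water overrides fire (overlay); fire is rejected on water (mutually exclusive);
# a halo is only generated when the plot type supports it.
OVERLAY = {("fire", "water")}    # (existing, incoming): incoming replaces existing
EXCLUSIVE = {("water", "fire")}  # incoming is rejected, existing is kept
PLOT_SUPPORT = {"grass": {"fire", "water"}, "water": {"water"}}


def resolve_halo(plot_type, existing_halo, incoming_halo):
    if incoming_halo not in PLOT_SUPPORT.get(plot_type, set()):
        return None  # plot does not support this halo: no rendering info generated
    if existing_halo is None:
        return {"apply": incoming_halo}
    if (existing_halo, incoming_halo) in OVERLAY:
        # Cancel the historical halo's influence, add the new halo's influence.
        return {"cancel": existing_halo, "apply": incoming_halo}
    if (existing_halo, incoming_halo) in EXCLUSIVE:
        # Keep the historical halo's influence on the plot.
        return {"keep": existing_halo}
    return {"apply": incoming_halo}
```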
  • in other words, the relationship between the historical halo and the halo determines how the second server generates the halo rendering information.
  • in the case of an overlay relationship, the halo rendering information generated by the second server is used to cancel the influence of the historical halo on the plot and add the influence of the halo on the plot, so that the current halo can render and display the battle situation more vividly and in a timely manner, avoiding the stale display that would result from continuing to show the historical halo;
  • in the case of a mutually exclusive relationship, the halo rendering information generated by the second server is used to maintain the influence of the historical halo on the plot, avoiding the fragmented screen display that would result from a conflict between the battle situation and the virtual scene display. By comprehensively considering the historical halo and the halo corresponding to the current battle, scene rendering is realized while maintaining the authenticity of the scene display.
  • Step 1104 After receiving the delivery success notification of the halo rendering information from the second server, the first server sends the battle rendering information to the battle control process.
  • after sending the halo rendering information to the scene control process, the second server sends a delivery success notification to the first server, informing the first server that the halo rendering information has been delivered successfully and that the battle rendering information can now be sent to the battle control process. In this way, the battle control process receives the halo rendering information and the battle rendering information at approximately the same time.
  • the battle control process can then render the halo according to the halo rendering information in the process of rendering the battle behavior according to the battle rendering information.
  • the display timing and display position of the halo can be indicated through a battle rendering tag corresponding to the battle behavior.
  • the battle rendering tag can be provided as follows:
  • when the first server determines that the battle behavior triggers the halo, it generates a battle rendering tag based on the battle behavior; the battle rendering tag is used to indicate the display timing and display position of the halo.
  • the battle rendering tag may include the identification information of the turn-based battle, the identification information of the battle behavior, the identification information of the halo, the location corresponding to the halo (such as the plot, affected characters, etc.), the rendering details of the battle behavior, etc.
  • the first server sends the battle rendering tag to the second server.
  • the first server can package the battle rendering tag and the halo information into a call request, and send the call request to the second server through RPC.
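The packaging of the battle rendering tag and halo information into a single call request can be sketched as follows. The field names follow the tag contents listed above but are illustrative assumptions, as is the use of JSON for serialization.

```python
import json


def build_call_request(battle_id, action_id, halo_id, plot, rendering_details):
    # Package the battle rendering tag together with the halo information
    # into a single call request (serialized here as JSON for illustration).
    battle_rendering_tag = {
        "battle_id": battle_id,        # identification of the turn-based battle
        "action_id": action_id,        # identification of the battle behavior
        "halo_id": halo_id,            # identification of the halo
        "position": plot,              # display position (e.g. the affected plot)
        "details": rendering_details,  # rendering details, incl. display timing
    }
    halo_info = {"halo_id": halo_id, "plot": plot}
    return json.dumps({"tag": battle_rendering_tag, "halo": halo_info})


parsed = json.loads(
    build_call_request("b1", "skill_9", "fire", "plot_A3", {"timing": "on_hit"})
)
```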
  • the second server sends the battle rendering tag to the scene control process, and the battle rendering tag is sent to the battle control process by the scene control process.
  • the above process introduces the relevant content of the battle rendering tag.
  • the battle rendering tag can more intuitively indicate the display position and display timing of the halo, which improves the pertinence of the screen rendering process, reduces the amount of data transmission that requires multiple interactions, and improves the screen rendering efficiency while improving the battle rendering effect;
  • the synchronization process between the battle control side and the scene control side further improves the integration between the world scene and the battle scene, and solves the problem of repeated data processing on the battle control side and the scene control side.
  • after receiving the call request, the second server generates the halo rendering information and sends the halo rendering information and the battle rendering tag together to the scene control process, so that both are sent to the battle control process through the scene control process.
  • the battle control process renders the halo according to the halo rendering information during the process of rendering the battle behavior according to the battle rendering information according to the display timing and display position indicated by the battle rendering tag.
  • the turn-based battle includes an intermediate round of battle and a last round of battle.
  • the intermediate round of battle refers to a battle in the turn-based battle except for the last round of battle.
  • the second server when the halo is triggered by a mid-round battle, the second server sends the halo rendering information and the battle rendering tag generated by the first server to the scene control process.
  • when the halo is triggered by the last round of battle, the second server sends the halo rendering information, the battle rendering tag generated by the first server, and the turn-based battle end notification to the scene control process.
  • the second server notifies the scene control process that the turn-based battle is about to end by sending the turn-based battle end notification, so that the scene control process can prepare for the hand-over of halo rendering.
  • that is, when the halo is triggered in an intermediate round, the second server sends the halo rendering information and the battle rendering tag to instruct the scene control process to continue the subsequent scene rendering based on this content; when the halo is triggered in the last round, considering that the battle is about to end, the second server sends the halo rendering information, the battle rendering tag, and the battle end notification to instruct the scene control process to perform scene rendering based on this content and then stop receiving data and rendering the scene, avoiding the resource waste of additional data transmission.
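The difference between the intermediate-round and last-round payloads can be sketched as follows; the field names are illustrative assumptions.

```python
def build_scene_payload(halo_rendering_info, battle_rendering_tag, is_last_round):
    # What the second server sends to the scene control process differs
    # between intermediate rounds and the last round: in the last round a
    # battle end notification is appended so the scene control process can
    # stop receiving data after rendering and prepare the halo hand-over.
    payload = {"halo": halo_rendering_info, "tag": battle_rendering_tag}
    if is_last_round:
        payload["battle_end_notification"] = True
    return payload


mid_round = build_scene_payload({"h": 1}, {"t": 1}, is_last_round=False)
last_round = build_scene_payload({"h": 1}, {"t": 1}, is_last_round=True)
```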
  • after receiving the rendering end notification of the battle behavior sent by the battle control process, the first server sends an acquisition request for updated environment information to the second server, where the updated environment information refers to the environmental information of the battle scene after being affected by the halo.
  • the rendering end notification is used to inform the first server that the rendering of the battle behavior and the halo has been completed.
  • the updated environment information acquisition request is used to request the updated environment information.
  • the environmental information may include weather information, land parcel information, and time information.
  • the second server sends updated environment information to the first server.
  • after pulling the updated environment information, the second server sends the updated environment information to the first server.
  • the first server generates an affinity rendering animation based on the updated environment information; the affinity rendering animation is used to represent the affinity between the participant and the updated environment information, and the affinity affects the skill intensity of the participant.
  • the first server can generate an affinity rendering animation based on the affinity between the participant and the environment, and the process can be as follows:
  • the first server determines the plot potential energy and weather potential energy corresponding to the participant based on the updated environment information; the plot potential energy is used to represent the impact of the type of plot on the participant, and the weather potential energy is used to represent the impact of weather and time on the participant.
  • weather potential energy can be determined based on time information and weather information.
  • the first server determines the environmental potential energy corresponding to the participant based on the participant's corresponding land potential energy and weather potential energy.
  • the environmental potential energy can include two parts: weather potential energy and land potential energy.
  • the first server generates an affinity rendering animation based on the environmental potential energy corresponding to the participant and the attribute information of the participant.
  • the first server can determine the affinity between the participant and the environment based on the environmental potential energy corresponding to the participant and the attribute information of the participant, and then generate an affinity rendering animation based on the affinity.
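A hypothetical numeric reading of the potential-energy steps above: the tables, the additive combination of plot and weather potential energy, and the multiplicative effect of affinity on skill strength are all illustrative assumptions, since the patent does not specify formulas.

```python
# Hypothetical lookup tables: plot type -> plot potential energy, and
# (weather, time of day) -> weather potential energy.
PLOT_POTENTIAL = {"grass": 1.0, "water": 0.5}
WEATHER_POTENTIAL = {("sunny", "day"): 1.2, ("rain", "night"): 0.8}


def environmental_potential(plot_type, weather, time_of_day):
    # Environmental potential energy combines plot potential energy with
    # weather potential energy (determined from weather and time information).
    plot_pe = PLOT_POTENTIAL.get(plot_type, 1.0)
    weather_pe = WEATHER_POTENTIAL.get((weather, time_of_day), 1.0)
    return plot_pe + weather_pe


def affinity(plot_type, weather, time_of_day, attribute_bonus):
    # Affinity is derived from the environmental potential energy and the
    # participant's attribute information (modeled here as a scalar bonus).
    return environmental_potential(plot_type, weather, time_of_day) * attribute_bonus


def skill_strength(base_strength, aff):
    # The affinity affects the participant's skill intensity.
    return base_strength * aff
```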
  • the method for determining the affinity is the same as that introduced in the above embodiment.
  • the first server sends the affinity rendering animation to the battle control process.
  • the above describes the process of generating an affinity rendering animation by the first server according to the affinity between the participant and the environment.
  • in this way, the first server can determine the impact of each piece of updated environment information on the participant at the moment the participant initiates the battle behavior, achieving real-time analysis of the participant, and then determine the real-time impact of the environment on the participant's skill strength.
  • This content represents the affinity between the participants and the environment.
  • the affinity rendering animation is generated based on the affinity determined in real time, so the final environmental situation after the battle scene has been affected by the halo can be displayed in a more timely manner. This not only greatly improves the authenticity of the screen display, but also enriches the display effect of the screen, making it easier for players to decide on the next battle move through the affinity rendering animation and improving the efficiency of human-computer interaction.
  • the content of generating affinity rendering animation based on updated environment information is introduced.
  • after obtaining the updated environment information that represents the battle scene after being affected by the halo, the plot potential energy representing the impact of the plot type on the participant and the weather potential energy representing the impact of weather and time on the participant are determined. By comprehensively considering the environmental potential energy composed of the plot potential energy and the weather potential energy, together with the attribute information of the participant, the various factors and the influences among them are taken into account more fully, so that a more timely and accurate affinity rendering animation reflecting the impact of the environmental potential energy on the participant's skill strength is generated. This improves the authenticity of the affinity rendering animation and, while giving players a full combat experience, ensures the unity of the battle scene and world scene experience, avoiding a sense of fragmentation between scenes and improving the efficiency of data interaction between different scenes.
  • after receiving the affinity rendering animation, the battle control process displays the affinity rendering animation in the battle scene.
  • to sum up, with the technical solution provided by the embodiments of this application, two servers are no longer required to separately perform environment generation for battle behaviors and environment generation for non-battle behaviors. Instead, during battle behaviors the first server sends a small amount of halo information to the second server, so that data related to environment generation is processed by the second server even during battle. This avoids the large data transmission that would result from the second server copying every environment data update to the first server, greatly reducing the amount of data exchanged between the first server and the second server. By maintaining a single copy of the scene information on the second server side, it also avoids the redundancy of both the first server and the second server maintaining a copy of the scene data, thereby effectively reducing server maintenance overhead and helping to improve interface display efficiency and human-computer interaction efficiency.
  • in addition, by having the first server issue the battle rendering information and the second server issue the halo rendering information, each kind of rendering information comes from a single source; the client does not need to be compatible with multiple sources for the same rendering information, which reduces the pressure on the client to process rendering information.
  • Figure 12 and Figure 13 show a flow chart of an interface display method based on turn-based battles provided by another embodiment of the present application.
  • the execution subject of each step of the method can be the terminal 310 or the server 320 in the implementation environment of the solution shown in Figure 1; the method may include the following steps (steps 1201 to 1219):
  • Step 1201 The scene control process displays the picture corresponding to the world scene.
  • Step 1202 When the scene control process determines to start the turn-based battle, it pulls the basic information corresponding to the participants of the turn-based battle and the environmental information corresponding to the battle scene, and sends the basic information and environmental information to the second server.
  • Step 1203 The second server sends basic information and environment information to the first server.
  • Step 1204 The first server generates a battle screen corresponding to the battle scene based on the basic information and environmental information, and sends the battle screen corresponding to the battle scene to the battle control process.
  • Step 1205 The battle control process displays the battle screen of the battle scene.
  • Step 1206 When detecting that a participant initiates a battle behavior, the battle control process sends a request to initiate a battle behavior to the first server.
  • Step 1207 After receiving the initiation request of the combat behavior, the first server generates combat rendering information corresponding to the combat behavior according to the initiation request, and when it is determined that the combat behavior triggers the halo, sends a calling request to the second server.
  • the call request includes halo information used to characterize the halo and a battle rendering tag used to indicate the display timing and display position of the halo.
  • Step 1208 The second server generates the halo rendering information according to the call request, sends the halo rendering information and the battle rendering tag to the scene control process, and sends a delivery success notification of the halo rendering information to the first server.
  • the halo rendering information is used to render the halo.
  • Step 1209 The scene control process caches the halo rendering information and the battle rendering tag, and sends the halo rendering information and the battle rendering tag to the battle control process.
  • Step 1210 After receiving the delivery success notification of the halo rendering information, the first server sends the battle rendering information to the battle control process.
  • Step 1211 The battle control process renders the halo according to the halo rendering information during the process of rendering the battle behavior according to the battle rendering information according to the display timing and display position indicated by the battle rendering tag.
  • Step 1212 The battle control process sends a rendering end notification of the battle behavior to the first server.
  • Step 1213 The first server sends an acquisition request for updated environment information to the second server; where the updated environment information refers to the environmental information of the battle scene after being affected by the halo.
  • Step 1214 The second server sends updated environment information to the first server.
  • Step 1215 The first server generates an affinity rendering animation based on the updated environment information and sends it to the battle control process.
  • Step 1216 The battle control process displays the affinity rendering animation.
  • Step 1217 After determining that the turn-based battle is over, the battle control process stops running.
  • Step 1218 After determining that the turn-based battle is over, the scene control process renders the halo according to the cached halo rendering information.
  • Step 1219 After the threshold duration, the scene control process cancels the rendering and display of the halo and resumes displaying the picture corresponding to the world scene.
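Steps 1209 and 1217 to 1219 (the scene control process caches the halo rendering information during the battle, renders it once the battle is over, and cancels it after a threshold duration) can be sketched as follows; the return strings and parameter names are illustrative.

```python
def scene_control_display(cache, battle_over, elapsed, threshold):
    # While the battle runs, the battle control process owns the display and
    # the scene control process only caches the halo rendering information.
    if not battle_over:
        return "battle screen"
    # After the battle ends, render the cached halo over the world scene
    # until the threshold duration passes, then restore the world picture.
    if elapsed < threshold:
        return f"world scene + halo {cache['halo']}"
    return "world scene"


during = scene_control_display({"halo": "fire"}, battle_over=False, elapsed=0, threshold=5)
after = scene_control_display({"halo": "fire"}, battle_over=True, elapsed=2, threshold=5)
restored = scene_control_display({"halo": "fire"}, battle_over=True, elapsed=6, threshold=5)
```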
  • the leftmost vertical process is the inter-round cycle main process of the battle control side, and its interaction with the scene control side mainly occurs at two points in time.
  • the first time point is after the player selects a skill, when it is determined whether the skill triggers a halo. If the skill triggers a halo, the battle control side needs to interact with the scene control side: it sends a request for halo rendering information to the scene control side, and then, depending on whether the halo is added successfully, decides whether to modify the virtual world (such as a plot), after which the process on the scene control side ends.
  • the second time point is before the turn-based battle ends and after the rendering of skills and halos is determined to be complete, when the first server on the battle control side pulls updated environment information from the second server on the scene control side, generates an affinity rendering animation based on the updated environment information, and sends it to the battle control process on the battle control side for display.
  • in summary, the technical solution provided by the embodiments of this application supports implementing the battle behavior processing logic through the battle control process and the first server to obtain the battle rendering information, and supports implementing the halo processing logic through the scene control process and the second server to obtain the halo rendering information. There is no need to maintain both the battle behavior processing logic and the halo processing logic on each of the battle control side and the scene control side, and only one copy of the scene information needs to be maintained on the scene control side, thereby reducing the redundancy of business logic and data maintenance in turn-based battles and reducing maintenance overhead.
  • FIG. 15 shows a block diagram of an interface display device for turn-based combat provided by an embodiment of the present application.
  • the device has the function of implementing the above method example, and the function can be implemented by hardware, or can be implemented by hardware executing corresponding software.
  • the device may be the computer equipment introduced above, or may be provided in the computer equipment.
  • the device 1500 includes: a battle control module 1501 and a scene control module 1502.
  • the battle control module 1501 is configured to, when detecting that a participant in a turn-based battle initiates a battle behavior, send an initiation request for the battle behavior to the first server, where the first server is used to process the battle behavior.
  • the battle control module 1501 is also configured to receive battle rendering information from the first server; wherein the battle rendering information is used to render the battle behavior.
  • the scene control module 1502 is used to receive halo rendering information from a second server, where the second server is used to process the world environment of the virtual world; the halo rendering information is generated by the second server according to the halo information sent by the first server.
  • the halo information is used to represent the halo triggered by the battle behavior. The halo affects elements in the battle scene of the turn-based battle.
  • the battle control module 1501 is also configured to render the halo according to the halo rendering information during the process of rendering the battle behavior based on the battle rendering information.
  • the scene control module 1502 is also configured to receive a battle rendering tag from the second server through the scene control process; the battle rendering tag is generated by the first server based on the battle behavior when the battle behavior triggers the halo.
  • the scene control module 1502 is also used to send the battle rendering tag to the battle control process through the scene control process.
  • the battle control module 1501 is also configured to, through the battle control process, render the halo according to the halo rendering information during the process of rendering the battle behavior based on the battle rendering information, according to the display timing and display position indicated by the battle rendering tag.
  • the scene control module 1502 is also used to:
  • the scene control process renders the halo according to the cached halo rendering information.
  • the scene control module 1502 is also configured to obtain the basic information corresponding to the participants and the environment corresponding to the battle scene through the scene control process when entering the turn-based battle. information, and send the basic information and the environment information to the second server.
  • the battle control module 1501 is also configured to receive the battle screen of the turn-based battle from the first server through the battle control process; the battle screen is generated by the first server according to the basic information and the environment information sent by the second server.
  • the battle control module 1501 is also used to display the battle screen through the battle control process.
  • the battle control module 1501 is also configured to send a rendering end notification of the battle behavior to the first server through the battle control process.
  • the battle control module 1501 is also configured to receive the affinity rendering animation from the first server through the battle control process; the affinity rendering animation is an animation generated by the first server based on the updated environment information sent by the second server.
  • the updated environment information refers to the environmental information of the battle scene after being affected by the halo.
  • the affinity rendering animation is used to represent the affinity between the participant and the updated environment information, and the affinity affects the skill strength of the participant.
  • the battle control module 1501 is also configured to display the affinity rendering animation through the battle control process.
  • to sum up, the technical solution provided by the embodiments of the present application supports implementing the battle behavior processing logic through the first server to obtain the battle rendering information, and supports implementing the halo processing logic through the second server to obtain the halo rendering information. Because the above process maintains only one copy of the scene information on the scene control side, it avoids maintaining a copy of the battle behavior processing logic and the halo processing logic on each of the battle control side and the scene control side, thereby significantly reducing the redundancy of business logic and data maintenance in turn-based battles and reducing maintenance overhead.
  • Figure 16 shows a structural block diagram of a computer system provided by an embodiment of the present application.
  • the computer system 1600 includes: a first terminal 120, a server 140, a second terminal 160 and a third terminal 180.
  • the first terminal 120 has an application program supporting the virtual world installed and running.
  • the application can be any one of a three-dimensional map program, a virtual reality application, an augmented reality program, an RPG program, a turn-based game program, and a turn-based RPG program.
  • the first terminal 120 is a terminal used by the first user. The first user uses the first terminal 120 to control the first virtual character located in the virtual world to perform activities.
  • the first terminal 120 is connected to the server 140 through a wireless network or a wired network.
  • the server 140 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 140 includes a processor 144 and a memory 142.
  • the memory 142 further includes a receiving module 1421, a control module 1422 and a sending module 1423.
  • the receiving module 1421 is used to receive requests sent by the client, such as a request to detect the location of an enemy virtual character; the control module 1422 is used to control the rendering of the virtual world screen; the sending module 1423 is used to send a response to the client, such as sending the location of the third virtual character to the client.
  • the server 140 is used to provide background services for applications that support the three-dimensional virtual world.
  • the second terminal 160 has an application program supporting the virtual world installed and running.
  • the second terminal 160 is a terminal used by the second user; the second user uses the second terminal 160 to control the second virtual character located in the virtual world to perform activities, and the second virtual character also serves as the master virtual character.
  • the third terminal 180 installs and runs an application program that supports the virtual world.
  • the third terminal 180 is a terminal used by a third user.
  • the third user uses the third terminal 180 to control the third virtual character located in the virtual world to perform activities.
  • the first virtual character, the second virtual character and the third virtual character are in the same virtual world.
  • the first avatar and the second avatar belong to different camps, and the second avatar and the third avatar belong to the same camp.
  • the application programs installed on the first terminal 120, the second terminal 160 and the third terminal 180 are the same, or the applications installed on the three terminals are the same type of application on different operating system platforms (Android or iOS).
  • the first terminal 120 may generally refer to one of multiple terminals
  • the second terminal 160 may generally refer to one of multiple terminals
  • the third terminal 180 may generally refer to one of multiple terminals.
  • This embodiment only uses the first terminal 120, the second terminal 160 and the third terminal 180 for illustration.
  • the device types of the first terminal 120 , the second terminal 160 and the third terminal 180 are the same or different.
  • the device types include at least one of: smart phone, smart watch, smart TV, tablet computer, e-book reader, MP3 player, MP4 player, laptop computer, and desktop computer.
  • the following embodiments take the terminal including a smart phone as an example.
  • the number of the above terminals may be more or less. For example, there may be only one terminal, or there may be dozens, hundreds, or more terminals. The embodiments of this application do not limit the number of terminals and device types.
  • FIG 17 shows a structural block diagram of a computer device 1700 provided by an embodiment of the present application.
  • the computer device 1700 can be a portable mobile terminal, such as a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player.
  • the computer device 1700 may also be called a user device, a portable terminal, or other names.
  • computer device 1700 includes: processor 1701 and memory 1702.
  • the processor 1701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, etc.
  • Memory 1702 may include one or more computer-readable storage media, which may be tangible and non-transitory.
  • the computer device 1700 optionally further includes a peripheral device interface 1703 and at least one peripheral device.
  • the peripheral device includes: at least one of a radio frequency circuit 1704, a touch display screen 1705, a camera 1706, an audio circuit 1707, and a power supply 1708.
  • the computer device 1700 also includes one or more sensors 1709.
  • the one or more sensors 1709 include, but are not limited to: acceleration sensor 1710, gyro sensor 1711, pressure sensor 1712, optical sensor 1713, and proximity sensor 1714.
  • the structure shown in FIG. 17 does not constitute a limitation on the computer device 1700, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
  • a computer-readable storage medium is also provided.
  • a computer program is stored in the storage medium. When executed by a processor, the computer program implements the above-mentioned interface display method based on turn-based battles, or the above-mentioned information providing method based on turn-based battles.
  • the computer-readable storage medium may include: ROM (Read-Only Memory), RAM (Random-Access Memory), SSD (Solid State Drives, solid state drive) or optical disk, etc.
  • random access memory can include ReRAM (Resistance Random Access Memory, resistive random access memory) and DRAM (Dynamic Random Access Memory, dynamic random access memory).
  • a computer program product or computer program is also provided, the computer program product or computer program including computer instructions stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the above-mentioned interface display method based on turn-based battles, or the above-mentioned information providing method based on turn-based battles.
  • the information involved in this application includes but is not limited to subject device information, subject personal information, etc.
  • the data involved includes but is not limited to data used for analysis, stored data, displayed data, etc.
  • the information, data and signals involved in this application are all authorized by the subject or fully authorized by all parties, and the collection, use and processing of the relevant data comply with the relevant laws, regulations and standards of the relevant regions.
  • the virtual characters, operations, world scenes, battle scenes, etc. involved in this application were all obtained with full authorization.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An interface display method based on turn-based battles, comprising: when it is detected that a participant in a turn-based battle initiates a battle action, sending an initiation request for the battle action to a first server (321); receiving battle rendering information from the first server (321); receiving halo rendering information from a second server (322), the halo rendering information being generated by the second server (322) according to halo information sent by the first server (321); sending the halo rendering information to a battle control process; and, while the battle action is being rendered on the basis of the battle rendering information, rendering the halo according to the halo rendering information. Also provided are an information providing method based on turn-based battles, an interface display apparatus, a computer system, a computer device, a computer-readable storage medium, and a computer program product. The method greatly reduces the amount of data exchanged between the first server and the second server, effectively reduces server maintenance overhead, and helps improve interface display efficiency and human-computer interaction efficiency.

Description

基于回合制对战的界面显示方法、信息提供方法及系统
本申请要求于2022年8月19日提交的申请号为202211000457.3、发明名称为“基于回合制对战的界面显示方法、信息提供方法及系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及计算机技术领域,特别涉及一种基于回合制对战的界面显示方法、信息提供方法及系统。
背景技术
回合制角色扮演游戏(Role-Playing Game,RPG)是指一种采用回合制战斗策略的游戏。在回合制角色扮演游戏中,玩家可以在写实世界或虚拟世界中扮演一个主控虚拟角色,并可以利用主控虚拟角色或主控虚拟角色所拥有的宠物虚拟角色,与敌方单位(如游戏中的NPC(Non-Player Character,非玩家角色)、AI(Artificial Intelligence,人工智能)控制的怪物,或其他角色捕捉的宠物虚拟角色等)进行回合制战斗。
在相关技术中,回合制角色扮演游戏提供有两种完全不同的地图:世界地图和战斗地图。当主控虚拟角色在世界地图中进行非战斗活动(如:游玩、捕捉宠物虚拟角色、收集宝箱、收集虚拟道具等)时,世界地图对应的世界场景(或称为非战斗场景)由场景服务器维护执行,场景服务器自身运行有用于生成场景的环境数据;在战斗场景下,当主控虚拟角色在战斗地图中进行战斗活动(如:控制已捕捉的宠物虚拟角色,与敌方单位进行回合制战斗)时,战斗地图对应的战斗场景由战斗服务器执行,战斗服务器需要从场景服务器中将用于渲染场景的环境数据进行复制,并基于复制后得到的环境数据生成带有战斗活动所产生的光环的场景。
然而,相关技术中针对世界场景的场景服务器和战斗场景的战斗服务器,均需要维护一份环境数据,当世界场景中存在变化后,若主控虚拟角色再次在战斗地图中进行战斗活动,场景服务器需将更新后的环境数据再次复制至战斗服务器,以便战斗服务器基于更新后的环境数据生成带有战斗活动所产生的光环的场景,使得场景服务器和战斗服务器之间存在业务逻辑和数据维护上的冗余,存在极大的数据处理量和数据传输量,从而容易影响环境生成效率。
发明内容
本申请实施例提供了一种基于回合制对战的界面显示方法、信息提供方法及系统,能够降低第一服务器和第二服务器之间的数据交互传输量,也能够避免第一服务器和第二服务器均维护一份场景数据的冗余问题,在降低服务器维护开销的同时提高客户端的界面显示效率。所述技术方案如下:
根据本申请实施例的一个方面,提供了一种基于回合制对战的界面显示方法,所述方法由客户端执行,所述方法包括:
在检测到回合制对战的参与方发起对战行为的情况下,向第一服务器发送所述对战行为的发起请求,所述第一服务器用于对所述对战行为进行处理;
接收来自所述第一服务器的对战渲染信息;其中,所述对战渲染信息用于对所述对战行为进行渲染;
接收来自第二服务器的光环渲染信息,所述第二服务器用于对虚拟世界的世界环境进行处理;其中,所述光环渲染信息是所述第二服务器根据所述第一服务器发送的光环信息生成的信息,所述光环信息用于表征所述对战行为所触发的光环,所述光环对所述回合制对战的战斗场景中的元素产生影响;
在基于所述对战渲染信息对所述对战行为进行渲染的过程中,根据所述光环渲染信息对 所述光环进行渲染。
根据本申请实施例的一个方面,提供了一种基于回合制对战的信息提供方法,所述方法由服务器执行,所述服务器包括第一服务器和第二服务器,所述方法包括:
所述第一服务器接收对战控制进程发送的对战行为的发起请求,根据所述发起请求生成所述对战行为对应的对战渲染信息;其中,所述对战行为是回合制对战的参与方发起的行为,所述对战渲染信息用于对所述对战行为进行渲染;
所述第一服务器在确定所述对战行为触发光环的情况下,向所述第二服务器发送调用请求;其中,所述调用请求包括用于表征所述光环的光环信息,所述光环对所述回合制对战的战斗场景中的元素产生影响;
所述第二服务器根据所述调用请求生成光环渲染信息,向场景控制进程发送所述光环渲染信息,所述光环渲染信息由所述场景控制进程发送给所述对战控制进程,所述光环渲染信息用于对所述光环进行渲染;
所述第一服务器在接收到来自所述第二服务器的所述光环渲染信息的下发成功通知之后,向所述对战控制进程发送所述对战渲染信息。
根据本申请实施例的一个方面,提供了一种基于回合制对战的界面显示装置,所述装置包括:
对战控制模块,用于在检测到回合制对战的参与方发起对战行为的情况下,向第一服务器发送所述对战行为的发起请求,所述第一服务器用于对所述对战行为进行处理;
所述对战控制模块,还用于接收来自所述第一服务器的对战渲染信息;其中,所述对战渲染信息用于对所述对战行为进行渲染;
场景控制模块,用于接收来自第二服务器的光环渲染信息,所述第二服务器用于对虚拟世界的世界环境进行处理;其中,所述光环渲染信息是所述第二服务器根据所述第一服务器发送的光环信息生成的信息,所述光环信息用于表征所述对战行为所触发的光环,所述光环对所述回合制对战的战斗场景中的元素产生影响;
所述对战控制模块,还用于在基于所述对战渲染信息对所述对战行为进行渲染的过程中,根据所述光环渲染信息对所述光环进行渲染。
根据本申请实施例的一个方面,提供了一种计算机系统,所述计算机系统包括客户端和服务器,所述客户端用于执行上述基于回合制对战的界面显示方法,所述服务器用于执行上述基于回合制对战的信息提供方法。
根据本申请实施例的一个方面,提供了一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有计算机程序,所述计算机程序由所述处理器加载并执行以实现上述基于回合制对战的界面显示方法,或者实现上述基于回合制对战的信息提供方法。
所述计算机设备包括终端设备和服务器。
根据本申请实施例的一个方面,提供了一种计算机可读存储介质,所述可读存储介质中存储有计算机程序,所述计算机程序由处理器加载并执行以实现上述基于回合制对战的界面显示方法,或者实现上述基于回合制对战的信息提供方法。
根据本申请实施例的一个方面,提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述基于回合制对战的界面显示方法,或者执行上述基于回合制对战的信息提供方法。
本申请实施例提供的技术方案可以包括如下有益效果:
当客户端检测到回合制对战的参与方发起对战行为的情况下,客户端向用于处理对战行为的第一服务器发送发起请求,并接收第一服务器发送的用于对对战行为进行渲染的对战渲染信息;此外,第一服务器会将表征对战行为所触发光环的光环信息发送至用于处理世界环境的第二服务器,从而接收第二服务器基于光环信息生成的光环渲染信息,并综合对战渲染信息和光环渲染信息实现界面显示过程。也即:不再由两个服务器分别执行对战行为的环境生成和非对战行为的环境生成,而是在对战行为下由第一服务器将少量的光环信息发送至第二服务器,从而即便在对战行为下也是通过第二服务器处理与环境生成相关的数据,避免第二服务器需要将环境数据整体更新复制至第一服务器而导致数据传输量较大的问题,不仅大大降低了第一服务器和第二服务器之间的数据交互传输量,也能够通过在第二服务器侧维护一份场景信息的方式,避免第一服务器和第二服务器均维护一份场景数据的冗余问题,进而有效降低了服务器的维护开销,也有利于提高界面显示效率,提高人机交互效率。
另外,通过支持用于处理对战行为的第一服务器下发对战渲染信息,通过用于处理世界环境的第二服务器下发光环渲染信息,避免了第一服务器和第二服务器均要下发对战渲染信息或光环渲染信息,而导致客户端在处理相同渲染信息的时候需要兼容多种渲染信息来源的问题,从而降低了客户端处理渲染信息的压力。
附图说明
图1是本申请一个实施例提供的世界场景的示意图;
图2是本申请一个实施例提供的战斗场景的示意图;
图3是本申请一个实施例提供的方案实施环境的示意图;
图4是本申请一个实施例提供的基于回合制对战的界面显示方法的流程图;
图5是本申请一个实施例提供的战斗场景下的光环的示意图;
图6是本申请一个实施例提供的世界场景下的光环的示意图;
图7至图8示例性示出了亲和度渲染动画的示意图;
图9至图10示例性示出了环境影响技能的示意图;
图11是本申请一个实施例提供的基于回合制对战的信息提供方法的流程图;
图12至图13是本申请另一个实施例提供的基于回合制对战的界面显示方法的流程图;
图14是本申请一个实施例提供的对战控制侧与场景控制侧之间的交互的示意图;
图15是本申请一个实施例提供的基于回合制对战的界面显示装置的框图;
图16是本申请一个实施例提供的计算机系统的结构框图;
图17是本申请一个实施例提供的计算机设备的结构框图。
具体实施方式
在对本申请实施例进行介绍说明之前,首先对本申请中涉及的相关名词进行解释说明。
1、虚拟世界:是应用程序在终端上运行时显示(或提供)的虚拟世界。该虚拟世界可以是对真实世界的仿真环境,也可以是半仿真半虚构的环境,还可以是纯虚构的环境。虚拟世界可以是二维虚拟世界、2.5维虚拟世界和三维虚拟世界中的任意一种,本申请对此不加以限定。下述实施例以虚拟世界是三维虚拟世界来举例说明。
2、主控虚拟角色:是指玩家在虚拟世界中扮演的可活动对象。主控虚拟角色可以是虚拟人物、虚拟动物、动漫人物等,比如:在三维虚拟世界中显示的人物、动物。可选地,主控虚拟角色是基于动画骨骼技术创建的三维立体模型。每个主控虚拟角色在三维虚拟世界中具有自身的形状和体积,占据三维虚拟世界中的一部分空间。
3、宠物虚拟角色:是指由人工智能在虚拟世界中控制的可活动对象。宠物虚拟角色可以是虚拟生物、虚拟动物、虚拟怪物、虚拟精灵、虚拟宠物等。
4、世界地图:世界地图包括多个地块。每个地块是一块多边形地块。该多边形地块为正方形、长方形、六边形中的任意一种。例如,每个地块是50厘米×50厘米的正方形。每一个地块都拥有自己的地表属性。地表属性包括草、石头、水等。另外,世界地图所包括的多个地块,可以采用同一种类型的地块,也可以采用多种不同类型地块的组合。
5、战斗地图:参考图1和图2所示,在虚拟环境10(即虚拟世界)中,当第一宠物虚拟角色12在世界地图中的某个地点遭遇第二宠物虚拟角色14进入战斗时,将世界地图中以第一宠物虚拟角色12所确定的参考位置为中心的一定范围内的一个或多个地块,确定为战斗地图16。该参考位置是第一宠物虚拟角色12所在的位置,或者,离第一宠物虚拟角色12最近的合适战斗位置。在一些实施例中,该战斗地图16包括以参考位置为中心,预定长度为半径的圆形范围内的所有地块;在一些实施例中,该战斗地图16包括以参考位置为中心,预定长度和宽度的矩形范围内的所有地块。
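The dynamic carve-out described above — a battle map formed from the world-map tiles that fall within a predetermined radius of the reference position — can be sketched as follows. The grid layout, the 50 cm tile size and the coordinate convention are illustrative assumptions, not details from the application:

```python
import math

def battle_map_tiles(center, radius, tile_size=0.5):
    """Return the grid coordinates of every world-map tile whose centre lies
    within `radius` metres of the reference position `center` (x, y)."""
    cx, cy = center
    span = int(math.ceil(radius / tile_size))
    col0, row0 = int(cx // tile_size), int(cy // tile_size)
    tiles = []
    for col in range(col0 - span, col0 + span + 1):
        for row in range(row0 - span, row0 + span + 1):
            # centre of this tile in world coordinates
            tx, ty = (col + 0.5) * tile_size, (row + 0.5) * tile_size
            if math.hypot(tx - cx, ty - cy) <= radius:
                tiles.append((col, row))
    return tiles
```

For the rectangular variant also mentioned above, the distance test would simply be replaced by width and height bounds around the reference position.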
6、世界场景:世界场景是指世界地图所对应的场景,当用户界面中显示世界场景时,用户界面中可以显示世界地图中的一个或多个地块。例如,用户界面中显示主控虚拟角色或宠物虚拟角色在世界地图中当前所处的一个或多个地块,以及与上述显示地块、主控虚拟角色、宠物虚拟角色等相关的一些界面元素。本申请实施例中的元素可用于组成场景(如世界场景),诸如地块、主控虚拟角色、宠物虚拟角色等可视元素。
7、战斗场景:战斗场景是指战斗地图所对应的场景,当用户界面中显示战斗场景时,用户界面中可以显示战斗地图,如显示战斗地图所包含的全部或部分地块。例如,用户界面中显示宠物虚拟角色或宠物虚拟角色在战斗地图中当前所处的一个或多个地块,以及与上述显示地块、主控虚拟角色、宠物虚拟角色等相关的一些界面元素。
世界场景和战斗场景可以切换,例如从世界场景切换至战斗场景,也可以从战斗场景切换至世界场景。
在显示世界场景和战斗场景时,虚拟摄像机可以采用不同的拍摄视角。可选地,在世界场景和战斗场景中,除了上述拍摄视角有所不同之外,所允许的用户操作也可以有所不同。通过上述方式,可以让用户从感知上对世界场景和战斗场景做一个区分,但是由于世界场景和战斗场景共用相同的世界地图,战斗场景所使用的战斗地图是世界地图中的一个或多个地块,因此世界场景和战斗场景的切换并不会带来强烈的撕裂感,而是非常平滑自然的切换。
8、势能:虚拟世界中对战斗产生影响的属性或标识。势能包括草系、火系、水系、石系、冰系、电系、毒系、光系、幽灵系、恶魔系、普通系、武系、萌系、幻系、虫系、翼系、龙系、机械系中的至少一种。
9、光环:在虚拟世界中对一定范围内的地块、主控虚拟角色、宠物虚拟角色中的至少一种可视元素存在影响能力的抽象元素。该抽象元素区别于本申请实施例中的元素,其可特用于描述光环,使得光环具体化、可视化。光环是虚拟世界中持续出现、随机出现、跟随主控虚拟角色出现、跟随宠物虚拟角色出现、由技能触发出现或由道具触发出现的抽象元素。光环在虚拟世界中是不可见或可见的,例如,火焰光环、光系光环、加血光环、Buff光环等等。
10、战斗过程改变世界环境:在宠物虚拟角色之间进行回合制对战时,宠物虚拟角色释放的技能会对虚拟世界的环境产生影响。例如,宠物虚拟角色在战斗场景中进行对战,宠物虚拟角色释放了一个火技能,被击中的草地地块会被点燃。可选地,在战斗场景中,该技能显示效果(即光环)可以是由对战控制进程通过光环渲染接口完成渲染的,在世界场景中,该技能显示效果(即光环)可以是由场景控制进程通过该光环渲染接口完成渲染的。其中,战斗场景下的光环可以是永久显示,世界场景下的光环可以在持续显示阈值时长之后取消显示,光环对应的地块恢复原貌。
进一步地,当地块的环境发生改变之后,又会对宠物虚拟角色产生影响。
11、世界环境改变战斗过程:在宠物虚拟角色之间进行回合制对战时,虚拟世界中的环境会对宠物虚拟角色产生影响,比如对宠物虚拟角色的技能伤害产生影响,或对宠物虚拟角色的技能显示效果产生影响。示例性地,虚拟世界中的环境包括地块的环境和天气的环境,这两方面会综合影响宠物虚拟角色对于环境的厌恶和喜欢程度。示例性地,宠物虚拟角色对环境的厌恶和喜欢程度包括如下几个不同等级:强亲和、弱亲和、无感、弱抵触、强抵触。
如果宠物虚拟角色对2个环境(地块+天气)均喜欢,则获得强力亲和效果;如果宠物虚拟角色只喜欢1个环境,另一个环境不厌恶,则获得较弱亲和效果;如果宠物虚拟角色喜欢1个环境同时厌恶另一个环境,则不会获得任何效果;如果宠物虚拟角色只厌恶1个环境, 另一个环境不喜欢,则获得较弱抵触效果;如果宠物虚拟角色对2个环境均厌恶,则获得强抵触效果。
在对战过程中,服务器或客户端需要定期(如每回合)获取环境,并确定环境对宠物虚拟角色所产生的影响。
12、回合制战斗中的位置变化:传统的回合制战斗中,我方宠物虚拟角色和敌方宠物虚拟角色在战斗地图中的站立位置是固定不变的,也即站桩。在本申请实施例中,宠物虚拟角色之间进行对战时,不论是攻击方还是被攻击方,都是非站桩的,即会产生位移。
宠物虚拟角色作为攻击方时,会产生主动位移。如果宠物虚拟角色当前所处的第一位置满足技能释放条件,则宠物虚拟角色可以在该第一位置原地释放技能。如果上述第一位置不满足技能释放条件,则在技能释放之前,会控制宠物虚拟角色从第一位置主动移动到满足技能释放条件的第二位置,宠物虚拟角色在该第二位置释放技能。上述第二位置可以称为合法战斗点,是指满足宠物虚拟角色的技能释放条件的位置。可选地,宠物虚拟角色在释放技能完成之后,会移动到第三位置,该第三位置可以和上述第一位置相同,也可以不同。上述第三位置可以称为合法站立点,是指宠物虚拟角色在释放技能完成之后所处的位置。另外,上述技能释放条件可以与宠物虚拟角色、技能、环境等因素有关。
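The active-displacement rule above — cast in place if the current position satisfies the skill's release condition, otherwise move to a legal battle point first — can be sketched like this. The function names and the nearest-point tie-break are assumptions; the release condition itself, as the text notes, depends on the pet, the skill and the environment:

```python
import math

def position_for_skill(current_pos, satisfies_condition, candidate_points):
    """Where the attacking pet casts its skill: in place if the current
    position already satisfies the release condition, otherwise the nearest
    legal battle point; None if no legal battle point exists."""
    if satisfies_condition(current_pos):
        return current_pos
    legal = [p for p in candidate_points if satisfies_condition(p)]
    if not legal:
        return None
    return min(legal, key=lambda p: math.dist(current_pos, p))
```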
宠物虚拟角色作为受击方时,会产生被动位移,在受到技能攻击时,会有被击退的位移过程。例如,宠物虚拟角色当前处于第一位置,在第一位置受到技能攻击,宠物虚拟角色可以从第一位置移动至第四位置,第四位置可以根据作为受击方的宠物虚拟角色、作为攻击方的宠物虚拟角色、受到的技能、环境等因素确定。
另外,随着宠物虚拟角色的位置发生变化,其所处的环境(包括所处地块的环境、天气的环境)也有可能发生变化,从而对宠物虚拟角色后续回合的战斗产生影响。
请参考图3,其示出了本申请一个实施例提供的方案实施环境的示意图。该方案实施环境可以实现成为计算机系统的架构,该实施环境可以包括:终端设备310和服务器320。
终端设备310可以是诸如手机、平板电脑、游戏主机、多媒体播放设备、PC(Personal Computer,个人计算机)等电子设备。终端设备310中可以安装应用程序的客户端,诸如游戏类应用程序、模拟学习类应用程序、虚拟现实(Virtual Reality,VR)类应用程序、增强现实(Augmented Reality,AR)类应用应用程序、社交类应用程序、互动娱乐类应用程序等的客户端。
以回合制RPG为例,参考图3,终端设备310中安装运行有回合制RPG的客户端;服务器320用于为终端设备310中的应用程序(如游戏类应用程序)的客户端提供后台服务。例如,服务器320可以是上述应用程序(如游戏类应用程序)的后台服务器。服务器320可以是一台服务器,也可以是由多台服务器组成的服务器集群,或者是一个云计算服务中心。
以回合制RPG为例,参考图3,服务器320包括第一服务器321和第二服务器322。其中,第一服务器321中运行有对战行为处理逻辑和亲和度处理逻辑,以用于对对战操作、技能等进行逻辑处理,以及用于计算宠物虚拟角色的环境亲和度(简称亲和度)。
第二服务器322中运行环境处理逻辑和光环处理逻辑,以对虚拟世界的世界环境进行逻辑处理,以及对对战行为对应的光环进行逻辑处理。
终端设备310和服务器320之间可通过网络330进行互相通信。该网络330可以是有线网络,也可以是无线网络。
示例性地,参考图3,在战斗场景中,终端设备310从第一服务器321获取对战渲染信息,对战渲染信息用于对对战行为进行渲染;此外,终端设备310从第二服务器322获取光环渲染信息,光环渲染信息用于对光环进行渲染;之后,终端设备310根据对战渲染信息和光环渲染信息进行显示与渲染。
在相关技术的回合制RPG中,玩家在写实世界或虚拟世界中扮演一个虚拟角色。该回合制RPG提供两种类型的地图:世界地图和战斗地图。在非战斗场景下,虚拟角色在世界地图中活动,比如游玩、捕捉宠物虚拟角色、收集宝箱、收集虚拟道具等;在战斗场景下,虚拟角色在战斗地图中控制已捕捉的宠物虚拟角色,与敌方单位(如游戏中的NPC、AI控制的怪物,或其他角色捕捉的宠物虚拟角色等)进行回合制战斗。
在相关技术中,由于世界地图和战斗地图是两个完全不同的地图,因此在世界场景(或称为非战斗场景)和战斗场景之间进行切换时,用户界面中会显示差异性较大的地图内容,玩家可以明显感受到两个不同地图的差异,存在强烈的撕裂感。相关技术为了缓解这种撕裂感,往往会在切换时显示一个过渡动画,但效果仍然不佳。
在本申请实施例中,提供了一种创新的回合制RPG机制。该回合制RPG中将传统的世界地图和战斗地图合二为一。战斗地图是每次战斗时,从世界地图中动态确定出来的一块子地图。这样在世界场景(或称为非战斗场景)和战斗场景之间进行切换时,用户界面中显示的地图内容不会存在巨大差异,从而避免相关技术所存在的撕裂感。并且,该回合制RPG还允许虚拟世界中的环境(天气、时间、地块等)对主控虚拟角色、宠物虚拟角色以及战斗过程进行影响,反之主控虚拟角色、宠物虚拟角色以及战斗过程也会对虚拟世界中的环境进行影响,从而将回合制战斗过程有机融入虚拟世界中,不再是撕裂的两部分,而是形成一个整体。
该回合制RPG的对战过程可以是单打,或双打,或团战,本申请实施例对此不作限定。示例性地,该对战过程可以如下:
1.选择要出战的宠物虚拟角色。
2.显示对战场景,选择宠物虚拟角色要使用的技能。
3.控制触摸屏上的触摸控件,释放技能。
4.显示技能动画效果。
请参考图4,其示出了本申请一个实施例提供的基于回合制对战的界面显示方法的流程图,该方法各步骤的执行主体可以是图3所示方案实施环境中的终端设备310,如终端设备310中安装运行的应用程序的客户端,该方法可以包括如下几个步骤(步骤401~步骤404):
步骤401,在检测到回合制对战的参与方发起对战行为的情况下,向第一服务器发送对战行为的发起请求。
在一个可选的实施例中,客户端中运行有对战控制进程,对战控制进程在检测到参与方发起对战行为的情况下,向第一服务器发送对战行为的发起请求。
在本申请实施例中,对战控制进程是一种用于处理与战斗场景相关联内容的进程。示例性地,对战控制进程可以实现上述对战过程的渲染,如对战过程中宠物虚拟角色和敌方单位的行为进行渲染。
本申请实施例中的回合制对战是在本申请实施例提供的回合制RPG机制下运行的。该回合制对战可以是指上述宠物虚拟角色与敌方单位之间进行回合制的对战。示例性地,回合制对战可以包括三个回合对战,在每一回合对战中,宠物虚拟角色和敌方单位可以轮流对对方进行一次攻击。完成三个回合对战之后,该回合制对战结束。或者,宠物虚拟角色和敌方单位中的一方战败,该回合制对战结束,本申请实施例对此不作限定。
回合制对战的参与方可以是指上述宠物虚拟角色,也可以是指宠物虚拟角色的敌方单位。对战行为可以是指参与方响应于玩家的对战控制操作而要执行的战斗行为,诸如释放技能、普通攻击、逃跑、使用虚拟道具、防守等行为。对战行为也可以是指人工智能控制参与方所要执行的战斗行为,本申请实施例对此不作限定。
第一服务器为应用程序的后台服务器,其与对战控制进程对应。第一服务器可以通过执行对战处理逻辑,处理玩家在战斗场景中所产生的控制信号,以实现上述对战过程的推进。第一服务器还可用于处理对战行为的发起请求。示例性地,第一服务器可以通过执行对战行为处理逻辑,对对战行为进行逻辑处理,生成对战渲染信息,以进行对战行为的渲染显示。第一服务器与上述实施例介绍相同,这里不再赘述。
对战行为的发起请求用于请求执行对战行为的渲染过程,以及获取对战行为渲染所需的信息,即对战渲染信息。该发起请求中可以包括有对战行为的标识信息,诸如技能的标识信息、虚拟道具的标识信息等。
在一个可选的实施例中,客户端中除运行有对战控制进程外,还运行有场景控制进程。
在本申请实施例中,场景控制进程是一种用于处理与世界场景相关联内容的进程。示例性地,场景控制进程可以实现主控虚拟角色在世界场景中活动的渲染。其中,场景控制进程与对战控制进程是两种相互独立的不同进程。
可选地,在主控虚拟角色遇到敌方单位时,玩家可以选择开启回合制对战。在一个示例中,回合制对战开启过程中,回合制对战的对战画面显示过程可以如下:
1、在进入回合制对战的情况下,场景控制进程获取回合制对战的参与方对应的基础信息和战斗场景对应的环境信息,并向第二服务器发送基础信息和环境信息。
上述基础信息可以是指对战中所需要的角色信息,诸如参与方的生命值、等级、属性、技能、敌方单位的属性、技能等信息。示例性地,上述基础信息可以包括主控虚拟角色的等级和生命值,主控虚拟角色所使用的宠物虚拟角色的属性和技能、敌方单位的属性、技能等信息。
战斗场景对应的环境信息可以是指战斗地图所对应的环境信息,诸如地块信息、时间信息、天气信息等。可选地,在首回合对战中,可以根据该环境信息确定宠物虚拟角色与战斗场景之间的环境亲和度,以确定首回合对战中,宠物虚拟角色对应技能的最终增益或减益效果。
第二服务器为应用程序的后台服务器,其与场景控制进程对应。第二服务器可以通过执行场景处理逻辑,处理玩家在世界场景中所产生的控制信号,以实现主控虚拟角色在世界场景中活动的推进与渲染。第二服务器与上述实施例介绍相同,这里不再赘述。
第二服务器在接收到基础信息和环境信息之后,将基础信息和环境信息发送至第一服务器。
2、对战控制进程接收来自第一服务器的回合制对战的对战画面;其中,回合制对战的对战画面是第一服务器根据第二服务器发送的基础信息和环境信息生成的画面。
对战画面可以是指宠物虚拟角色与敌方单位在战斗场景中对战的画面。第一服务器可以通过执行对战行为处理逻辑,基于来自第二服务器的基础信息和环境信息,生成对战画面(或对战画面渲染信息),并将其下发至对战控制进程。
3、对战控制进程显示回合制对战的对战画面。
可选地,对战控制进程可以直接在用户界面中显示对战画面,或者根据对战画面渲染信息进行对战画面的渲染,本申请实施例对此不作限定。
在上述对战画面显示过程中,借助场景控制进程所发送的基础信息和环境信息,使得第二服务器更直观地知悉参与方的情况以及战斗场景的情况,进而借助第一服务器和第二服务器之间的交互过程,使得与第一服务器进行交互的对战控制进程获取到更符合当前对战情况的对战画面,提升了对战画面的获取准确度,从而有利于在客户端上渲染显示更具有真实感的对战画面,提升人机交互效率。
步骤402,接收来自第一服务器的对战渲染信息;其中,对战渲染信息用于对对战行为进行渲染。对战渲染信息可以是指在时间维度上的一系列与对战行为相关的行为控制参数,根据这些行为控制参数,可以控制宠物虚拟角色完成一套动作或表演。
在一个可选的实施例中,客户端中运行有对战控制进程,对战控制进程除向第一服务器发送对战行为的发起请求外,还会接收来自第一服务器的对战渲染信息。
示例性地,在对战行为为释放技能的情况下,对战控制进程根据该对战渲染信息,控制宠物虚拟角色完成一套技能释放动作,从而完成技能释放的渲染。
步骤403,接收来自第二服务器的光环渲染信息;其中,光环渲染信息是第二服务器根据第一服务器发送的光环信息生成的信息,光环信息用于表征对战行为所触发的光环,光环对回合制对战的战斗场景中的元素产生影响。
光环渲染信息可以是指在时间维度上的一系列与光环相关的元素控制参数,根据这些元素控制参数,可以控制一些元素(如火元素)完成对环境(如地块)的影响。示例性地,在光环为火焰光环的情况下,根据对应的光环渲染信息,可以渲染得到火元素将地块从草属性转换为火属性这一转换过程。需要说明的是,光环所影响的战斗场景中的元素,可以为地块、主控虚拟角色、宠物虚拟角色中的至少一种可视元素。
可选地,第二服务器中运行有光环处理逻辑,通过对光环信息(如光环的标识信息)所表征的光环进行逻辑处理,即可得到该光环对应的光环渲染信息。
在一个可选的实施例中,客户端中运行的场景控制进程接收来自第二服务器的光环渲染信息。
示意性的,第二服务器在生成光环渲染信息后,将光环渲染信息发送至场景控制进程。
在一个示例中,第一服务器在确定对战行为触发光环的情况下,向第二服务器发送光环信息。在确定对战行为不能触发光环的情况下,只对对战行为进行逻辑处理,确定诸如造成的伤害、添加的Buff、产生的位移等信息,以生成对战渲染信息。
可选地,第一服务器还对应维护有对战行为与光环之间的关系表格,可以通过查询该关系表格,确定对战行为是否能够触发光环。在触发光环的情况下,还可以根据该关系表格确定该光环的光环信息。
在一个可选的实施例中,客户端中运行的场景控制进程在接收到光环渲染信息后,会向对战控制进程发送光环渲染信息。
示意性的,场景控制进程在接收到光环渲染信息之后,可以先缓存该光环渲染信息,在缓存好光环渲染信息之后,再将光环渲染信息发送至对战控制进程。如此可以使得对战控制进程先获取对战渲染信息,再获取光环渲染信息,从而依次实现对战行为的显示过程以及光环的渲染过程(避免没有征兆地先对光环进行渲染的问题),使得对战行为的显示和光环的显示之间衔接地更加自然顺滑。
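The cache-then-forward behaviour of the scene control process described above can be sketched as follows. Class and method names are hypothetical; the cached copy is what later lets the scene side take over halo rendering after the battle ends:

```python
class BattleControlProcess:
    """Receives halo rendering info forwarded by the scene control process."""
    def __init__(self):
        self.pending = []

    def receive_halo_render_info(self, info):
        self.pending.append(info)

class SceneControlProcess:
    """Caches halo rendering info first, then forwards it, so the cached copy
    can still drive halo rendering once the battle control process stops."""
    def __init__(self, battle_process):
        self.halo_cache = []
        self.battle_process = battle_process

    def on_halo_render_info(self, info):
        self.halo_cache.append(info)                        # cache first ...
        self.battle_process.receive_halo_render_info(info)  # ... then forward
```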
步骤404,在基于对战渲染信息对对战行为进行渲染的过程中,根据光环渲染信息对光环进行渲染。
在一个可选的实施例中,客户端通过对战控制进程实现光环渲染过程。
可选地,对战控制进程接收场景控制进程发送的光环渲染信息后,基于光环渲染信息对光环进行渲染。
示意性的,在收到对战渲染信息和光环渲染信息的情况下,对战控制进程在根据对战渲染信息对对战行为进行渲染的过程中,根据光环渲染信息对光环进行渲染。在仅收到光环渲染信息的情况下,对战控制进程不再进行光环的渲染。
在一个示例中,对战控制进程和场景控制进程共用一个光环渲染接口。在回合制战斗(即战斗场景)中,对战控制进程调用该光环渲染接口,根据光环渲染信息进行光环的渲染。在回合制对战结束之后,场景控制进程替换对战控制进程调用该光环渲染接口,完成光环的衔接渲染。
示例性地,场景控制进程根据缓存的光环渲染信息对光环进行渲染。例如,在回合制对战结束之后,场景控制进程调用光环渲染接口,根据缓存的光环渲染信息,渲染显示回合制对战对应的剩余光环,该剩余光环是指在所有光环渲染信息中还需在虚拟世界中显示的光环(例如,回合制对战中,有的光环被取消渲染)。
或者,在对战控制进程进行光环渲染的过程中,场景控制进程根据缓存的光环渲染信息进行光环的统计,动态更新回合制对战对应的剩余光环,以在回合制对战结束之后,无缝衔接渲染显示回合制对战对应的剩余光环。
在上述过程中,介绍了由场景控制进程对光环渲染信息进行缓存后进行渲染的过程。场景控制进程作为用于处理与世界场景相关联内容的进程,需要具有场景显示的稳定性,借助缓存过程能够使得场景控制进程首先将与对战行为所触发的光环相关的光环渲染信息进行缓存,进而避免在对战控制进程无法基于光环渲染信息对光环进行渲染时,通过本进程缓存的光环渲染信息实现光环渲染过程,避免无法正常渲染出光环的问题。
可选地,在回合制对战结束之后,对战控制侧会将回合制对战的结算数据一次性同步至场景控制侧。示例性地,在回合制对战结束之后,第一服务器会将回合制战斗中的生命值消耗情况、经验值获取情况、虚拟资源耗损情况等结算数据一次性同步至第二服务器。
在一个示例中,可以通过对战行为对应的对战渲染标签来指示光环的显示时机和显示位置。该对战渲染标签可以包括回合制对战的标识信息、对战行为的标识信息、光环的标识信息、光环对应的位置(如地块、受影响角色等)、对战行为的渲染细节等。该过程可以包括如下内容:
1、场景控制进程接收来自第二服务器的对战渲染标签;其中,对战渲染标签是第一服务器在对战行为满足触发光环的情况下,根据对战行为生成的标签。
第一服务器在确定对战行为触发光环的情况下,生成对战行为的对战渲染标签,然后根据对战渲染标签和光环信息生成调用请求,并通过RPC(Remote Procedure Call,远程过程调用)的方式,将调用请求发送至第二服务器,第二服务器再将对战渲染标签发送至场景控制进程。
2、场景控制进程向对战控制进程发送对战渲染标签。
场景控制进程在接收到来自第二服务器的对战渲染标签和光环渲染信息之后,将对战渲染标签和光环渲染信息一起发送至对战控制进程。
3、对战控制进程根据对战渲染标签所指示的显示时机和显示位置,在根据对战渲染信息对对战行为进行渲染的过程中,根据光环渲染信息对光环进行渲染。
示例性地,对战控制进程根据对战渲染标签中的对战行为的渲染细节和光环对应的位置,确定出光环的显示时机和显示位置,在渲染显示对战行为的过程中,当到达显示时机时在显示位置(如地块)处渲染显示光环。例如,在触发光环的技能的释放过程渲染完成之后,进行该光环的渲染,以实现两者渲染的自然衔接。
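As a rough sketch, the battle render tag fields listed above might be modelled as a record like the following; all field names are illustrative, not from the application:

```python
from dataclasses import dataclass, field

@dataclass
class BattleRenderTag:
    """One possible shape for the battle render tag fields listed above."""
    battle_id: str       # identifier of the turn-based battle
    action_id: str       # identifier of the battle action
    halo_id: str         # identifier of the triggered halo
    halo_position: tuple # e.g. grid coordinates of the affected tile
    # rendering details, e.g. when the halo appears in the action timeline
    render_details: dict = field(default_factory=dict)
```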
示例性地,以光环影响虚拟世界中的地块为例。
在地块的初始类型为草类型的情况下,地块的类型在火类型的光环的影响下变换为火类型,也即渲染显示地块上的草燃烧的画面;或者,在地块的初始类型为水类型的情况下,地块的类型在冰类型的光环的影响下变换为冰类型,也即渲染显示地块上的水结冰的画面;或者,在地块的初始类型为土类型的情况下,地块的类型在草类型的光环的影响下变换为草类型,也即渲染显示地块上的生长草的画面,本申请实施例对此不作限定。
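The three tile transitions exemplified above amount to a small lookup table. A minimal sketch, with the English type names being assumptions:

```python
# transition table implied by the three examples above
HALO_TERRAIN_TRANSITIONS = {
    ("grass", "fire"): "fire",    # grass tile + fire halo  -> the grass burns
    ("water", "ice"): "ice",      # water tile + ice halo   -> the water freezes
    ("earth", "grass"): "grass",  # earth tile + grass halo -> grass grows
}

def apply_halo_to_tile(tile_type, halo_type):
    """Return the tile's new type, or the old type if the halo has no effect."""
    return HALO_TERRAIN_TRANSITIONS.get((tile_type, halo_type), tile_type)
```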
例如,参考图5和图6,在战斗场景500中,火属性的宠物虚拟角色501在进攻的时候,会在自身所在的位置施加一个半径为4米的球体火光环502,球体火光环502将球体火光环502内的地块从草类型转换为火类型,也即将球体火光环502内的草地点燃。可选地,在回合制战斗结束时,若该球体火光环502属于剩余光环,则场景控制进程也会在虚拟角色501的位置处渲染显示球体火光环502,以及渲染显示球体火光环502内的草地被点燃。
在本申请实施例中,宠物虚拟角色作为攻击方时,可能会产生主动位移。宠物虚拟角色作为受击方时,可能会产生被动位移,在受到技能攻击时,会有被击退的位移过程。随着宠物虚拟角色的位置发生变化,其所处的环境(包括所处地块的环境、天气的环境)也有可能发生变化,从而对宠物虚拟角色后续回合的战斗产生影响。因此,在每一回合对战中,需要重新确定宠物虚拟角色与虚拟世界之间的环境亲和度(简称亲和度,即上述亲和效果),其可以包括如下内容。
1、对战控制进程向第一服务器发送对战行为的渲染结束通知。
可选地,在完成对战行为和光环的渲染之后,对战控制进程向第一服务器发送对战行为的渲染结束通知。该渲染结束通知用于向第一服务器告知对战行为和光环完成渲染的结果。
第一服务器在接收到渲染结束通知之后,生成更新环境信息的获取请求,并将该获取请求发送至第二服务器,以获取更新环境信息,第一服务器在获取更新环境信息之后,根据更新环境信息确定参与方与更新环境信息之间的亲和度,该亲和度影响参与方的技能强度。其中,获取请求用于请求获取更新环境信息,该获取请求中可以包括对战行为对应的标识信息、光环所影响的地块信息等,更新环境信息是指战斗场景经过光环影响后的环境信息,诸如经过光环影响后的地块的属性。第一服务器根据光环信息可以获取该光环对应的更新环境信息。
2、对战控制进程接收来自第一服务器的亲和度渲染动画;其中,亲和度渲染动画是第一服务器根据第二服务器发送的更新环境信息生成的动画,更新环境信息是指战斗场景经过光环影响后的环境信息,亲和度渲染动画用于表征参与方与更新环境信息之间的亲和度,亲和度影响参与方的技能强度。
在本申请实施例中,环境信息可以包括天气信息、地块信息和时间信息。第一服务器可以结合环境信息和参与方的属性,确定参与方与环境之间的亲和度。示例性地,可以先获取参与方对应的天气势能和地块势能,再根据天气势能和地块势能,结合参与方的属性信息,确定参与方与环境之间的亲和度。其中,天气势能可以是根据天气信息和时间信息来确定的。例如,夜晚的晴天会给予“鬼”势能,而上午的晴天会给予“光”势能。地块势能是由参与方所处的地块的类型决定的。例如,草属性的地块具有“草”势能。
可选地,参与方对环境的厌恶和喜欢程度包括如下几个不同等级:强亲和、弱亲和、无感、弱抵触、强抵触。
如果参与方的属性与天气势能和地块势能均相适配(即参与方对地块和天气均喜欢),则获得强力亲和效果,亲和度加2;如果参与方的属性仅与天气势能和地块势能中的一个相适配,与另一个不抵触(即参与方只喜欢1个,另一个不厌恶),则获得较弱亲和效果,亲和度加1;如果参与方的属性仅与天气势能和地块势能中的一个相适配,与另一个相抵触(即参与方只喜欢1个,厌恶另一个),则不会获得任何效果,亲和度加0;如果参与方的属性与天气势能和地块势能中的一个相抵触,与另一个不适配(即参与方只厌恶1个,不喜欢另一个),则获得较弱抵触效果,亲和度减1;如果参与方的属性与天气势能和地块势能均相抵触(即参与方对地块和天气均厌恶),则获得强抵触效果,亲和度减2。
如果参与方对应的亲和度为正,则视为触发了参与方的环境亲和,对参与方的对战行为进行增益,如增加技能的范围、技能的攻击力、技能的攻击效果等。如果参与方对应的亲和度为负,则视为参与方对环境抵触,对参与方的对战行为进行减益,如降低技能的范围、技能的攻击力、技能的攻击效果等。如果参与方对应的亲和度为0,则不对参与方的对战行为的影响能力进行调整。
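The five affinity levels and their ±2 scoring described above reduce to a simple rule. A minimal sketch, assuming boolean like/dislike flags for the tile environment and the weather environment:

```python
def affinity_score(likes_tile, dislikes_tile, likes_weather, dislikes_weather):
    """+2 strong affinity, +1 weak affinity, 0 neutral,
    -1 weak aversion, -2 strong aversion."""
    score = 0
    for likes, dislikes in ((likes_tile, dislikes_tile),
                            (likes_weather, dislikes_weather)):
        if likes:
            score += 1
        elif dislikes:
            score -= 1
    return score

def skill_modifier(score):
    """A positive affinity buffs the battle action, a negative one debuffs it."""
    if score > 0:
        return "buff"
    if score < 0:
        return "debuff"
    return "none"
```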
第一服务器采用与上述相同的方法,即可获取参与方与更新环境信息之间的亲和度,进而根据亲和度,生成亲和度渲染动画。可选地,不同的亲和度对应不同的亲和度渲染动画。例如,强亲和对应的亲和度渲染动画为大笑脸,弱亲和对应的亲和度渲染动画为微笑脸,强抵触对应的亲和度渲染动画为愤怒脸。亲和度渲染动画可以为图标、动态图标、动画等,本申请实施例对此不作限定。
第一服务器将生成的亲和度渲染动画下发给对战控制进程。
3、对战控制进程显示亲和度渲染动画。
对战控制进程在接收到亲和度渲染动画之后,可以在参与方的附近位置显示该亲和度渲染动画。
例如,参考图7,在战斗场景700中,火属性的宠物虚拟角色701和环境之间为强亲和,则对战控制进程在宠物虚拟角色701的上方显示强亲和对应的亲和度渲染动画702,亲和度渲染动画702为一个大笑脸的太阳。
又例如,参考图8,在战斗场景800中,光属性的宠物虚拟角色801和环境之间为弱亲和,则对战控制进程在宠物虚拟角色801的上方显示弱亲和对应的亲和度渲染动画802,亲和度渲染动画802为一个微笑脸的太阳。
又例如,参考图9和图10,在战斗场景900中,土属性的宠物虚拟角色901在草地上使用土属性的技能902(如主动撞击)对对方单位进行攻击,由于宠物虚拟角色901和环境之间无感,则未对技能902做出调整(即普通的撞击)。而在宠物虚拟角色901移动至岩石地块上,再次使用技能902对对方单位进行攻击,由于宠物虚拟角色901和环境之间从无感转换为强亲和,则对技能902效果进行了增益,如技能902的威力提升50%,并为技能902添加砂石特效(即带有砂石特效且威力提升的撞击)。
在上述确定环境亲和度的内容中,由对战控制进程实现显示亲和力渲染动画的过程。对战控制进程除会通过光环渲染信息对光环进行渲染外,还会在对战行为的渲染结束后,接收根据第一服务器和第二服务器之间发送的更新环境信息生成的亲和力渲染动画,以便通过亲和力渲染动画更加及时地将战斗场景经过光环影响后的环境情况展现出来,不仅大大提升了画面显示的真实性,还能够丰富画面的展示效果,以提高游戏的趣味性,便于玩家通过亲和力渲染动画决定下一步对战情况,提升人机交互效率。
综上所述,本申请实施例提供的技术方案,不再由两个服务器分别执行对战行为的环境生成和非对战行为的环境生成,而是在对战行为下由第一服务器将少量的光环信息发送至第二服务器,从而即便在对战行为下也是通过第二服务器处理与环境生成相关的数据,避免第二服务器需要将环境数据整体更新复制至第一服务器而导致数据传输量较大的问题,不仅大大降低了第一服务器和第二服务器之间的数据交互传输量,也能够通过在第二服务器侧维护一份场景信息的方式,避免第一服务器和第二服务器均维护一份场景数据的冗余问题,进而有效降低了服务器的维护开销,也有利于提高界面显示效率,提高人机交互效率。
另外,通过支持对战控制进程执行对战行为处理逻辑,以及支持场景控制进程执行光环处理逻辑,从而使得不同的进程更有针对性地实现逻辑的执行过程,通过光环处理逻辑和对战行为处理逻辑之间的解耦合,提高了业务逻辑的独立性,进而降低了回合制对战的维护和扩展难度。同时,由于原属于第一服务器或第二服务器的底层代码,无需同时兼容第一服务器和第二服务器,因此大大降低了代码的开发难度,有利于提升客户端和服务器之间的交互效率,降低系统的维护成本。
另外,在回合制对战中,通过支持第一服务器向第二服务器发送用于生成光环渲染信息的光环信息从而使得第二服务器将光环渲染信息发送至客户端中用于处理与世界场景相关联内容的场景控制进程,该场景控制进程向客户端中的对战控制进程提供该光环渲染信息,实现了对战控制侧和场景控制侧之间的信息交互,从而使得对战控制侧和场景控制侧之间能够表现一致,提高了对战控制侧和场景控制侧之间的融合度。
另外,介绍了综合对战渲染标签、对战渲染信息以及光环渲染信息实现渲染光环的过程,通过对战渲染标签指示光环的显示位置和显示时机,不仅可以减少数据的传输量,还可以实现对战行为的渲染过程和光环的渲染过程之间自然顺滑的衔接,从而降低每次交互过程中的数据传输量,提高了战斗的渲染效果;此外,通过战斗控制侧和场景控制侧在光环渲染方面的同步情况,也进一步提高了世界场景和战斗场景之间的融合性,在提升画面显示真实性的同时,解决了在战斗控制侧和场景控制侧进行数据重复处理的问题。
请参考图11,其示出了本申请一个实施例提供的基于回合制对战的信息提供方法的流程图,该方法各步骤的执行主体可以是图3所示方案实施环境中的服务器320,如第一服务器321和第二服务器322,该方法可以包括如下几个步骤(步骤1101~步骤1104):
步骤1101,第一服务器接收对战控制进程发送的对战行为的发起请求,根据发起请求生成对战行为对应的对战渲染信息;其中,对战行为是回合制对战的参与方发起的行为,对战渲染信息用于对对战行为进行渲染。
对战行为的发起请求用于请求发起对战行为的渲染,以及获取对战行为渲染所需的信息,即对战渲染信息。该发起请求中可以包括有对战行为的标识信息,诸如技能的标识信息、虚拟道具的标识信息等。
第一服务器可以根据对战行为的标识信息,获取与对战行为相关的数据。第一服务器可以通过执行对战行为处理逻辑,对对战行为进行逻辑处理,生成对战渲染信息。第一服务器与上述实施例介绍相同,这里不再赘述。
本申请实施例中的回合制对战是在本申请实施例提供的回合制RPG机制下运行的。该回合制对战可以是指上述宠物虚拟角色与敌方单位之间进行回合制的对战。回合制对战的参与方可以是指上述宠物虚拟角色,也可以是指宠物虚拟角色的敌方单位。示例性地,在主控虚拟角色遇到敌方单位时,玩家可以选择开启回合制对战。
在一个示例中,回合制对战开启过程中,回合制对战的对战画面的提供过程可以如下所示:
1、第二服务器接收场景控制进程发送的回合制对战的参与方对应的基础信息和战斗场景对应的环境信息。
可选地,在确定开启回合制对战的情况下,场景控制进程拉取参与方对应的基础信息和战斗场景对应的环境信息,并将其上传至第二服务器。
2、第二服务器向第一服务器发送基础信息和环境信息。
第二服务器在接收到基础信息和环境信息之后,将基础信息和环境信息发送至第一服务器。
3、第一服务器根据基础信息和环境信息生成回合制对战的对战画面,向对战控制进程发送回合制对战的对战画面。
对战画面可以是指宠物虚拟角色与敌方单位在战斗场景中对战的画面。对战控制进程在接收到回合制对战的对战画面之后,可以在用户界面中显示回合制对战的对战画面。
步骤1102,第一服务器在确定对战行为触发光环的情况下,向第二服务器发送调用请求;其中,调用请求包括用于表征光环的光环信息,光环对回合制对战的战斗场景中的元素产生影响。
第一服务器在接收到对战行为的发起请求之后,还检测对战行为是否触发光环。在确定对战行为触发光环的情况下,第一服务器根据光环信息生成调用请求,并通过RPC的方式,将该调用请求发送至第二服务器。在确定对战行为不触发光环的情况下,第一服务器不进行调用请求的生成,仅执行对战渲染信息的生成。
步骤1103,第二服务器根据调用请求生成光环渲染信息,向场景控制进程发送光环渲染信息,光环渲染信息由场景控制进程发送给对战控制进程,光环渲染信息用于对光环进行渲染。
第二服务器在接收到调用请求之后,根据调用请求中的光环信息确定出光环,并对该光环进行逻辑处理,生成光环渲染信息。第二服务器将光环渲染信息发送至场景控制进程,以通过场景控制进程将光环渲染信息转发至对战控制进程。
在一个示例中,第二服务器在接收到调用请求之后,先确定光环对应的地块,再确定对光环对应的地块上是否存在历史光环。若光环对应的地块已存在历史光环,则第二服务器根据历史光环和光环之间的关系,生成光环渲染信息;其中,在历史光环和光环之间为覆盖关系的情况下,光环渲染信息用于取消历史光环对地块的影响,添加光环对地块的影响;或者,在历史光环和光环之间为互斥关系的情况下,光环渲染信息用于保持历史光环对地块的影响。
示例性地,在地块有火属性光环的基础上添加水属性光环会走到覆盖逻辑,即在该地块上取消火属性光环以及其影响力,并在该地块上添加水属性光环以及影响力。而在地块有水属性光环的基础上添加火属性光环会走到互斥逻辑,也即继续保持历史光环对该地块的影响。
若光环对应的地块不存在历史光环,则在该地块的类型支持光环(如草属性的地块支持火属性的光环)生效的情况下,生成光环渲染信息,在该地块的类型不支持光环(如水属性的地块不支持火属性的光环)生效的情况下,不进行光环渲染信息的生成。
在上述过程中,借助光环与历史光环的比较情况,由第二服务器差异性决定光环渲染信息的生成情况。当历史光环和光环之间为覆盖关系,第二服务器生成的光环渲染信息用于取消历史光环对地块的影响,并添加光环对地块的影响,从而能够利用当前的光环更生动且及时地展现对战情况,避免显示历史光环而使得光环显示不生动的局限性;当历史光环和光环之间为互斥关系,第二服务器生成的光环渲染信息用于保持历史光环对地块的影响,从而避免对战情况与虚拟场景显示情况之间的排斥而导致画面显示割裂的问题,通过综合考虑历史光环和当前对战对应的光环,在保持场景显示真实性的基础上实现场景渲染过程。
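The second server's halo-resolution branches described above — override, mutual exclusion, and the no-history-halo case — can be sketched as follows. The tile representation and the override table are assumptions:

```python
def resolve_halo(tile, new_halo, overrides):
    """Apply `new_halo` to `tile` (a dict with an optional "halo" entry and a
    "supported" set). `overrides` holds (new, old) pairs where the new halo
    replaces the old one; any other existing pair is mutually exclusive."""
    old = tile.get("halo")
    if old is None:
        if new_halo in tile.get("supported", set()):
            tile["halo"] = new_halo
            return "applied"       # e.g. a fire halo on a grass tile
        return "unsupported"       # e.g. a fire halo on a water tile
    if (new_halo, old) in overrides:
        tile["halo"] = new_halo    # cancel the old halo's effect, add the new
        return "overridden"        # e.g. water added over fire
    return "kept"                  # mutually exclusive: the old halo stays
```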
步骤1104,第一服务器在接收到来自第二服务器的光环渲染信息的下发成功通知之后,向对战控制进程发送对战渲染信息。
第二服务器在向场景控制进程发送光环渲染信息之后,向第一服务器发送下发成功通知,以告知第一服务器光环渲染信息下发成功,可以向对战控制进程发送对战渲染信息。如此,可以确保对战控制进程在大致相同的时间内,接收到光环渲染信息和对战渲染信息。对战控制进程进而可以在根据对战渲染信息对对战行为进行渲染的过程中,根据光环渲染信息对光环进行渲染。
在一个示例中,可以通过对战行为对应的对战渲染标签来指示光环的显示时机和显示位置,该对战渲染标签的提供过程可以如下:
1、第一服务器在确定对战行为触发光环的情况下,根据对战行为生成对战渲染标签;其中,对战渲染标签用于指示光环的显示时机和显示位置。
该对战渲染标签可以包括回合制对战的标识信息、对战行为的标识信息、光环的标识信息、光环对应的位置(如地块、受影响角色等)、对战行为的渲染细节等。
2、第一服务器向第二服务器发送对战渲染标签。
第一服务器可以将对战渲染标签和光环信息打包成调用请求,并通过RPC的方式,将该调用请求发送至第二服务器。
3、第二服务器向场景控制进程发送对战渲染标签,对战渲染标签由场景控制进程发送给对战控制进程。
上述过程介绍了对战渲染标签的相关内容。通过对战渲染标签能够更加直观地指示光环的显示位置和显示时机,提高了画面渲染过程的针对性,降低了需要多次交互的数据传输量,在提高战斗渲染效果的同时提升了画面渲染效率;此外,战斗控制侧和场景控制侧的同步过程也进一步提高了世界场景和战斗场景之间的融合性,解决了在战斗控制侧和场景控制侧进行数据重复处理的问题。
第二服务器在接收到调用请求之后,生成光环渲染信息,并将光环渲染信息和对战渲染标签一起发送至场景控制进程,以通过场景控制进程将光环渲染信息和对战渲染标签发送给对战控制进程。对战控制进程根据对战渲染标签所指示的显示时机和显示位置,在根据对战渲染信息对对战行为进行渲染的过程中,根据光环渲染信息对光环进行渲染。
在一个可选的实施例中,回合制对战包括中间回合对战和最后一回对战,中间回合对战是指回合制对战中除最后一回对战之外的对战。
在一个示例中,在中间回合对战触发光环的情况下,第二服务器向场景控制进程发送光环渲染信息和第一服务器生成的对战渲染标签。其中,中间回合对战是指回合制对战除去最后一回对战之外的对战。
或者,在最后一回对战触发光环的情况下,第二服务器向场景控制进程发送光环渲染信息、第一服务器生成的对战渲染标签和回合制对战结束通知。第二服务器通过发送回合制对战结束通知,来告知场景控制进程回合制对战即将结束,可以进行光环渲染的衔接准备。
在上述过程中,分别介绍了中间回合对战和最后一回对战下第二服务器发送的内容差异。当处于中间回合对战时,考虑到对战尚未结束,第二服务器通过发送光环渲染信息和对战渲染标签的过程,指示场景控制进程继续基于该内容进行场景渲染的后续流程;当处于最后一回对战时,考虑到对战即将结束,第二服务器通过发送光环渲染信息、对战渲染标签以及对局结束通知的过程,指示场景控制进程基于该内容进行场景渲染后停止数据接收及场景渲染过程,避免额外数据传输的资源浪费情况。
1、第一服务器在接收到对战控制进程发送的对战行为的渲染结束通知之后,向第二服务器发送更新环境信息的获取请求;其中,更新环境信息是指战斗场景经过光环影响后的环境信息。
该渲染结束通知用于向第一服务器告知对战行为和光环完成渲染。更新环境信息的获取请求用于请求获取更新环境信息。该环境信息可以包括天气信息、地块信息和时间信息。
2、第二服务器向第一服务器发送更新环境信息。
第二服务器在拉取更新环境信息之后,向第一服务器发送更新环境信息。
3、第一服务器根据更新环境信息,生成亲和度渲染动画;其中,亲和度渲染动画用于表征参与方与更新环境信息之间的亲和度,亲和度影响参与方的技能强度。
可选地,第一服务器可以根据参与方与环境之间的亲和度,生成亲和度渲染动画,其过程可以如下:
1、第一服务器根据更新环境信息,确定参与方对应的地块势能和天气势能;其中,地块势能用于表示地块的类型对参与方所产生的影响,天气势能用于表示天气和时间对参与方所产生的影响。
其中,可以根据时间信息和天气信息来确定天气势能。
2、第一服务器根据参与方对应的地块势能和天气势能,确定参与方对应的环境势能。
可选地,该环境势能可以包括两个部分:天气势能和地块势能。
3、第一服务器根据参与方对应的环境势能和参与方的属性信息,生成亲和度渲染动画。
第一服务器可以根据参与方对应的环境势能和参与方的属性信息,确定参与方与环境之间的亲和度,再根据亲和度生成亲和度渲染动画。其中,亲和度的确定方法与上述实施例介绍相同。
4、第一服务器向对战控制进程发送亲和度渲染动画。
在上述过程中,介绍了通过第一服务器根据参与方与环境之间的亲和度生成亲和度渲染动画的过程。第一服务器作为负责处理对战情况的服务器,能够在参与方发起对战行为的情况下确定各种更新环境信息对参与方产生的影响,从而实现对参与方进行实时分析的目的,进而能够实时地确定环境对参与方的技能强度造成的影响,该内容表征了参与方和环境之间的亲和度,依照实时确定的亲和度生成亲和度渲染动画,能够更加及时地将战斗场景经过光环影响后的环境情况展现出来,在大大提升画面显示真实性的同时,丰富了画面的展示效果,便于玩家通过亲和力渲染动画决定下一步对战情况,提升人机交互效率。
其中,介绍了根据更新环境信息生成亲和力渲染动画的内容。在得到表征战斗场景经过光环影响后的更新环境信息后,确定其中表征地块类型对参与方所产生影响的地块势能,以及确定表征天气和时间对参与方产生影响的天气势能,从而综合地块势能以及天气势能所表征的环境势能,以及参与方的属性信息,更加全面地考虑各种因素以及各种因素之间的影响情况,从而更加及时且准确地生成环境势能对参与方技能强度造成影响的亲和度渲染动画,提高亲和度渲染动画的真实性,在给予玩家充分的作战体验的同时,保证了作战场景和世界场景体验的统一性,避免场景产生割裂感,提升不同场景间数据交互的高效性。
对战控制进程在接收到亲和度渲染动画之后,在战斗场景中进行亲和度渲染动画的显示。
综上所述,本申请实施例提供的技术方案,不再由两个服务器分别执行对战行为的环境生成和非对战行为的环境生成,而是在对战行为下由第一服务器将少量的光环信息发送至第二服务器,从而即便在对战行为下也是通过第二服务器处理与环境生成相关的数据,避免第二服务器需要将环境数据整体更新复制至第一服务器而导致数据传输量较大的问题,不仅大大降低了第一服务器和第二服务器之间的数据交互传输量,也能够通过在第二服务器侧维护一份场景信息的方式,避免第一服务器和第二服务器均维护一份场景数据的冗余问题,进而有效降低了服务器的维护开销,也有利于提高界面显示效率,提高人机交互效率。
另外,通过支持对战控制进程执行对战行为处理逻辑,以及支持场景控制进程执行光环处理逻辑,从而使得不同的进程更有针对性地实现逻辑的执行过程,通过光环处理逻辑和对战行为处理逻辑之间的解耦合,提高了业务逻辑的独立性,进而降低了回合制对战的维护和扩展难度。同时,由于原属于第一服务器或第二服务器的底层代码,无需同时兼容第一服务器和第二服务器,因此大大降低了代码的开发难度,有利于提升客户端和服务器之间的交互效率,降低系统的维护成本。
另外,通过支持第一服务器下发对战渲染信息,第二服务器下发光环渲染信息,避免了第一服务器和第二服务器均要下发对战渲染信息或光环渲染信息,而导致的客户端在处理相同渲染信息的时候需要兼容多种渲染信息来源的问题,从而降低了客户端处理渲染信息的压力。
请参考图12和图13,其示出了本申请另一个实施例提供的基于回合制对战的界面显示方法的流程图,该方法各步骤的执行主体可以是图1所示方案实施环境中的终端310或服务器320,该方法可以包括如下几个步骤(步骤1201~步骤1219):
步骤1201,场景控制进程显示世界场景对应的画面。
步骤1202,场景控制进程在确定开启回合制对战的情况下,拉取回合制对战的参与方对应的基础信息和战斗场景对应的环境信息,并向第二服务器发送基础信息和环境信息。
步骤1203,第二服务器将基础信息和环境信息发送至第一服务器。
步骤1204,第一服务器根据基础信息和环境信息,生成战斗场景对应的战斗画面,并将战斗场景对应的战斗画面发送至战斗控制进程。
步骤1205,战斗控制进程显示战斗场景的战斗画面。
步骤1206,战斗控制进程在检测到参与方发起对战行为的情况下,向第一服务器发送对战行为的发起请求。
步骤1207,第一服务器在接收到对战行为的发起请求之后,根据发起请求生成对战行为对应的对战渲染信息,并在确定对战行为触发光环的情况下,向第二服务器发送调用请求,该调用请求包括用于表征光环的光环信息和用于指示光环的显示时机和显示位置的对战渲染标签。
步骤1208,第二服务器根据调用请求生成光环渲染信息,向场景控制进程发送该光环渲染信息和对战渲染标签,以及向第一服务器发送光环渲染信息的下发成功通知,光环渲染信息用于对光环进行渲染。
步骤1209,场景控制进程对光环渲染信息和对战渲染标签进行缓存,并向对战控制进程发送光环渲染信息和对战渲染标签。
步骤1210,第一服务器在接收到来自第二服务器的光环渲染信息的下发成功通知之后,向对战控制进程发送对战渲染信息。
步骤1211,对战控制进程根据对战渲染标签所指示的显示时机和显示位置,在根据对战渲染信息对对战行为进行渲染的过程中,根据光环渲染信息对光环进行渲染。
步骤1212,对战控制进程向第一服务器发送对战行为的渲染结束通知。
步骤1213,第一服务器向第二服务器发送更新环境信息的获取请求;其中,更新环境信息是指战斗场景经过光环影响后的环境信息。
步骤1214,第二服务器向第一服务器发送更新环境信息。
步骤1215,第一服务器根据更新环境信息生成亲和度渲染动画,并将其发送至对战控制进程。
步骤1216,对战控制进程显示亲和度渲染动画。
步骤1217,战斗控制进程在确定回合制对战结束之后,停止运行。
步骤1218,场景控制进程在确定回合制对战结束之后,根据缓存的光环渲染信息对光环进行渲染。
步骤1219,在阈值时长后,场景控制进程取消渲染显示光环,恢复显示世界场景对应的画面。
在一些实施例中,参考图14,在回合制对战的回合战斗中,最左侧的竖向流程是战斗控制侧的回合间循环主流程,其和场景控制侧主要在两个时间点发生交互:第一个时间点是在玩家选择完技能之后,判断该技能是否会触发光环,在技能触发光环的情况下,战斗控制侧需要和场景控制侧发生交互。战斗控制侧需要向场景控制侧发送光环渲染信息的调取请求,以请求获取光环渲染信息。然后根据光环是否添加成功,来决定是否修改虚拟世界(如地块),并且结束场景控制侧的流程。
第二个时间点是在回合制对战未结束,并且确定技能和光环的渲染结束之后,对战控制侧的第一服务器需要从场景控制侧的第二服务器拉取更新环境信息,并且根据更新环境信息生成亲和度渲染动画,并下发给对战控制侧的对战控制进程进行显示。
综上所述,本申请实施例提供的技术方案,通过对战控制进程和第一服务器支持对战行为处理逻辑的实现,以获取对战渲染信息,通过场景控制进程和第二服务器支持光环处理逻辑的实现,以获取光环渲染信息,而无需在对战控制侧和场景控制侧均维护一份对战行为处理逻辑和光环处理逻辑,以及仅需在场景控制侧维护一份场景信息,从而降低了回合制对战在业务逻辑和数据维护上的冗余,进而降低了维护开销。
下述为本申请装置实施例,可以用于执行本申请方法实施例。对于本申请装置实施例中未披露的细节,请参照本申请方法实施例。
请参考图15,其示出了本申请一个实施例提供的基于回合制对战的界面显示装置的框图。该装置具有实现上述方法示例的功能,所述功能可以由硬件实现,也可以由硬件执行相应的软件实现。该装置可以是上文介绍的计算机设备,也可以设置在计算机设备中。如图15所示,该装置1500包括:对战控制模块1501和场景控制模块1502。
对战控制模块1501,用于在检测到回合制对战的参与方发起对战行为的情况下,向第一服务器发送所述对战行为的发起请求,所述第一服务器用于对所述对战行为进行处理。
所述对战控制模块1501,还用于接收来自所述第一服务器的对战渲染信息;其中,所述对战渲染信息用于对所述对战行为进行渲染。
场景控制模块1502,用于接收来自第二服务器的光环渲染信息,所述第二服务器用于对虚拟世界的世界环境进行处理;其中,所述光环渲染信息是所述第二服务器根据所述第一服务器发送的光环信息生成的信息,所述光环信息用于表征所述对战行为所触发的光环,所述光环对所述回合制对战的战斗场景中的元素产生影响。
所述对战控制模块1501,还用于在基于所述对战渲染信息对所述对战行为进行渲染的过程中,根据所述光环渲染信息对所述光环进行渲染。
在一些实施例中,所述场景控制模块1502,还用于通过所述场景控制进程接收来自所述第二服务器的对战渲染标签;其中,所述对战渲染标签是所述第一服务器在所述对战行为满足触发所述光环的情况下,根据所述对战行为生成的。
所述场景控制模块1502,还用于通过所述场景控制进程向所述对战控制进程发送所述对战渲染标签。
所述对战控制模块1501,还用于通过所述对战控制进程依照所述对战渲染标签所指示的显示时机和显示位置,在基于所述对战渲染信息对所述对战行为进行渲染的过程中,根据所述光环渲染信息对所述光环进行渲染。
在一些实施例中,所述场景控制模块1502,还用于:
通过所述场景控制进程缓存所述光环渲染信息;
通过所述场景控制进程根据缓存的所述光环渲染信息对所述光环进行渲染。
在一些实施例中,所述场景控制模块1502,还用于在进入所述回合制对战的情况下,通过所述场景控制进程获取所述参与方对应的基础信息和所述战斗场景对应的环境信息,以及向所述第二服务器发送所述基础信息和所述环境信息。
所述对战控制模块1501,还用于通过所述对战控制进程接收来自所述第一服务器的所述回合制对战的对战画面;其中,所述对战画面是所述第一服务器根据所述第二服务器发送的所述基础信息和所述环境信息生成的。
所述对战控制模块1501,还用于通过所述对战控制进程显示所述对战画面。
在一些实施例中,所述对战控制模块1501,还用于通过所述对战控制进程向所述第一服务器发送所述对战行为的渲染结束通知。
所述对战控制模块1501,还用于通过所述对战控制进程接收来自所述第一服务器的亲和度渲染动画;其中,所述亲和度渲染动画是所述第一服务器根据所述第二服务器发送的更新环境信息生成的动画,所述更新环境信息是指所述战斗场景经过所述光环影响后的环境信息,所述亲和度渲染动画用于表征所述参与方与所述更新环境信息之间的亲和度,所述亲和度影响所述参与方的技能强度。
所述对战控制模块1501,还用于通过所述对战控制进程显示所述亲和度渲染动画。
综上所述,本申请实施例提供的技术方案,通过第一服务器支持对战行为处理逻辑的实现以获取对战渲染信息,通过第二服务器支持光环处理逻辑的实现以获取光环渲染信息,由于上述过程能够通过在场景控制侧维护一份场景信息的方式,避免在对战控制侧和场景控制侧均维护一份对战行为处理逻辑和光环处理逻辑,从而显著降低了回合制对战在业务逻辑和数据维护上的冗余,进而降低了维护开销。
需要说明的是,上述实施例提供的装置,在实现其功能时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的装置与方法实施例属于同一构思,其实现过程详见方法实施例,这里不再赘述。
图16示出了本申请一个实施例提供的计算机系统的结构框图。该计算机系统1600包括:第一终端120、服务器140、第二终端160和第三终端180。
第一终端120安装和运行有支持虚拟世界的应用程序。该应用程序可以是三维地图程序、虚拟现实应用程序、增强现实程序、RPG程序、回合制游戏程序、回合制RPG程序中的任意一种。第一终端120是第一用户使用的终端,第一用户使用第一终端120控制位于虚拟世界中的第一虚拟角色进行活动。
第一终端120通过无线网络或有线网络与服务器140相连。
服务器140包括一台服务器、多台服务器、云计算平台和虚拟化中心中的至少一种。示例性的,服务器140包括处理器144和存储器142,存储器142又包括接收模块1421、控制模块1422和发送模块1423,接收模块1421用于接收客户端发送的请求,如检测敌方虚拟角色的位置请求;控制模块1422用于控制虚拟世界画面的渲染;发送模块1423用于向客户端发送响应,如向客户端发送第三虚拟角色的位置。服务器140用于为支持三维虚拟世界的应用程序提供后台服务。
第二终端160安装和运行有支持虚拟世界的应用程序。第二终端160是第二用户使用的终端,第二用户使用第二终端160控制位于虚拟世界中的第二虚拟角色进行活动,第二虚拟角色同样作为主控虚拟角色。第三终端180安装和运行有支持虚拟世界的应用程序。第三终端180是第三用户使用的终端,第三用户使用第三终端180控制位于虚拟世界中的第三虚拟角色进行活动。
可选地,第一虚拟角色、第二虚拟角色和第三虚拟角色处于同一虚拟世界中。第一虚拟角色和第二虚拟角色属于不同阵营,第二虚拟角色和第三虚拟角色属于同一阵营。
可选地,第一终端120、第二终端160和第三终端180上安装的应用程序是相同的,或三个终端上安装的应用程序是不同操作系统平台(安卓或IOS)上的同一类型应用程序。第一终端120可以泛指多个终端中的一个,第二终端160可以泛指多个终端中的一个,第三终端180可以泛指多个终端中的一个,本实施例仅以第一终端120、第二终端160和第三终端180来举例说明。第一终端120、第二终端160和第三终端180的设备类型相同或不同,该设备类型包括:智能手机、智能手表、智能电视、平板电脑、电子书阅读器、MP3播放器、MP4播放器、膝上型便携计算机和台式计算机中的至少一种。以下实施例以终端包括智能手机来举例说明。
本领域技术人员可以知晓,上述终端的数量可以更多或更少。比如上述终端可以仅为一个,或者上述终端为几十个或几百个,或者更多数量。本申请实施例对终端的数量和设备类型不加以限定。
图17示出了本申请一个实施例提供的计算机设备1700的结构框图。该计算机设备1700可以是便携式移动终端,比如:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器。计算机设备1700还可能被称为用户设备、便携式终端等其他名称。
通常,计算机设备1700包括有:处理器1701和存储器1702。
处理器1701可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。
存储器1702可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。
在一些实施例中,计算机设备1700还可选包括有:外围设备接口1703和至少一个外围设备。具体地,外围设备包括:射频电路1704、触摸显示屏1705、摄像头1706、音频电路1707和电源1708中的至少一种。
在一些实施例中,计算机设备1700还包括有一个或多个传感器1709。该一个或多个传感器1709包括但不限于:加速度传感器1710、陀螺仪传感器1711、压力传感器1712、光学传感器1713以及接近传感器1714。
本领域技术人员可以理解,图17中示出的结构并不构成对计算机设备1700的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
在一个示例性实施例中,还提供了一种计算机可读存储介质,所述存储介质中存储有计算机程序,所述计算机程序在被处理器执行时以实现上述基于回合制对战的界面显示方法,或上述基于回合制对战的信息提供方法。
可选地,该计算机可读存储介质可以包括:ROM(Read-Only Memory,只读存储器)、RAM(Random-Access Memory,随机存储器)、SSD(Solid State Drives,固态硬盘)或光盘等。其中,随机存取记忆体可以包括ReRAM(Resistance Random Access Memory,电阻式随机存取记忆体)和DRAM(Dynamic Random Access Memory,动态随机存取存储器)。
在一个示例性实施例中,还提供了一种计算机程序产品或计算机程序,所述计算机程序产品或计算机程序包括计算机指令,所述计算机指令存储在计算机可读存储介质中。计算机设备的处理器从所述计算机可读存储介质中读取所述计算机指令,所述处理器执行所述计算机指令,使得所述计算机设备执行上述基于回合制对战的界面显示方法,或上述基于回合制对战的信息提供方法。
需要说明的是,本申请所涉及的信息(包括但不限于对象设备信息、对象个人信息等)、数据(包括但不限于用于分析的数据、存储的数据、展示的数据等)以及信号,均为经对象授权或者经过各方充分授权的,且相关数据的收集、使用和处理需要遵守相关地区的相关法律法规和标准。例如,本申请中涉及到的虚拟角色、操作、世界场景、战斗场景等都是在充分授权的情况下获取的。
应当理解的是,在本文中提及的“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。另外,本文中描述的步骤编号,仅示例性示出了步骤间的一种可能的执行先后顺序,在一些其它实施例中,上述步骤也可以不按照编号顺序来执行,如两个不同编号的步骤同时执行,或者两个不同编号的步骤按照与图示相反的顺序执行。
以上所述仅为本申请的示例性实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (20)

  1. 一种基于回合制对战的界面显示方法,所述方法由客户端执行,所述方法包括:
    在检测到回合制对战的参与方发起对战行为的情况下,向第一服务器发送所述对战行为的发起请求,所述第一服务器用于对所述对战行为进行处理;
    接收来自所述第一服务器的对战渲染信息;其中,所述对战渲染信息用于对所述对战行为进行渲染;
    接收来自第二服务器的光环渲染信息,所述第二服务器用于对虚拟世界的世界环境进行处理;其中,所述光环渲染信息是所述第二服务器根据所述第一服务器发送的光环信息生成的信息,所述光环信息用于表征所述对战行为所触发的光环,所述光环对所述回合制对战的战斗场景中的元素产生影响;
    在基于所述对战渲染信息对所述对战行为进行渲染的过程中,根据所述光环渲染信息对所述光环进行渲染。
  2. 根据权利要求1所述的方法,其中,所述客户端中运行有对战控制进程和场景控制进程;
    所述对战控制进程在检测到所述参与方发起所述对战行为的情况下,向所述第一服务器发送所述对战行为的所述发起请求,并接收来自所述第一服务器的所述对战渲染信息;
    所述场景控制进程接收来自所述第二服务器的所述光环渲染信息,并向所述对战控制进程发送所述光环渲染信息;
    所述对战控制进程在基于所述对战渲染信息对所述对战行为进行渲染的过程中,根据所述光环渲染信息对所述光环进行渲染。
  3. The method according to claim 2, wherein the method further comprises:
    receiving, by the scene control process, a battle rendering tag from the second server, wherein the battle rendering tag is a tag generated by the first server according to the battle action in a case that the battle action satisfies triggering of the halo; and
    transmitting, by the scene control process, the battle rendering tag to the battle control process;
    wherein the rendering, by the battle control process, the halo according to the halo rendering information during rendering of the battle action based on the battle rendering information comprises:
    rendering, by the battle control process, the halo according to the halo rendering information during rendering of the battle action based on the battle rendering information, in accordance with a display timing and a display position indicated by the battle rendering tag.
  4. The method according to claim 2, wherein the method further comprises:
    caching, by the scene control process, the halo rendering information; and
    rendering, by the scene control process, the halo according to the cached halo rendering information.
  5. The method according to claim 2, wherein the method further comprises:
    in a case of entering the turn-based battle, obtaining, by the scene control process, basic information corresponding to the participant and environment information corresponding to the battle scene, and transmitting the basic information and the environment information to the second server;
    receiving, by the battle control process, a battle picture of the turn-based battle from the first server, wherein the battle picture is a picture generated by the first server according to the basic information and the environment information transmitted by the second server; and
    displaying, by the battle control process, the battle picture.
  6. The method according to claim 2, wherein after the battle control process renders the halo according to the halo rendering information during rendering of the battle action based on the battle rendering information, the method further comprises:
    transmitting, by the battle control process, a rendering end notification of the battle action to the first server;
    receiving, by the battle control process, an affinity rendering animation from the first server, wherein the affinity rendering animation is an animation generated by the first server according to updated environment information transmitted by the second server, the updated environment information refers to environment information of the battle scene after being affected by the halo, the affinity rendering animation is used for representing an affinity between the participant and the updated environment information, and the affinity affects a skill strength of the participant; and
    displaying, by the battle control process, the affinity rendering animation.
  7. A turn-based battle-based information providing method, wherein the method is performed by a server, the server comprises a first server and a second server, and the method comprises:
    receiving, by the first server, an initiation request of a battle action transmitted by a battle control process, and generating battle rendering information corresponding to the battle action according to the initiation request, wherein the battle action is an action initiated by a participant of a turn-based battle, and the battle rendering information is used for rendering the battle action;
    transmitting, by the first server, a call request to the second server in a case of determining that the battle action triggers a halo, wherein the call request comprises halo information used for representing the halo, and the halo affects an element in a battle scene of the turn-based battle;
    generating, by the second server, halo rendering information according to the call request, and transmitting the halo rendering information to a scene control process, the halo rendering information being transmitted by the scene control process to the battle control process, and the halo rendering information being used for rendering the halo; and
    transmitting, by the first server, the battle rendering information to the battle control process after receiving a delivery success notification of the halo rendering information from the second server.
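The ordering constraint recited in claim 7 (the first server withholds the battle rendering information until the second server reports successful delivery of the halo rendering information) can be sketched as follows. The synchronous in-process "servers", the message trace, and the payload values are illustrative assumptions, not part of the claimed method:

```python
events = []  # global trace of message sends, in chronological order


class SecondServer:
    """Stand-in for the second server: handles the call request."""
    def handle_call_request(self, halo_info):
        # Generate halo rendering info and deliver it to the scene control
        # process first, then acknowledge delivery to the first server.
        events.append(("to_scene_process", "halo_rendering_info", halo_info))
        return "delivery_success"


class FirstServer:
    """Stand-in for the first server: processes the battle action."""
    def __init__(self, second_server):
        self.second = second_server

    def handle_initiation_request(self, battle_action, triggers_halo):
        battle_rendering_info = {"action": battle_action}
        if triggers_halo:
            # The call request carries the halo information; block until the
            # second server confirms delivery of the halo rendering info.
            ack = self.second.handle_call_request({"halo": "rain_field"})
            assert ack == "delivery_success"
        # Only now is the battle rendering info sent to the battle control
        # process, so the client always has the halo info first.
        events.append(("to_battle_process", "battle_rendering_info",
                       battle_rendering_info))
```

The point of the sketch is the guaranteed order in `events`: the halo rendering information reaches the scene control process before the battle rendering information reaches the battle control process.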
  8. The method according to claim 7, wherein the method further comprises:
    generating, by the first server, a battle rendering tag according to the battle action in a case of determining that the battle action triggers the halo, wherein the battle rendering tag is used for indicating a display timing and a display position of the halo;
    transmitting, by the first server, the battle rendering tag to the second server; and
    transmitting, by the second server, the battle rendering tag to the scene control process, the battle rendering tag being transmitted by the scene control process to the battle control process.
  9. The method according to claim 7, wherein the turn-based battle comprises an intermediate-round battle and a final-round battle, and the intermediate-round battle refers to a battle in the turn-based battle other than the final-round battle;
    the method further comprises:
    in a case that the intermediate-round battle triggers a halo, transmitting, by the second server, the halo rendering information and a battle rendering tag generated by the first server to the scene control process;
    or,
    in a case that the final-round battle triggers a halo, transmitting, by the second server, the halo rendering information, the battle rendering tag, and a turn-based battle end notification to the scene control process.
  10. The method according to claim 7, wherein the generating, by the second server, halo rendering information according to the call request comprises:
    in a case that a historical halo already exists on a tile corresponding to the halo, generating, by the second server, the halo rendering information according to a relationship between the historical halo and the halo;
    wherein in a case that the relationship between the historical halo and the halo is an overriding relationship, the halo rendering information is used for canceling an effect of the historical halo on the tile and adding an effect of the halo on the tile; or, in a case that the relationship between the historical halo and the halo is a mutually exclusive relationship, the halo rendering information is used for maintaining the effect of the historical halo on the tile.
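The two tile-halo relationships recited in claim 10 (overriding versus mutual exclusion) amount to a small resolution rule. The following sketch is illustrative only; the relationship table, halo names, and function name are hypothetical, since the claim does not specify which halo pairs override one another:

```python
# Hypothetical table of (new_halo, historical_halo) pairs where the new halo
# overrides the historical one; any pair absent here is treated as mutually
# exclusive.
OVERRIDES = {("rain", "fire"): True, ("fire", "rain"): True}


def resolve_halo(historical_halo, new_halo):
    """Return the halo that should affect the tile after new_halo arrives."""
    if historical_halo is None:
        return new_halo  # empty tile: the new halo simply takes effect
    if OVERRIDES.get((new_halo, historical_halo), False):
        # Overriding relationship: cancel the historical halo's effect on
        # the tile and add the new halo's effect.
        return new_halo
    # Mutually exclusive relationship: keep the historical halo's effect.
    return historical_halo
```

With this table, a rain halo replaces an existing fire halo on a tile, whereas a halo with no overriding entry leaves the existing halo in place.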
  11. The method according to claim 7, wherein the method further comprises:
    receiving, by the second server, basic information corresponding to the participant and environment information corresponding to the battle scene that are transmitted by the scene control process;
    transmitting, by the second server, the basic information and the environment information to the first server; and
    generating, by the first server, a battle picture of the turn-based battle according to the basic information and the environment information, and transmitting the battle picture to the battle control process.
  12. The method according to claim 7, wherein the method further comprises:
    transmitting, by the first server after receiving a rendering end notification of the battle action transmitted by the battle control process, an acquisition request for updated environment information to the second server, wherein the updated environment information refers to environment information of the battle scene after being affected by the halo;
    transmitting, by the second server, the updated environment information to the first server;
    generating, by the first server, an affinity rendering animation according to the updated environment information, wherein the affinity rendering animation is used for representing an affinity between the participant and the updated environment information, and the affinity affects a skill strength of the participant; and
    transmitting, by the first server, the affinity rendering animation to the battle control process.
  13. The method according to claim 12, wherein the generating, by the first server, an affinity rendering animation according to the updated environment information comprises:
    determining, by the first server according to the updated environment information, a tile potential and a weather potential corresponding to the participant, wherein the tile potential is used for representing an effect of a type of a tile on the participant, and the weather potential is used for representing an effect of weather and time on the participant;
    determining, by the first server, an environment potential corresponding to the participant according to the tile potential and the weather potential corresponding to the participant; and
    generating, by the first server, the affinity rendering animation according to the environment potential corresponding to the participant and attribute information of the participant.
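The combination of potentials recited in claim 13 can be illustrated numerically. The claim does not fix a specific formula, so the additive combination, the lookup tables, and the attribute term below are assumptions made purely for illustration:

```python
# Hypothetical tables: effect of the tile type on the participant, and
# effect of (weather, time) on the participant.
TILE_POTENTIAL = {"grass": 2, "lava": -3}
WEATHER_POTENTIAL = {("rain", "day"): 1, ("rain", "night"): 3}


def affinity(tile, weather, time, attribute_bonus):
    """Affinity between the participant and the updated environment info.

    A higher affinity would translate into a stronger skill; unknown tiles
    and weather conditions contribute a neutral potential of 0.
    """
    tile_p = TILE_POTENTIAL.get(tile, 0)                 # tile potential
    weather_p = WEATHER_POTENTIAL.get((weather, time), 0)  # weather potential
    environment_p = tile_p + weather_p                   # environment potential
    # Attribute information of the participant enters as a simple bonus here.
    return environment_p + attribute_bonus
```

For example, a participant with an attribute bonus of 1 standing on grass during nighttime rain would get an affinity of 2 + 3 + 1 = 6 under these assumed tables.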
  14. A turn-based battle-based interface display apparatus, the apparatus comprising:
    a battle control module, configured to, in a case that a participant of a turn-based battle is detected to initiate a battle action, transmit an initiation request of the battle action to a first server, the first server being configured to process the battle action;
    the battle control module being further configured to receive battle rendering information from the first server, wherein the battle rendering information is used for rendering the battle action;
    a scene control module, configured to receive halo rendering information from a second server, the second server being configured to process a world environment of a virtual world, wherein the halo rendering information is information generated by the second server according to halo information transmitted by the first server, the halo information is used for representing a halo triggered by the battle action, and the halo affects an element in a battle scene of the turn-based battle; and
    the battle control module being further configured to, during rendering of the battle action based on the battle rendering information, render the halo according to the halo rendering information.
  15. The apparatus according to claim 14, wherein:
    the battle control module is further configured to, through a battle control process, in a case that the participant is detected to initiate the battle action, transmit the initiation request of the battle action to the first server, and receive the battle rendering information from the first server;
    the scene control module is further configured to receive, through a scene control process, the halo rendering information from the second server, and transmit the halo rendering information to the battle control process; and
    the battle control module is further configured to, through the battle control process, during rendering of the battle action based on the battle rendering information, render the halo according to the halo rendering information.
  16. The apparatus according to claim 15, wherein:
    the scene control module is further configured to receive, through the scene control process, a battle rendering tag from the second server, wherein the battle rendering tag is a tag generated by the first server according to the battle action in a case that the battle action satisfies triggering of the halo; and the scene control process transmits the battle rendering tag to the battle control process; and
    the battle control module is further configured to, through the battle control process, in accordance with a display timing and a display position indicated by the battle rendering tag, render the halo according to the halo rendering information during rendering of the battle action based on the battle rendering information.
  17. A computer system, comprising a client and a server, wherein the client is configured to perform the turn-based battle-based interface display method according to any one of claims 1 to 6, and the server is configured to perform the turn-based battle-based information providing method according to any one of claims 7 to 13.
  18. A computer device, comprising a processor and a memory, wherein the memory stores a computer program, and the computer program is loaded and executed by the processor to implement the turn-based battle-based interface display method according to any one of claims 1 to 6, or implement the turn-based battle-based information providing method according to any one of claims 7 to 13.
  19. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program is loaded and executed by a processor to implement the turn-based battle-based interface display method according to any one of claims 1 to 6, or implement the turn-based battle-based information providing method according to any one of claims 7 to 13.
  20. A computer program product, wherein the computer program product comprises computer instructions, the computer instructions are stored in a computer-readable storage medium, and a processor reads the computer instructions from the computer-readable storage medium and executes the computer instructions to implement the turn-based battle-based interface display method according to any one of claims 1 to 6, or implement the turn-based battle-based information providing method according to any one of claims 7 to 13.
PCT/CN2023/099637 2022-08-19 2023-06-12 Turn-based battle-based interface display method, turn-based battle-based information providing method, and system WO2024037153A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020247016180A Turn-based battle-based interface display method, turn-based battle-based information providing method, and system
US18/677,754 US20240316451A1 (en) 2022-08-19 2024-05-29 Turn-based battle-based interface display method, turn-based battle-based information providing method, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211000457.3 2022-08-19
CN202211000457.3A Turn-based battle-based interface display method, turn-based battle-based information providing method, and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/677,754 Continuation US20240316451A1 (en) 2022-08-19 2024-05-29 Turn-based battle-based interface display method, turn-based battle-based information providing method, and system

Publications (1)

Publication Number Publication Date
WO2024037153A1 true WO2024037153A1 (zh) 2024-02-22

Family

ID=89940553

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/099637 WO2024037153A1 (zh) 2022-08-19 2023-06-12 基于回合制对战的界面显示方法、信息提供方法及系统

Country Status (4)

Country Link
US (1) US20240316451A1 (zh)
KR (1) KR20240093633A (zh)
CN (1) CN117618929A (zh)
WO (1) WO2024037153A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013039388A1 (en) * 2011-09-13 2013-03-21 Koninklijke Jumbo B.V. Method of performing multi-user operations on a tablet type computer, computer program product and an arrangement comprising a tablet type computer and game pieces
CN109621413A (zh) * 2018-12-28 2019-04-16 腾讯科技(深圳)有限公司 游戏画面的渲染显示方法、装置、终端及存储介质
CN109847356A (zh) * 2019-02-25 2019-06-07 腾讯科技(深圳)有限公司 一种回合制游戏的数据处理方法、装置、终端及服务器
CN111346370A (zh) * 2020-02-18 2020-06-30 腾讯科技(深圳)有限公司 战斗内核的运行方法、装置、设备及介质
CN111475240A (zh) * 2020-03-25 2020-07-31 西安万像电子科技有限公司 数据处理方法及系统
CN111589121A (zh) * 2020-04-03 2020-08-28 北京冰封互娱科技有限公司 信息的显示方法和装置、存储介质、电子装置
CN113244614A (zh) * 2021-06-07 2021-08-13 腾讯科技(深圳)有限公司 图像画面展示方法、装置、设备及存储介质
CN113384879A (zh) * 2021-05-14 2021-09-14 完美世界征奇(上海)多媒体科技有限公司 游戏数据展示方法以及装置


Also Published As

Publication number Publication date
KR20240093633A (ko) 2024-06-24
CN117618929A (zh) 2024-03-01
US20240316451A1 (en) 2024-09-26

Similar Documents

Publication Publication Date Title
WO2022151946A1 (zh) Virtual character control method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN105705211B (zh) Game system, game control method, and game control program
TWI818351B (zh) Message sending method, apparatus, terminal, and medium in a multiplayer online battle program
CN114339438B (zh) Live-streaming-picture-based interaction method and apparatus, electronic device, and storage medium
JP2023126292A (ja) Information display method, apparatus, device, and program
CN112843682B (zh) Data synchronization method, apparatus, device, and storage medium
CN112416196A (zh) Virtual object control method, apparatus, device, and computer-readable storage medium
CN111686449A (zh) Virtual object control method, apparatus, terminal, and storage medium
US12029977B2 (en) Method and apparatus for generating special effect in virtual environment, device, and storage medium
WO2024119725A1 (zh) Buff virtual item sending method and apparatus, mobile terminal, and storage medium
WO2024037153A1 (zh) Turn-based battle-based interface display method, turn-based battle-based information providing method, and system
CN114307150B (zh) Interaction method between virtual objects, apparatus, device, medium, and program product
CN113144617B (zh) Virtual object control method, apparatus, device, and computer-readable storage medium
WO2024037151A1 (zh) Turn-based battle-based interface display method, apparatus, device, and medium
US12121822B2 (en) Method, apparatus, and terminal for transmitting message in multiplayer online battle program, and medium
WO2024152670A1 (zh) Virtual venue generation method, apparatus, device, medium, and program product
WO2023231557A1 (zh) Virtual object interaction method, apparatus, device, storage medium, and program product
US20240350919A1 (en) Method and apparatus for controlling virtual object, device, storage medium, and program product
CN117654039A (zh) Pickable-prop-based interaction method, apparatus, device, medium, and product
CN116650954A (zh) Game progress control method and apparatus, electronic device, and storage medium
CN118681207A (zh) Virtual-prop-based interaction method, apparatus, device, medium, and product
CN117018617A (zh) Game control method, apparatus, computer device, and storage medium
CN116999850A (zh) Virtual resource transfer method, related apparatus, device, and storage medium
CN118662888A (zh) Virtual resource acquisition method and apparatus, electronic device, and storage medium
CN115671722A (zh) Display method for object actions in virtual scene, apparatus, device, and program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23854058

Country of ref document: EP

Kind code of ref document: A1