CN114146412A - Display control method and device in game, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114146412A
CN114146412A
Authority
CN
China
Prior art keywords
game
attack
type
interaction
weapon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111372253.8A
Other languages
Chinese (zh)
Inventor
孙德中
周琳凤
李振
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111372253.8A priority Critical patent/CN114146412A/en
Publication of CN114146412A publication Critical patent/CN114146412A/en
Pending legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/53 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 — Controlling game characters or game objects based on the game progress
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F13/837 — Shooting of targets
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 — Details of the user interface
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 — Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the application disclose an in-game display control method and device, an electronic device, and a storage medium. The method includes the following steps: determining the interaction source of an interactive behavior that is launched by another game object and borne by the player's own game character; and generating a target three-dimensional indication model in a three-dimensional model container, where the target three-dimensional indication model includes a first indicating end and a second indicating end, the direction from the second indicating end to the first indicating end points toward the interaction source, and the direction from the first indicating end to the second indicating end points toward the center of the three-dimensional model container. Because the relative positional relationship between the three-dimensional model container in which the target three-dimensional indication model is generated and the own game character is fixed, the interactive behaviors borne by the own game character can be viewed stably and intuitively at a relatively fixed position no matter how the player operates the own game character. Furthermore, when the interactive behavior is an attack launched by another game object, the player can quickly locate the source of damage.

Description

Display control method and device in game, electronic equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a display control method and apparatus in a game, an electronic device, and a storage medium.
Background
In some existing shooting games, when the player-controlled game character is attacked, the attack direction is usually indicated by a flat, two-dimensional indicator displayed on the screen.
Disclosure of Invention
However, the applicant has found through long-term research that the related technical solutions have at least the following problem: on terrain dominated by multi-storey buildings and other three-dimensional structures, a flat indicator alone cannot intuitively indicate attacks that may come from any direction, so the player cannot quickly locate the source of damage.
The embodiments of the application provide an in-game display control method and device, an electronic device, and a storage medium, which can solve the problem in the prior art that the player cannot quickly locate the source of damage.
The embodiment of the application provides a display control method in a game. A graphical user interface is provided through a terminal device; the content displayed in the graphical user interface includes at least a game scene and a three-dimensional model container; the game scene includes an own game character controlled through the terminal device; the relative positional relationship between the three-dimensional model container and the own game character is fixed; and the three-dimensional model container is used to contain a three-dimensional indication model;
the method comprises the following steps:
determining the interaction source of an interactive behavior that is launched by another game object and borne by the own game character;
and generating a target three-dimensional indication model in the three-dimensional model container, where the target three-dimensional indication model includes a first indicating end and a second indicating end, the direction from the second indicating end to the first indicating end points toward the interaction source of the interactive behavior launched by the other game object, and the direction from the first indicating end to the second indicating end points toward the center of the three-dimensional model container.
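The orientation of the two indicating ends described above can be sketched as follows; this is a minimal illustration, not part of the claimed method, and all function and parameter names are assumptions:

```python
import math

def indicator_orientation(container_center, interaction_source):
    """Return a unit vector from the second indicating end toward the first.

    The first end points at the interaction source and the second end points
    back at the container center, so the indicator lies along the line from
    the center toward the source.
    """
    dx = interaction_source[0] - container_center[0]
    dy = interaction_source[1] - container_center[1]
    dz = interaction_source[2] - container_center[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0:
        raise ValueError("interaction source coincides with container center")
    return (dx / length, dy / length, dz / length)
```

The same vector, negated, gives the direction from the first indicating end to the second, i.e. toward the container center.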
The embodiment of the application also provides a display control device in a game. A graphical user interface is provided through a terminal device; the content displayed in the graphical user interface includes at least a game scene and a three-dimensional model container; the game scene includes an own game character controlled through the terminal device; the relative positional relationship between the three-dimensional model container and the own game character is fixed; and the three-dimensional model container is used to contain a three-dimensional indication model;
the device comprises:
an interaction source determining unit, used for determining the interaction source of an interactive behavior launched by another game object and borne by the own game character;
and a model generating unit, used for generating a target three-dimensional indication model in the three-dimensional model container, where the target three-dimensional indication model includes a first indicating end and a second indicating end, the direction from the second indicating end to the first indicating end points toward the interaction source of the interactive behavior launched by the other game object, and the direction from the first indicating end to the second indicating end points toward the center of the three-dimensional model container.
In some embodiments, the apparatus further comprises:
an interaction type determining unit, used for determining the interaction type of the interactive behavior launched by the other game object and borne by the own game character;
a target map determining unit, used for determining, according to the correspondence between interaction types and model maps, the target map corresponding to the interaction type of the interactive behavior launched by the other game object, where the target map is applied to the target three-dimensional indication model.
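The correspondence between interaction types and model maps can be sketched as a simple lookup table; the type names and texture paths below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical mapping from interaction type to the texture (map)
# applied to the target three-dimensional indication model.
TYPE_TO_MAP = {
    "direct_hit": "textures/indicator_hit.png",
    "near_miss": "textures/indicator_graze.png",
    "explosion": "textures/indicator_blast.png",
    "healing": "textures/indicator_heal.png",
}

def target_map_for(interaction_type, default="textures/indicator_generic.png"):
    """Return the texture to apply for a given interaction type."""
    return TYPE_TO_MAP.get(interaction_type, default)
```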
In some embodiments, the interactive behavior launched by the other game object is an attack behavior launched by an attacker, the interaction source is an attack source, and the interaction type is an attack type; the interaction type determining unit includes:
an attack type subunit, used for determining the attack type borne by the own game character according to the attack source, the attack weapon, the attack parameters, and the displacement information of the own game character, where the attack type borne by the own game character is the interaction type.
In some embodiments, the interaction type determining unit further includes:
the attack data receiving subunit is configured to receive attack data information sent by a server, where the attack data information is information corresponding to the attack behavior launched by the attacker toward the own game character, and the attack data information includes an attack source, an attack weapon, and attack parameters.
In some embodiments, the attack weapon is a projectile weapon, the attack source is the position from which the projectile weapon fires its projectile, and the attack parameter is the attack angle;
the attack type subunit is specifically used for judging whether the projectile fired by the projectile weapon hits the own game character;
and if the own game character is hit, determining that the attack type borne by the own game character is a first attack type.
In some embodiments, the attack type subunit is further specifically configured to, if the own game character is not hit, judge whether the projectile of the projectile weapon passes through a capsule body wrapping the own game character, where the capsule body is a capsule-shaped structure enclosing the own game character;
and if the projectile passes through the capsule body wrapping the own game character, determining that the attack type borne by the own game character is a second attack type.
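The distinction above between a direct hit (first attack type) and a projectile grazing the capsule body (second attack type) can be sketched as follows. The capsule test is approximated by coarsely sampling the projectile's path; the function, field names, and sampling approach are all illustrative assumptions:

```python
import math

def _point_segment_dist(p, a, b):
    """Distance from point p to segment ab (3-D tuples)."""
    ab = tuple(b[i] - a[i] for i in range(3))
    ap = tuple(p[i] - a[i] for i in range(3))
    ab2 = sum(c * c for c in ab)
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / ab2))
    closest = tuple(a[i] + t * ab[i] for i in range(3))
    return math.dist(p, closest)

def classify_projectile_attack(hit, traj_start, traj_end,
                               capsule_bottom, capsule_top, radius,
                               samples=64):
    """Return 'first' for a direct hit, 'second' for a projectile that
    passes through the capsule wrapping the character, else None."""
    if hit:
        return "first"
    # Coarsely sample the projectile path and test each point against
    # the capsule (axis segment plus radius).
    for i in range(samples + 1):
        t = i / samples
        p = tuple(traj_start[k] + t * (traj_end[k] - traj_start[k]) for k in range(3))
        if _point_segment_dist(p, capsule_bottom, capsule_top) <= radius:
            return "second"
    return None
```

A production implementation would more likely use an exact segment-capsule intersection test or the engine's physics query, but the sampled version shows the classification logic.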
In some embodiments, the attack weapon is an explosive weapon, the attack source is the drop point of the explosive weapon, and the attack parameter is the explosion radius;
the attack type subunit is specifically used for determining the blast coverage range of the explosive weapon according to its drop point and explosion radius;
and if the own game character falls within the blast coverage range, determining that the attack type borne by the own game character is a third attack type.
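The blast-coverage test can be sketched as a simple distance check against a spherical blast range; the function names are illustrative assumptions:

```python
import math

def in_blast_range(character_pos, drop_point, explosion_radius):
    """True if the character is inside the spherical blast coverage
    defined by the drop point and explosion radius."""
    return math.dist(character_pos, drop_point) <= explosion_radius

def classify_explosive_attack(character_pos, drop_point, explosion_radius):
    """Return 'third' (third attack type) when the character is caught
    in the blast, else None."""
    if in_blast_range(character_pos, drop_point, explosion_radius):
        return "third"
    return None
```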
In some embodiments, the apparatus further comprises:
an icon generating unit, configured to generate an explosion type icon at a preset distance from the target three-dimensional indication model, where the explosion type icon is used to indicate a weapon type of the explosive weapon.
In some embodiments, the interactive behavior launched by the other game object is an attack behavior launched by an attacker, the interaction source is an attack source, and the interaction type is an attack type; the interaction type determining unit includes:
a data receiving subunit, configured to receive attack data information sent by a server, where the attack data information corresponds to the attack behavior launched by the attacker toward the own game character and includes an attack source, an attack weapon, and attack parameters;
a preliminary-screening attack determining subunit, used for determining the preliminary attack type borne by the own game character according to the attack source, the attack weapon, the attack parameters, and the displacement information of the own game character;
an attack verification subunit, used for sending the displacement information of the own game character and the preliminary attack type to the server, so that the server verifies the preliminary attack type;
and a final attack subunit, used for determining the final attack type according to the feedback information returned by the server, where the final attack type is the interaction type.
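The client-side verification round trip described above can be sketched as follows; the message shape and field names are illustrative assumptions, and `send_to_server` stands in for whatever transport the game actually uses:

```python
def resolve_attack_type(preliminary_type, displacement, send_to_server):
    """Send the locally screened (preliminary) attack type plus the
    character's displacement to the server for verification, then adopt
    the type the server confirms.

    If the server's feedback carries no correction, the preliminary
    type is kept.
    """
    feedback = send_to_server({
        "type": preliminary_type,
        "displacement": displacement,
    })
    return feedback.get("confirmed_type", preliminary_type)
```

Keeping the preliminary screening on the client lets the indicator appear immediately, while the server round trip guards against desynchronized or tampered state.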
In some embodiments, the three-dimensional model container contains at most N three-dimensional indication models, N being a positive integer, and each interactive behavior launched by another game object has a corresponding prompt duration;
the device further comprises:
a first quantity determining unit, configured to, when the number of interactive behaviors borne by the own game character at the same time is not greater than N, perform the step of "determining the interaction source of an interactive behavior launched by another game object and borne by the own game character" for each interactive behavior borne by the own game character;
a second quantity determining unit, configured to, when the number of interactive behaviors borne by the own game character at the same time is greater than N, determine, according to the trigger time of each interactive behavior and the priority of its interaction type, the N interactive behaviors that are still within their prompt duration and have the highest priority, and perform the step of "determining the interaction source of an interactive behavior launched by another game object and borne by the own game character" for each of those N interactive behaviors.
The embodiment of the present application further provides a computer-readable storage medium storing a plurality of instructions, the instructions being suitable for being loaded by a processor to execute the steps of any in-game display control method provided in the embodiments of the present application.
According to the in-game display control method of the application, the interactive behaviors launched by other game objects and borne by the own game character can be obtained, and the interaction source corresponding to each interactive behavior determined. A target three-dimensional indication model is then generated in a three-dimensional model container that has a fixed relative positional relationship with the own game character. The first indicating end of the target three-dimensional indication model points toward the interaction source, the second indicating end points toward the center of the three-dimensional model container, and the two indicating ends face in opposite directions.
In the application, the interaction source of an interactive behavior can be indicated intuitively by having the first indicating end of the target three-dimensional indication model point at the interaction source and the second indicating end point at the center of the three-dimensional model container. Because the relative positional relationship between the three-dimensional model container in which the target three-dimensional indication model is generated and the own game character is fixed, no matter how the player controls the own game character, the player can stably and intuitively view the relative orientation, within the game scene, of the interactive behaviors borne by the own game character. Furthermore, when the interactive behavior is an attack launched by another game object, the player can quickly locate the source of damage.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1a is a schematic view of a scene of a display control method in a game according to an embodiment of the present application;
FIG. 1b is a schematic flow chart illustrating a method for controlling display in a game according to an embodiment of the present application;
FIG. 1c is a diagram of an application scenario of the display control method in a game;
FIG. 1d (1) shows another embodiment of the three-dimensional model container of FIG. 1c;
FIG. 1d (2) shows yet another embodiment of the three-dimensional model container of FIG. 1c;
FIG. 1e is a diagram of another application scenario of the in-game display control method;
FIG. 1f is a diagram of another application scenario of an in-game display control method;
FIG. 2 is a flow chart illustrating a method for controlling display in a game according to another embodiment of the present application;
FIG. 3 is a schematic diagram of a display control apparatus in a game according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a display control method and device in a game, electronic equipment and a storage medium.
The in-game display control device may be specifically integrated in an electronic device, and the electronic device may be a terminal, a server, or another device. The terminal may be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, or a personal computer (PC); the server may be a single server or a server cluster composed of multiple servers.
In some embodiments, the in-game display control apparatus may also be integrated in a plurality of electronic devices, for example, the in-game display control apparatus may be integrated in a plurality of servers, and the in-game display control method of the present application is implemented by the plurality of servers.
In some embodiments, the server may also be implemented in the form of a terminal.
For example, referring to fig. 1a, the electronic device described above may perform the following method: providing a graphical user interface through a terminal device, where the content displayed in the graphical user interface includes at least a game scene and a three-dimensional model container, the game scene includes an own game character controlled through the terminal device, the relative positional relationship between the three-dimensional model container and the own game character is fixed, and the three-dimensional model container is used to contain a three-dimensional indication model. The method includes: determining the interaction source of an interactive behavior launched by another game object and borne by the own game character; and generating a target three-dimensional indication model in the three-dimensional model container, where the target three-dimensional indication model includes a first indicating end and a second indicating end, the direction from the second indicating end to the first indicating end points toward the interaction source of the interactive behavior launched by the other game object, and the direction from the first indicating end to the second indicating end points toward the center of the three-dimensional model container.
The display control method in the game in one embodiment of the disclosure can be operated on a terminal device or a server. The terminal device may be a local terminal device. When the display control method in the game is operated on the server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an optional embodiment, various cloud applications may run on the cloud interaction system, for example cloud games. A cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the main body that runs the game program is separated from the main body that presents the game picture: the storage and execution of the in-game display control method are completed on a cloud game server, while the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a terminal, a television, a computer, or a handheld computer; the terminal device that performs the in-game display control, however, is the cloud game server in the cloud. When playing, the user operates the client device to send operation instructions, such as touch operation instructions, to the cloud game server; the cloud game server runs the game according to those instructions, encodes and compresses data such as the game picture, and returns them to the client device through the network; finally, the client device decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores a game program and is used for presenting a game screen. The local terminal device is used for interacting with a user through a graphical user interface, namely, a game program is downloaded and installed and operated through the electronic device conventionally. The manner in which the local terminal device provides the graphical user interface to the user may include a variety of ways, for example, it may be rendered for display on a display screen of the terminal or provided to the user by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game screen and a processor for running the game, generating the graphical user interface, and controlling display of the graphical user interface on the display screen.
A game scene (or referred to as a virtual scene) is a virtual scene that an application program displays (or provides) when running on a terminal or a server. Optionally, the virtual scene is a simulated environment of the real world, or a semi-simulated semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene is any one of a two-dimensional virtual scene and a three-dimensional virtual scene, and the virtual environment can be sky, land, sea and the like, wherein the land comprises environmental elements such as deserts, cities and the like. For example, in a sandbox type 3D shooting game, the virtual scene is a 3D game world for the user to control the virtual object to play against, and an exemplary virtual scene may include: at least one element selected from the group consisting of mountains, flat ground, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.
The game interface is an interface, corresponding to the application program, that is provided or displayed through the graphical user interface. It includes the UI through which the user interacts and the game picture, where the game picture is a picture of the game scene.
In alternative embodiments, game controls (e.g., skill controls, behavior controls, functionality controls, etc.), indicators (e.g., display control indicators in the game, character indicators, etc.), information presentation areas (e.g., number of clicks, game play time, etc.), or game setting controls (e.g., system settings, stores, coins, etc.) may be included in the UI interface.
In an optional embodiment, the game screen is a display screen corresponding to a virtual scene displayed by the terminal device, and the game screen may include a game object performing game logic in the virtual scene, a Non-Player Character (NPC), an Artificial Intelligence (AI) Character, and other virtual objects.
For example, in some embodiments, the content displayed in the graphical user interface at least partially comprises a game scene, wherein the game scene comprises at least one game object.
In some embodiments, the game objects in the game scene comprise virtual objects, i.e., user objects, manipulated by the player user.
The game object refers to a virtual object in a virtual scene, including a game character, which is a dynamic object that can be controlled, i.e., a dynamic virtual object. Alternatively, the dynamic object may be a virtual character, a virtual animal, an animation character, or the like. The virtual object is a character controlled by a user through an input device, or an AI set in a virtual environment match-up through training, or an NPC set in a virtual scene match-up.
Optionally, the virtual object is a virtual character playing a game in a virtual scene. Optionally, the number of virtual objects in the virtual scene match is preset, or dynamically determined according to the number of clients participating in the match, which is not limited in the embodiment of the present application.
In one possible implementation, the user can control the virtual object to perform game behaviors in the virtual scene. Game behaviors may include moving, releasing skills, using props, dialogue, and the like, for example controlling the virtual object to run, jump, or crawl; the user may also control the virtual object to fight other virtual objects using the skills, virtual props, and the like provided by the application program.
The virtual camera is a necessary component for presenting game scene pictures. A game scene corresponds to at least one virtual camera, and two or more virtual cameras may be used according to actual needs. The virtual camera serves as a game rendering window that captures and presents the picture content of the game world to the user. By setting the parameters of the virtual camera, the user's viewing angle on the game world can be adjusted, for example between a first-person and a third-person perspective.
In an optional implementation manner, an embodiment of the present invention provides a display control method in a game, where a graphical user interface is provided by a terminal device, where the terminal device may be the aforementioned local terminal device, and may also be the aforementioned client device in a cloud interaction system.
The following are detailed below. The numbers in the following examples are not intended to limit the order of preference of the examples.
In this embodiment, a display control method in a game is provided, as shown in fig. 1b, where the display control method in a game is applied to a terminal, and a specific flow of the method may be as follows:
110. Determining the interaction source of an interactive behavior launched by another game object and borne by the own game character.
The own game character is the game character controlled by the player through an input device. The input device may be a mouse, a keyboard, or the like, or the touch screen of the terminal device; the specific type of input device should not be construed as limiting the application.
The other game objects are objects in the game scene that can interact with the own game character. They may be defense towers, neutral creatures, and the like, or other game characters in the same camp as, or a different camp from, the own game character. The other game characters may be controlled by corresponding players or by AI; the specific type of the other game objects should not be construed as limiting the application.
The interactive behavior is a behavior launched by another game object that can affect the state of the own game character in the game scene. For example, the interactive behavior may be an attack behavior, a healing behavior, an alert behavior, and the like.
The interaction source is the position point in the game scene that triggers the interactive behavior. Different interactive behaviors have different corresponding interaction sources. For example, a healing behavior may be initiated by another game object through a skill of its own with a curative effect, in which case the interaction source of the healing behavior is that other game object; an alert behavior may be initiated by an alert prop with acousto-optic effects thrown by another game object, in which case the interaction source of the alert behavior is the alert prop.
For the same kind of interactive behavior, the corresponding interaction sources differ when the game props used by the other game objects differ. For example, an attack behavior may be launched by a shooting weapon held by another game object, or by an explosive weapon thrown by another game object. For an attack launched by a projectile weapon, the interaction source may be the firing point of the projectile weapon; for an attack launched by an explosive weapon, the interaction source may be the drop point of the explosive weapon. The projectile weapon may be a gun, a bow, an energy gun, and the like, and the explosive weapon may be a flash bomb, a smoke bomb, a bomb, and the like.
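The weapon-dependent choice of interaction source can be sketched as a small dispatch function; the event dictionary and its field names are illustrative assumptions:

```python
def interaction_source(event):
    """Return the scene position that triggered the interaction.

    For a projectile weapon the source is its firing point; for an
    explosive weapon it is the weapon's drop point; other behaviors
    (healing skills, thrown alert props, ...) report their own origin.
    """
    if event["weapon_kind"] == "projectile":
        return event["firing_point"]
    if event["weapon_kind"] == "explosive":
        return event["drop_point"]
    return event["origin"]
```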
120. And generating a target three-dimensional indication model in the three-dimensional model container, wherein the target three-dimensional indication model comprises a first indication end and a second indication end, the direction from the second indication end to the first indication end points to the interaction source of the interaction behaviors launched by the other game objects, and the direction from the first indication end to the second indication end points to the center of the three-dimensional model container.
The three-dimensional model container is a coordinate area in which the three-dimensional indication model is generated and allowed to stay. This coordinate area has a fixed relative position relationship with the virtual camera representing the game player's viewpoint, so the three-dimensional model container moves along with the player's game character and always remains at a preset position of the graphical user interface within the game player's view angle, allowing the game player to check the three-dimensional indication model intuitively and quickly. Referring to fig. 1c, the center of the three-dimensional model container A may be located directly below the sight bead; in some embodiments, the center of the three-dimensional model container A may also overlap with the sight bead.
In an alternative embodiment, the three-dimensional model container is positioned in front of the own game character, and the first indicating end or the second indicating end is positioned on the scene ground of the game scene. Compared with the existing indication scheme displayed on the UI layer of the user interface, this can better display the interaction source information in three dimensions; compared with other embodiments disclosed in the application, this display mode blends more naturally into the rendered game scene, and the displayed information is located within the player's visual focus range.
Referring to fig. 1c, the three-dimensional model container may be a translucent ellipsoidal container located below the shooting weapon held by the game player. It should be understood that the three-dimensional model container may be ellipsoidal, as shown at A in fig. 1c; it may also take other shapes, for example, spherical, as shown at A in (1) in fig. 1d, or cubic, as shown at A in (2) in fig. 1d. The specific shape of the three-dimensional model container should not be construed as limiting the application. B shown in fig. 1c, (1) in fig. 1d, and (2) in fig. 1d are all target three-dimensional indication models.
Optionally, in some embodiments, the three-dimensional model container may also be fully transparent; that is, from the game player's view, the container itself cannot be seen, and only the three-dimensional indication model generated in it is visible, as shown at B in fig. 1e.
The target three-dimensional indication model is a three-dimensional indication model used for indicating an interaction source. The target three-dimensional indication model comprises a first indication end and a second indication end which are opposite in direction, wherein the first indication end points to the interaction source, and the second indication end points to the center of the three-dimensional model container.
Referring to fig. 1c to fig. 1e, the target three-dimensional indication model may be a triangular pyramid model, wherein the sharp end of the triangular pyramid may be the first indication end, and the bottom surface of the triangular pyramid opposite to the sharp end may be the second indication end. It should be understood that the target three-dimensional indication model may also take other shapes, such as a three-dimensional arrow, and the specific shape of the target three-dimensional indication model should not be construed as limiting the application.
The center positions of three-dimensional model containers with different shapes are different. If the three-dimensional model container is ellipsoidal, its center is the intersection point of the two symmetry axes of the ellipsoid, i.e. the point O in fig. 1c; if it is spherical, its center is the sphere center, i.e. the Ox point in (1) in fig. 1d; if it is a cube, its center is the intersection point of two body diagonals of the cube, i.e. the Oy point in (2) in fig. 1d.
In the above embodiment, the interactive behavior launched by another game object and borne by the own game character may be obtained, and the interaction source corresponding to the interactive behavior may be determined. A target three-dimensional indication model is then generated in a three-dimensional model container having a fixed relative positional relationship with the own game character. The first indication end of the target three-dimensional indication model points to the interaction source, the second indication end points to the center of the three-dimensional model container, and the two indication ends face in opposite directions. Because of the relatively fixed positional relationship between the target three-dimensional indication model and the three-dimensional model container, the direction from the second indication end toward the first indication end becomes more intuitive, which helps reduce the understanding threshold for the game player.
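The pointing relationship described above reduces to a small vector computation: the indicator's axis is the normalized horizontal vector from the container center toward the interaction source, with the first indication end on the far side and the second indication end facing the center. The following Python sketch illustrates this under an assumed coordinate convention (y up, x/z horizontal); the function name and layout are illustrative only, not the application's actual implementation.

```python
import math

def indicator_direction(container_center, interaction_source):
    """Unit vector from the container center toward the interaction
    source, projected onto the horizontal (x, z) plane. Hypothetical
    helper for the sketch; coordinates are (x, y, z) tuples."""
    dx = interaction_source[0] - container_center[0]
    dz = interaction_source[2] - container_center[2]
    length = math.hypot(dx, dz)
    if length == 0:
        # Source directly above/below (or coincident): no direction.
        return (0.0, 0.0, 0.0)
    return (dx / length, 0.0, dz / length)

# The first indication end sits on the container surface along this
# direction; the second indication end faces back toward the center.
center = (0.0, 0.0, 0.0)
source = (3.0, 0.0, 4.0)
print(indicator_direction(center, source))  # → (0.6, 0.0, 0.8)
```

In an engine, this direction would drive the orientation of the triangular pyramid model each frame as the character or the interaction source moves.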
In the application, the interaction source of the interactive behavior can be intuitively indicated by having the first indication end of the target three-dimensional indication model point to the interaction source and the second indication end point to the center of the three-dimensional model container. Because the relative positional relationship between the three-dimensional model container and the own game character is fixed, no matter how the game player controls the own game character, the indication of the interactive behavior borne by the own game character always remains relatively stable at a preset position in the game scene, and the game player can check it stably and intuitively.
When the interactive behavior is an attack behavior initiated by another game object, the embodiment of the application can also enable the game player to quickly locate the source of the damage.
Optionally, in a specific implementation manner, before step 120, the method provided in the embodiment of the present application may further include the following steps 111 to 112:
Step 111, determining the interaction type of the interactive behavior launched by another game object and borne by the own game character.
The interaction type is used to reflect the manifestation of the interactive behavior. The same interactive behavior can be divided into a plurality of types according to its triggering mode. For example, taking the interactive behavior being a therapeutic behavior as an example, the therapeutic behavior may be triggered by another game object through a skill with a curative effect, or by a virtual drug or virtual food provided by another game object. Both triggering modes trigger a therapeutic behavior, but their manifestations differ. Therefore, the therapeutic behaviors corresponding to the two triggering modes can be regarded as two interaction types of the therapeutic behavior, namely two treatment types.
For another example, taking the interactive behavior being an attack behavior as an example, the attack behavior may be triggered by a shooting weapon held by another game object, or by an explosive weapon thrown by another game object. Although both triggering modes trigger an attack behavior, their manifestations differ. Therefore, the attack behaviors corresponding to the two triggering modes can be regarded as two interaction types of the attack behavior, namely two attack types.
Step 112, determining a target map corresponding to the interaction type of the interactive behavior initiated by the other game object according to the correspondence between interaction types and model maps; wherein the target three-dimensional indication model is attached with the target map.
The model map is a map that can be attached to the surface of a three-dimensional indication model. In order to enable game players to visually distinguish interactive behaviors of different interaction types, the correspondence between interaction types and model maps can be established in advance. In particular, interaction types and model maps may correspond one-to-one.
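Such a one-to-one correspondence can be as simple as a lookup table. The sketch below is a hypothetical Python illustration; the type names and map asset names are invented for the example and are not part of the application.

```python
# Hypothetical one-to-one correspondence between interaction types and
# model maps (texture assets attached to the indication model).
TYPE_TO_MAP = {
    "first_attack_type": "map_hit.png",         # suffered a shooting injury
    "second_attack_type": "map_near_miss.png",  # risk of shooting injury
    "third_attack_type": "map_explosion.png",   # risk of explosion attack
    "therapeutic_type": "map_heal.png",
}

def target_map_for(interaction_type):
    """Return the map to attach to the target 3-D indication model."""
    return TYPE_TO_MAP[interaction_type]

print(target_map_for("third_attack_type"))  # → map_explosion.png
```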
In the above embodiment, before the target three-dimensional indication model is generated, the interaction type of the interactive behavior may be determined in addition to the interaction source. After the interaction type of the interactive behavior launched by another game object is determined, the target map corresponding to that interaction type can be obtained, so that the target map can be attached to the target three-dimensional indication model when the model is generated. With the target map attached, the game player can intuitively learn both the interaction source and the interaction type while viewing the target three-dimensional indication model, which reduces the player's comprehension cost and helps the player acquire as much information as possible in a fast-paced game environment.
Optionally, for convenience of description, the following description is continued by taking as an example that the interactive behavior launched by the other game object is an attack behavior launched by an attacker, the interactive source is an attack source, and the interactive type is an attack type. In a specific embodiment, the step 111 may specifically include the following step S2:
step S2, determining an attack type borne by the own game character according to the attack source, the attack weapon, the attack parameter, and the displacement information of the own game character, wherein the attack type borne by the own game character is the interaction type.
The terminal device can determine the range within which the attack behavior can cause damage according to the attack source, the attack weapon and the attack parameter, and then compare the displacement information of the own game character with that range, thereby determining the attack type borne by the own game character.
Optionally, in a specific implementation manner, before the step S2, an embodiment of the present application may further include the following step S1:
step S1, receiving attack data information sent by a server, where the attack data information is information corresponding to the attack behavior launched by the attacker toward the own game character, and the attack data information includes an attack source, an attack weapon, and attack parameters.
The attacker is a game character that initiates an attack on the own game character; it may be controlled by another player or by AI. It should be understood that the subject controlling the attacker should not be construed as limiting the application. The attack source is the position point in the game scene from which the attack behavior comes. The attack parameter is a parameter reflecting the attack effect. The attack weapon is a virtual weapon held by the attacker. The attack source and attack parameters may vary from one attack weapon to another.
In a specific embodiment, if the attack weapon is a shooting weapon, the attack source is the position from which the shooting weapon fires a projectile, and the attack parameter is an attack angle. Step S2 may specifically include the following steps A1 to A4:
Step A1, determining whether the projectile of the shooting weapon hits the own game character; if so, executing step A2; if not, going to step A3.
The projectile is an object launched by a shooting weapon that can damage a game character in the game scene. The projectile may be a bullet fired by a firearm or an arrow fired by a bow, and the specific type of projectile should not be construed as limiting the application.
Optionally, in a specific embodiment, step A1 specifically includes the following steps: determining the shooting range of the shooting weapon according to the attack angle and the attack source of the shooting weapon; and determining whether the projectile of the shooting weapon hits the own game character according to the shooting range and the displacement information of the own game character. In the above embodiment, the damage range of the weapon can be determined from the attack angle and the attack source, so that the damage range can be compared with the displacement information of the own game character to determine whether the projectile hits the own game character. Judging according to whether the damage range includes the displacement track of the own game character can effectively improve the accuracy of the judgment.
If the projectile hits the own game character, the own game character has suffered a shooting injury, and the process goes to step A2; if the projectile misses the own game character, the own game character has not suffered a shooting injury, and the process goes to step A3.
Step A2, determining the attack type borne by the own game character as a first attack type.
The first attack type is a type characterizing that the own game character has suffered a shooting injury.
Step A3, determining whether the projectile of the shooting weapon passes through the capsule body wrapping the own game character; if so, executing step A4.
The capsule body is a capsule-shaped structure wrapping the own game character. If the projectile of the shooting weapon passes through the capsule body wrapping the own game character, the own game character is at risk of suffering a shooting injury.
Optionally, in a specific embodiment, step A3 may specifically include the following steps: determining the shooting range of the shooting weapon according to the attack angle and the attack source of the shooting weapon; and determining whether the projectile of the shooting weapon passes through the capsule body wrapping the own game character according to the shooting range and the displacement information of the own game character. In the above embodiment, the damage range of the weapon can be determined from the attack angle and the attack source, so that the damage range can be compared with the displacement information of the own game character to determine whether the projectile passes through the capsule body wrapping the own game character. Judging according to whether the damage range includes the displacement track of the own game character can effectively improve the accuracy of the judgment.
Step A4, determining the attack type borne by the own game character as a second attack type.
The second attack type is a type characterizing that the own game character is at risk of suffering a shooting injury.
In the above embodiment, whether the own game character is hit can be determined according to the relationship between the range within which the attack behavior can cause damage and the displacement information of the own game character; if hit, the attack type is determined to be the first attack type. If not, a further judgment is made: whether the projectile passes through the capsule body, and if so, the attack type is determined to be the second attack type. Through this two-step judgment, both the situation in which the own game character has suffered a shooting injury and the situation in which it is at risk of suffering one are obtained and can be intuitively displayed in the target three-dimensional indication model, so that the game player can quickly learn the situation of the own game character and react quickly. This allows the game player to obtain information timely and accurately in a fast-paced game, making the game experience smoother.
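As a rough top-down (2-D) illustration of the two-step judgment in steps A1 to A4, the projectile path can be approximated as a ray from the attack source along the attack angle, a hit as the ray passing within a small hit radius of the character, and a capsule pass as the ray passing within the larger capsule radius. The radii, names, and 2-D simplification below are all assumptions made for the sketch, not the application's actual geometry.

```python
import math

def classify_shot(attack_source, attack_angle_deg, char_pos,
                  hit_radius=0.3, capsule_radius=1.0):
    """Steps A1-A4 on a top-down plane (illustrative sketch):
    within hit_radius of the shot line -> 'first' (hit, step A2);
    within capsule_radius              -> 'second' (near miss, step A4);
    otherwise no indication is generated."""
    # Shot direction derived from the attack angle.
    dx = math.cos(math.radians(attack_angle_deg))
    dz = math.sin(math.radians(attack_angle_deg))
    rx = char_pos[0] - attack_source[0]
    rz = char_pos[1] - attack_source[1]
    t = rx * dx + rz * dz          # projection along the shot direction
    if t < 0:                      # character is behind the muzzle
        return None
    # Perpendicular distance from the character to the projectile line.
    dist = math.hypot(rx - t * dx, rz - t * dz)
    if dist <= hit_radius:
        return "first"             # suffered a shooting injury
    if dist <= capsule_radius:
        return "second"            # risk of shooting injury
    return None

print(classify_shot((0, 0), 0.0, (10, 0)))    # → first
print(classify_shot((0, 0), 0.0, (10, 0.6)))  # → second
```

A production engine would instead intersect the projectile's trajectory with the character's actual hitboxes and capsule collider, but the hit/near-miss ordering of the checks is the same.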
In another specific embodiment, if the attack weapon is an explosive weapon, the attack source is the drop point of the explosive weapon, and the attack parameter is an explosion radius. Step S2 may specifically include the following steps B1 to B2.
Step B1, determining the explosion wave range of the explosive weapon according to the drop point and the explosion radius of the explosive weapon.
The drop point of the explosive weapon is the position in the game scene where the explosive weapon lands after another game object throws it. The explosion radius is the maximum radius within which this explosive weapon can cause damage. Therefore, by drawing a circle with the drop point as the center and the explosion radius as the radius, the range within which the explosive weapon can cause explosion damage in the game scene, namely the explosion wave range, can be obtained.
Step B2, if the own game character enters the explosion wave coverage range, determining that the attack type borne by the own game character is a third attack type.
If the own game character enters the explosion wave coverage range, there is a high probability that the own game character will bear an explosion attack, so the attack type borne by the own game character can be determined as a third attack type characterizing a high-probability risk of bearing an explosion attack.
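Steps B1 and B2 amount to a point-in-circle test: a circle is drawn around the drop point with the explosion radius, and a character inside it bears the third attack type. A minimal Python sketch with illustrative names follows; positions are 2-D for simplicity.

```python
import math

def explosion_attack_type(drop_point, blast_radius, char_pos):
    """Step B1/B2 sketch: the blast range is a circle centred on the
    drop point with the explosion radius; a character inside it bears
    the third attack type (names are hypothetical)."""
    dist = math.hypot(char_pos[0] - drop_point[0],
                      char_pos[1] - drop_point[1])
    return "third" if dist <= blast_radius else None

print(explosion_attack_type((0, 0), 5.0, (3, 4)))  # → third
print(explosion_attack_type((0, 0), 5.0, (6, 0)))  # → None
```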
Optionally, after the attack type borne by the own game character is determined to be the third attack type, the target three-dimensional indication model generated in the three-dimensional model container in step 120 may be attached with the target map corresponding to the third attack type.
Further, an explosion type icon may be generated at a preset distance from the target three-dimensional indication model, wherein the explosion type icon is used for indicating the weapon type of the explosive weapon. Referring to fig. 1f for details, C in fig. 1f is an explosion type icon.
The weapon types are used to reflect the types of injuries caused by explosive weapons, and the types of injuries caused by different types of explosive weapons are different. For example, explosive weapons may include types of weapons such as bombs, flash bombs, smoke bombs, and poison bombs, each of which may have a respective explosive type icon.
In the above embodiment, the target three-dimensional indication model attached with the target map corresponding to the third attack type can indicate where the own game character is exposed to the risk of an explosion attack, and an explosion type icon can be displayed near the target three-dimensional indication model. This helps the game player intuitively learn what kind of explosive weapon the own game character is facing and control the own game character to evade quickly, thereby increasing the playability of the game.
Optionally, in another specific embodiment, the step 111 may specifically include the following step C1 to step C4:
step C1, receiving attack data information sent by the server, where the attack data information is information corresponding to the attack behavior launched by the attacker toward the own game character, and the attack data information includes an attack source, an attack weapon, and attack parameters.
Step C2, determining the prescreening attack type borne by the own game character according to the attack source, the attack weapon, the attack parameters, and the displacement information of the own game character.
Steps C1 to C2 correspond to steps S1 to S2, and are not repeated herein.
Step C3, sending the displacement information of the own game character and the prescreening attack type to the server, so that the server can verify the prescreening attack type.
And step C4, determining a final selection attack type according to the feedback information returned by the server, wherein the final selection attack type is the interaction type.
In the above embodiment, after the terminal device determines the attack type borne by the own game character using the attack data information and the displacement information of the own game character, that attack type may be defined as the prescreening attack type calculated by the terminal device. The prescreening attack type and the displacement information of the own game character are then sent together to the server, and the server verifies them. If the verification passes, the server returns feedback information indicating that the verification passed, and the terminal device determines the prescreening attack type as the final selection attack type; if the verification fails, the server returns the attack type identified by the server to the terminal device, and that attack type is the final selection attack type. Using the final selection attack type verified by the server as the attack type borne by the own game character avoids misjudgment by the terminal device caused by data transmission delay, thereby improving the fairness of the game.
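The client-server verification in steps C3 and C4 can be sketched as: the terminal keeps its prescreening attack type if the server confirms it, and otherwise adopts the server's own result. The reply field names below are hypothetical, invented for the illustration.

```python
def finalize_attack_type(prescreened_type, server_reply):
    """Steps C3/C4 sketch: the client sends its prescreening type plus
    displacement info to the server; the server either confirms it or
    returns its own corrected type (server-authoritative)."""
    if server_reply.get("verified"):
        return prescreened_type           # client result confirmed
    return server_reply["server_type"]    # server override on mismatch

print(finalize_attack_type("first", {"verified": True}))   # → first
print(finalize_attack_type("first",
                           {"verified": False,
                            "server_type": "second"}))      # → second
```

Keeping the server authoritative is what prevents a laggy or tampered client from displaying (or suppressing) indicators incorrectly.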
Optionally, in a specific embodiment, the three-dimensional model container includes N three-dimensional indicating models, where N is a positive integer; and the interactive behaviors launched by the other game objects have corresponding prompt duration. The method provided by the embodiment of the application may further include the following steps 130 to 150:
Step 130, if the number of interactive behaviors borne by the own game character at the same time is not more than N, jumping to step 110 for each interactive behavior borne by the own game character.
The three-dimensional model container can display at most N three-dimensional indication models at the same time, so when the number of interactive behaviors borne by the own game character at the same time is less than or equal to N, the interactive behaviors can all be displayed in the three-dimensional model container simultaneously. Optionally, N three-dimensional indication models can be created in advance and bound to the three-dimensional model container, so that they can be displayed quickly when positions need to be indicated.
For each interactive behavior, because the interaction type differs, the corresponding target map also differs. Therefore, even if up to N three-dimensional indication models are displayed at the same time, no confusion is caused, and the game player can intuitively learn more information in the game environment. N may be 2 or 4; it should be understood that the specific value of N should not be construed as limiting the application.
Step 140, if the number of the interactive behaviors borne by the own game character at the same time is greater than N, determining N interactive behaviors with the highest priority within the prompt duration according to the trigger time of the interactive behaviors and the priority of the interactive types.
The priority is used to reflect the importance of the interaction type. As described above, taking the interactive behavior as an attack behavior, it can generally be considered that the type characterizing that the own game character has suffered a shooting injury is the most important, the type characterizing a high-probability risk of bearing an explosion attack is next, and the type characterizing a risk of suffering a shooting injury is the lowest. Therefore, in order of priority from high to low, the first attack type is higher than the third attack type, and the third attack type is higher than the second attack type.
Optionally, for a plurality of interactive behaviors that the own game character suffers at the same time, the interactive behaviors may be sorted according to the priority of the interaction type, and the N interactive behaviors with the highest priority may be selected.
It should be appreciated that the above-mentioned highest priority N interactive behaviors are dynamically changed, subject to both the duration of the prompt and the priority of the newly received interactive behaviors. The specific process of dynamic change of the N interaction behaviors with the highest priority may be as follows:
under the condition that the N interactive behaviors are still within the prompt duration, when a new interactive behavior is received, comparing the priority of the new interactive behavior with the lowest priority of the priorities of the N interactive behaviors, and if the priority of the new interactive behavior is lower than the lowest priority, keeping the N interactive behaviors with the highest priority unchanged; and if the priority of the new interactive behavior is higher than the lowest priority, removing the interactive behavior corresponding to the lowest priority in the priorities of the N interactive behaviors, and adding the new interactive behavior into the N interactive behaviors with the highest priority.
After the new interactive behaviors are added into the N interactive behaviors with the highest priority, the N interactive behaviors can be ranked according to the priority order to obtain the interactive behavior with the lowest priority from the N interactive behaviors with the highest priority so as to perform comparison next time.
If there is no newly received interactive behavior, each of the N interactive behaviors with the highest priority is automatically removed by the terminal device when the respective prompt duration arrives.
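The dynamic top-N rule described above (a newly received behavior replaces the lowest-priority active one only if it outranks it) can be sketched as follows, using the attack-type ordering first > third > second from the earlier description. The data layout is illustrative; expiry on prompt-duration timeout is omitted for brevity.

```python
# Priority order from the description: first > third > second.
PRIORITY = {"first": 3, "third": 2, "second": 1}

def update_active(active, new_behavior, n=4):
    """Keep at most n active behaviors, each an (id, type) tuple.
    A new behavior replaces the lowest-priority active one only if
    its own priority is strictly higher; otherwise it is dropped."""
    if len(active) < n:
        return active + [new_behavior]
    lowest = min(active, key=lambda b: PRIORITY[b[1]])
    if PRIORITY[new_behavior[1]] > PRIORITY[lowest[1]]:
        active = [b for b in active if b is not lowest]
        active.append(new_behavior)
    return active

active = [("a", "second"), ("b", "second")]
active = update_active(active, ("c", "first"), n=2)
print(sorted(b[0] for b in active))  # → ['b', 'c']
```

A heap keyed on priority would make the replace-lowest step O(log n), but with N as small as 2 or 4 a linear scan is perfectly adequate.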
Step 150, jumping to step 110 for each of the N interactive behaviors.
In the above embodiment, the dynamically changing N interactive behaviors with the highest priority are obtained, and the process jumps to step 110 to generate N target three-dimensional indication models. In this way, when the game player receives a large amount of damage information, the high-priority damage information can be obtained intuitively and quickly and responded to, helping the game character controlled by the game player survive as long as possible in the game environment, so that the game player plays more smoothly and the number of active users in the game increases.
In the in-game display control method provided by the embodiment of the application, the interaction source of the interactive behavior can be intuitively indicated by having the first indication end of the target three-dimensional indication model point to the interaction source and the second indication end point to the center of the three-dimensional model container. Because the relative positional relationship between the three-dimensional model container and the own game character is fixed, no matter how the game player controls the own game character, the game player can stably and intuitively view the relative position, in the game scene, of the interactive behavior borne by the own game character.
In the application, when the interactive behavior is an attack behavior initiated by another game object, the game player can also quickly locate the source of the damage.
The method described in the above embodiments is further described in detail below.
In this embodiment, the method of the embodiment of the present application will be described in detail by taking an example that the interactive behavior is an attack behavior.
As shown in fig. 2, a specific flow of a display control method in a game is as follows:
201. The target attack source of the attack behavior launched by an enemy game character and borne by the own game character is determined.
202. The target attack type of the attack behavior launched by the enemy game character and borne by the own game character is determined.
Step 202 includes the following steps 2021 to 2022:
2021. Attack data information sent by a server is received, wherein the attack data information comprises an attack source, an attack weapon, and attack parameters.
2022. And determining the attack type borne by the own game role according to the attack data information and the displacement information of the own game role.
In a specific embodiment, if the attack weapon is a shooting weapon, the attack source is the position from which the shooting weapon fires a projectile, and the attack parameter is an attack angle. Step 2022 specifically includes the following steps:
Determine whether the projectile of the shooting weapon hits the own game character.
If the own game character is hit, determine that the attack type borne by the own game character is a first attack type.
If the own game character is not hit, determine whether the projectile of the shooting weapon passes through the capsule body wrapping the own game character, wherein the capsule body is a capsule-shaped structure wrapping the own game character.
If the projectile passes through the capsule body wrapping the own game character, determine that the attack type borne by the own game character is a second attack type.
The step of determining whether the projectile of the shooting weapon hits the own game character specifically includes: determining the shooting range of the shooting weapon according to the attack angle and the attack source of the shooting weapon; and determining whether the projectile of the shooting weapon hits the own game character according to the shooting range and the displacement information of the own game character.
The step of determining whether the projectile of the shooting weapon passes through the capsule body wrapping the own game character specifically includes: determining the shooting range of the shooting weapon according to the attack angle and the attack source of the shooting weapon; and determining whether the projectile of the shooting weapon passes through the capsule body wrapping the own game character according to the shooting range and the displacement information of the own game character.
In another specific embodiment, if the attack weapon is an explosive weapon, the attack source is a drop point of the explosive weapon, and the attack parameter is an explosion radius. Step 2022 specifically includes the following steps:
The explosion wave range of the explosive weapon is determined according to the drop point and the explosion radius of the explosive weapon; and if the own game character enters the explosion wave coverage range, the attack type borne by the own game character is determined to be a third attack type.
203. Determining a target map corresponding to the target attack type according to the correspondence between attack types and model maps.
204. Generating, in the three-dimensional model container, a target three-dimensional indication model to which the target map is applied, wherein a first indication end of the target three-dimensional indication model points to the target attack source, a second indication end points to the center of the three-dimensional model container, and the first indication end and the second indication end face in opposite directions.
205. If the attack type is the first attack type, generating, in the three-dimensional model container, a target three-dimensional indication model to which a first target map is applied, wherein the first indication end of the target three-dimensional indication model points to the target attack source, and the second indication end points to the center of the three-dimensional model container.
The first target map corresponds to the first attack type.
206. If the attack type is the second attack type, generating, in the three-dimensional model container, a target three-dimensional indication model to which a second target map is applied, wherein the first indication end points to the target attack source, and the second indication end points to the center of the three-dimensional model container.
The second target map corresponds to the second attack type.
207. If the attack type is the third attack type, generating, in the three-dimensional model container, a target three-dimensional indication model to which a third target map is applied, wherein the first indication end points to the target attack source, and the second indication end points to the center of the three-dimensional model container.
The third target map corresponds to the third attack type.
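The correspondence between attack types and maps, together with the orientation of the two indication ends, can be sketched as follows. The map names and numeric type identifiers are illustrative assumptions, not from the patent:

```python
import math

# Hypothetical correspondence between attack types and model maps (textures).
ATTACK_TYPE_TO_MAP = {
    1: "map_direct_hit",   # first attack type: projectile hit the character
    2: "map_near_miss",    # second attack type: projectile grazed the capsule
    3: "map_explosion",    # third attack type: blast wave
}

def indicator_orientation(attack_source, container_center):
    """Unit vector from the container center toward the attack source.
    The first indication end faces along this vector; the second end faces
    the opposite way, toward the container center."""
    v = [attack_source[i] - container_center[i] for i in range(3)]
    norm = math.sqrt(sum(c * c for c in v)) or 1e-9
    return tuple(c / norm for c in v)

def build_indicator(attack_type, attack_source, container_center):
    # Bundle the looked-up map with the computed facing direction.
    return {
        "map": ATTACK_TYPE_TO_MAP[attack_type],
        "forward": indicator_orientation(attack_source, container_center),
    }
```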
208. Generating an explosion type icon at a preset distance from the target three-dimensional indication model, wherein the explosion type icon is used to indicate the weapon type of the explosive weapon corresponding to the third attack type.
As can be seen from the above, the attack source of an attack behavior can be intuitively indicated by having the first indication end of the target three-dimensional indication model point to the attack source and the second indication end point to the center of the three-dimensional model container, with the map applied to the target three-dimensional indication model corresponding to the attack type. Meanwhile, because the relative positional relationship between the three-dimensional model container and the own game character is fixed, no matter how the game player operates the own game character, the player can stably and intuitively see both the attack type of the attack behavior borne by the own game character and its relative orientation in the game scene.
The embodiments of the present application thus enable a game player to quickly locate the source of damage.
To better implement the above method, an embodiment of the present application further provides an in-game display control apparatus. The apparatus may be integrated in an electronic device, and the electronic device may be a terminal, such as a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, or a personal computer.
In this embodiment, the method is described in detail by taking as an example an in-game display control apparatus integrated in a terminal.
For example, as shown in fig. 3, the in-game display control device may include:
an interaction source determining unit 301, configured to determine an interaction source of an interaction behavior launched by another game object and borne by the own game character;
a model generating unit 302, configured to generate a target three-dimensional indication model in the three-dimensional model container, where the target three-dimensional indication model includes a first indication end and a second indication end, the direction from the second indication end to the first indication end points to the interaction source of the interaction behavior launched by the other game object, and the direction from the first indication end to the second indication end points to the center of the three-dimensional model container.
In some embodiments, the apparatus further comprises:
an interaction type determining unit, configured to determine the interaction type of the interaction behavior launched by another game object and borne by the own game character;
a target map determining unit, configured to determine, according to the correspondence between interaction types and model maps, a target map corresponding to the interaction type of the interaction behavior launched by the other game object, where the target map is applied to the target three-dimensional indication model.
In some embodiments, the interaction behavior launched by the other game object is an attack behavior launched by an attacker, the interaction source is an attack source, and the interaction type is an attack type; the interaction type determining unit comprises:
an attack type subunit, configured to determine the attack type borne by the own game character according to the attack source, the attack weapon, the attack parameter, and the displacement information of the own game character, where the attack type borne by the own game character is the interaction type.
In some embodiments, the interaction type determining unit further includes:
the attack data receiving subunit is configured to receive attack data information sent by a server, where the attack data information is information corresponding to the attack behavior launched by the attacker toward the own game character, and the attack data information includes an attack source, an attack weapon, and attack parameters.
In some embodiments, the attack weapon is a projectile-type weapon, the attack source is the position from which the projectile-type weapon fires a projectile, and the attack parameter is an attack angle;
the attack type subunit is specifically configured to judge whether the projectile of the projectile-type weapon hits the own game character;
and if the own game character is hit, determine that the attack type borne by the own game character is a first attack type.
In some embodiments, the attack type subunit is further configured to, if the own game character is not hit, judge whether the projectile of the projectile-type weapon passes through a capsule body wrapping the own game character, where the capsule body is a capsule-shaped structure wrapping the own game character;
and if the projectile passes through the capsule body wrapping the own game character, determine that the attack type borne by the own game character is a second attack type.
In some embodiments, the attack weapon is an explosive weapon, the attack source is a drop point of the explosive weapon, and the attack parameter is an explosion radius;
the attack type subunit is specifically configured to determine the blast wave range of the explosive weapon according to the drop point and the explosion radius of the explosive weapon;
and if the own game character enters the blast wave range, determine that the attack type borne by the own game character is a third attack type.
In some embodiments, the apparatus further comprises:
an icon generating unit, configured to generate an explosion type icon at a preset distance from the target three-dimensional indication model, where the explosion type icon is used to indicate a weapon type of the explosive weapon.
In some embodiments, the interaction behavior launched by the other game object is an attack behavior launched by an attacker, the interaction source is an attack source, and the interaction type is an attack type; the interaction type determining unit comprises:
a data receiving subunit, configured to receive attack data information sent by a server, where the attack data information corresponds to the attack behavior launched by the attacker toward the own game character, and the attack data information includes the attack source, an attack weapon, and an attack parameter;
a preliminary screening subunit, configured to determine a preliminary attack type borne by the own game character according to the attack source, the attack weapon, the attack parameter, and the displacement information of the own game character;
an attack verification subunit, configured to send the displacement information of the own game character and the preliminary attack type to the server, so that the server verifies the preliminary attack type;
and a final attack type subunit, configured to determine a final attack type according to feedback information returned by the server, where the final attack type is the interaction type.
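The prescreen-then-verify exchange handled by these subunits can be sketched as follows. The `StubServer` class and its `verify` method are stand-ins for the real server interface, which the patent does not specify; all names are hypothetical:

```python
class StubServer:
    """Stand-in for the game server; illustrative only."""
    def verify(self, displacement, prescreened_type):
        # The real server would recompute the attack type from its own
        # authoritative state; here we simply confirm the prescreened type.
        return {"confirmed": True, "attack_type": prescreened_type}

def resolve_attack_type(server, attack_data, displacement, prescreen_fn):
    """Client-side flow: prescreen locally, then trust the server's verdict."""
    prescreened = prescreen_fn(attack_data, displacement)
    feedback = server.verify(displacement, prescreened)
    # The final (interaction) type is whatever the server confirms.
    return feedback["attack_type"] if feedback["confirmed"] else None
```

The design keeps the client responsive (it can prescreen immediately) while leaving the server as the authority, which matters for anti-cheat in networked games.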
In some embodiments, the three-dimensional model container comprises N three-dimensional indication models, N being a positive integer; each interaction behavior launched by another game object has a corresponding prompt duration;
the device further comprises:
a first quantity determining unit, configured to, when the number of interaction behaviors borne by the own game character at the same time is not greater than N, perform, for each interaction behavior borne by the own game character, the step of determining the interaction source of the interaction behavior launched by another game object and borne by the own game character;
a second quantity determining unit, configured to, when the number of interaction behaviors borne by the own game character at the same time is greater than N, determine, according to the trigger time of each interaction behavior and the priority of its interaction type, the N interaction behaviors that are within their prompt duration and have the highest priority, and perform, for each of the N interaction behaviors, the step of determining the interaction source of the interaction behavior launched by another game object and borne by the own game character.
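The top-N selection performed by the second quantity determining unit can be sketched as follows. The field names and the priority convention (lower number = higher priority) are assumptions for illustration:

```python
import time

def select_indicated_behaviors(behaviors, n, now=None):
    """Illustrative sketch: keep at most n concurrent interaction behaviors.
    Each behavior is a dict with 'trigger_time', 'prompt_duration', and
    'priority' fields (names hypothetical). Behaviors whose prompt duration
    has elapsed are dropped; of the rest, the n highest-priority behaviors
    (ties broken by most recent trigger) are kept."""
    now = time.monotonic() if now is None else now
    # Discard behaviors whose prompt duration has already expired.
    active = [b for b in behaviors if now - b["trigger_time"] <= b["prompt_duration"]]
    # Sort by priority, then by most recent trigger time.
    active.sort(key=lambda b: (b["priority"], -b["trigger_time"]))
    return active[:n]
```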
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
As can be seen from the above, the interaction source of an interaction behavior can be intuitively indicated by having the first indication end of the target three-dimensional indication model point to the interaction source and the second indication end point to the center of the three-dimensional model container. Because the relative positional relationship between the three-dimensional model container and the own game character is fixed, no matter how the game player operates the own game character, the player can stably and intuitively see the relative orientation, in the game scene, of the interaction behavior borne by the own game character. When the interaction behavior is an attack behavior launched by another game object, the game player can quickly locate the source of damage.
The embodiment of the application also provides the electronic equipment which can be equipment such as a terminal and a server. The terminal can be a mobile phone, a tablet computer, an intelligent Bluetooth device, a notebook computer, a personal computer and the like; the server may be a single server, a server cluster composed of a plurality of servers, or the like.
In some embodiments, the in-game display control apparatus may also be integrated in a plurality of electronic devices, for example, the in-game display control apparatus may be integrated in a plurality of servers, and the in-game display control method of the present application is implemented by the plurality of servers.
In this embodiment, the electronic device is described in detail by way of example. As shown in fig. 4, a schematic structural diagram of the electronic device according to an embodiment of the present application is provided. Specifically:
the electronic device may include components such as a processor 401 of one or more processing cores, memory 402 of one or more computer-readable storage media, a power supply 403, an input module 404, and a communication module 405. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 4 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the processor 401 is a control center of the electronic device, connects various parts of the whole electronic device by various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device. In some embodiments, processor 401 may include one or more processing cores; in some embodiments, processor 401 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and data processing by running the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the electronic device, and the like. Further, the memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 with access to the memory 402.
The electronic device also includes a power supply 403 for supplying power to the various components. In some embodiments, the power supply 403 may be logically coupled to the processor 401 via a power management system, so that the power management system can manage charging, discharging, and power consumption. The power supply 403 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
The electronic device may also include an input module 404, the input module 404 operable to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
The electronic device may also include a communication module 405, and in some embodiments the communication module 405 may include a wireless module, through which the electronic device may wirelessly transmit over short distances, thereby providing wireless broadband internet access to the user. For example, the communication module 405 may be used to assist a user in sending and receiving e-mails, browsing web pages, accessing streaming media, and the like.
Although not shown, the electronic device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 401 in the electronic device loads the executable file corresponding to the process of one or more application programs into the memory 402 according to the following instructions, and the processor 401 runs the application program stored in the memory 402, thereby implementing various functions as follows:
determining an interaction source of an interaction behavior launched by another game object and borne by the own game character; and generating a target three-dimensional indication model in the three-dimensional model container, where the target three-dimensional indication model includes a first indication end and a second indication end, the direction from the second indication end to the first indication end points to the interaction source of the interaction behavior launched by the other game object, and the direction from the first indication end to the second indication end points to the center of the three-dimensional model container.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium storing a plurality of instructions, where the instructions can be loaded by a processor to perform the steps in any in-game display control method provided by the present application. For example, the instructions may perform the following steps:
determining an interaction source of an interaction behavior launched by another game object and borne by the own game character; and generating a target three-dimensional indication model in the three-dimensional model container, where the target three-dimensional indication model includes a first indication end and a second indication end, the direction from the second indication end to the first indication end points to the interaction source of the interaction behavior launched by the other game object, and the direction from the first indication end to the second indication end points to the center of the three-dimensional model container.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations provided in the embodiments described above.
Since the instructions stored in the storage medium can perform the steps in any in-game display control method provided in the embodiments of the present application, the beneficial effects that can be achieved by any in-game display control method provided in the embodiments of the present application can likewise be achieved, which are detailed in the foregoing embodiments and are not described herein again.
The in-game display control method and apparatus, electronic device, and computer-readable storage medium provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (14)

1. A display control method in a game, characterized in that a terminal device provides a graphical user interface, content displayed in the graphical user interface at least comprises a game scene and a three-dimensional model container, the game scene comprises an own game character controlled through the terminal device, a relative positional relationship between the three-dimensional model container and the own game character is fixed, and the three-dimensional model container comprises a three-dimensional indication model;
the method comprises the following steps:
determining an interaction source of an interaction behavior launched by another game object and borne by the own game character;
and generating a target three-dimensional indication model in the three-dimensional model container, wherein the target three-dimensional indication model comprises a first indication end and a second indication end, a direction from the second indication end to the first indication end points to the interaction source of the interaction behavior launched by the other game object, and a direction from the first indication end to the second indication end points to the center of the three-dimensional model container.
2. The method of claim 1, wherein before the generating the target three-dimensional indication model in the three-dimensional model container, the method further comprises:
determining an interaction type of the interaction behavior launched by the other game object and borne by the own game character;
determining, according to a correspondence between interaction types and model maps, a target map corresponding to the interaction type of the interaction behavior launched by the other game object; wherein the target map is applied to the target three-dimensional indication model.
3. The method of claim 2, wherein the interaction behavior launched by the other game object is an attack behavior launched by an attacker, the interaction source is an attack source, and the interaction type is an attack type;
the determining the interaction type of the interaction behavior launched by the other game object and borne by the own game character comprises:
determining the attack type borne by the own game character according to the attack source, an attack weapon, an attack parameter, and displacement information of the own game character, wherein the attack type borne by the own game character is the interaction type.
4. The method of claim 3, wherein the attack weapon is a projectile-type weapon, the attack source is a position from which the projectile-type weapon fires a projectile, and the attack parameter is an attack angle;
the determining the attack type borne by the own game character according to the attack source, the attack weapon, the attack parameter, and the displacement information of the own game character comprises:
judging whether the projectile of the projectile-type weapon hits the own game character;
and if the own game character is hit, determining that the attack type borne by the own game character is a first attack type.
5. The method of claim 4, wherein the judging whether the projectile of the projectile-type weapon hits the own game character comprises:
determining a trajectory range of the projectile-type weapon according to the attack angle and the attack source of the projectile-type weapon;
and judging whether the projectile of the projectile-type weapon hits the own game character according to the trajectory range and the displacement information of the own game character.
6. The method of claim 4, wherein after the judging whether the projectile of the projectile-type weapon hits the own game character, the method further comprises:
if the own game character is not hit, judging whether the projectile of the projectile-type weapon passes through a capsule body wrapping the own game character, wherein the capsule body is a capsule-shaped structure wrapping the own game character;
and if the projectile passes through the capsule body wrapping the own game character, determining that the attack type borne by the own game character is a second attack type.
7. The method of claim 6, wherein the judging whether the projectile of the projectile-type weapon passes through the capsule body wrapping the own game character comprises:
determining a trajectory range of the projectile-type weapon according to the attack angle and the attack source of the projectile-type weapon;
and judging whether the projectile of the projectile-type weapon passes through the capsule body wrapping the own game character according to the trajectory range and the displacement information of the own game character.
8. The method of claim 3, wherein the attack weapon is an explosive weapon, the attack source is a drop point of the explosive weapon, and the attack parameter is an explosion radius;
the determining the attack type borne by the own game character according to the attack source, the attack weapon, the attack parameter, and the displacement information of the own game character comprises:
determining a blast wave range of the explosive weapon according to the drop point and the explosion radius of the explosive weapon;
and if the own game character enters the blast wave range, determining that the attack type borne by the own game character is a third attack type.
9. The method of claim 8, wherein after the generating the target three-dimensional indication model in the three-dimensional model container, the method further comprises:
generating an explosion type icon at a preset distance from the target three-dimensional indication model, wherein the explosion type icon is used to indicate a weapon type of the explosive weapon.
10. The method of claim 2, wherein the interaction behavior launched by the other game object is an attack behavior launched by an attacker, the interaction source is an attack source, and the interaction type is an attack type;
the determining the interaction type of the interaction behavior launched by the other game object and borne by the own game character comprises:
receiving attack data information sent by a server, wherein the attack data information corresponds to the attack behavior launched by the attacker toward the own game character, and the attack data information comprises the attack source, an attack weapon, and an attack parameter;
determining a preliminary attack type borne by the own game character according to the attack source, the attack weapon, the attack parameter, and displacement information of the own game character;
sending the displacement information of the own game character and the preliminary attack type to the server, so that the server verifies the preliminary attack type;
and determining a final attack type according to feedback information returned by the server, wherein the final attack type is the interaction type.
11. The method of claim 1, wherein the three-dimensional model container comprises N three-dimensional indication models, N being a positive integer; each interaction behavior launched by another game object has a corresponding prompt duration;
the method further comprises:
if the number of interaction behaviors borne by the own game character at the same time is not greater than N, performing, for each interaction behavior borne by the own game character, the step of determining the interaction source of the interaction behavior launched by another game object and borne by the own game character;
if the number of interaction behaviors borne by the own game character at the same time is greater than N, determining, according to the trigger time of each interaction behavior and the priority of its interaction type, the N interaction behaviors that are within their prompt duration and have the highest priority;
and performing, for each of the N interaction behaviors, the step of determining the interaction source of the interaction behavior launched by another game object and borne by the own game character.
12. A display control device in a game, characterized in that a terminal device provides a graphical user interface, content displayed in the graphical user interface at least comprises a game scene and a three-dimensional model container, the game scene comprises an own game character controlled through the terminal device, a relative positional relationship between the three-dimensional model container and the own game character is fixed, and the three-dimensional model container comprises a three-dimensional indication model;
the device comprises:
an interaction source determining unit, configured to determine an interaction source of an interaction behavior launched by another game object and borne by the own game character;
and a model generating unit, configured to generate a target three-dimensional indication model in the three-dimensional model container, wherein the target three-dimensional indication model comprises a first indication end and a second indication end, a direction from the second indication end to the first indication end points to the interaction source of the interaction behavior launched by the other game object, and a direction from the first indication end to the second indication end points to the center of the three-dimensional model container.
13. An electronic device comprising a processor and a memory, the memory storing a plurality of instructions; the processor loads instructions from the memory to perform the steps of the in-game display control method according to any one of claims 1 to 11.
14. A computer-readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the method for controlling display in a game according to any one of claims 1 to 11.
CN202111372253.8A 2021-11-18 2021-11-18 Display control method and device in game, electronic equipment and storage medium Pending CN114146412A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111372253.8A CN114146412A (en) 2021-11-18 2021-11-18 Display control method and device in game, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114146412A 2022-03-08

Family

ID=80456984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111372253.8A Pending CN114146412A (en) 2021-11-18 2021-11-18 Display control method and device in game, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114146412A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024082753A1 (en) * 2022-10-21 2024-04-25 网易(杭州)网络有限公司 Game indicator generation method and apparatus, computer device, and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
JP2008245984A (en) * 2007-03-30 2008-10-16 Konami Digital Entertainment:Kk Game sound output device, sound image locating control method and program
US20100267451A1 (en) * 2009-04-20 2010-10-21 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
JP2012166067A * 2012-06-11 2012-09-06 Capcom Co Ltd Game system, game control method, program, and computer readable recording medium with program recorded therein
CN110548288A (en) * 2019-09-05 2019-12-10 腾讯科技(深圳)有限公司 Virtual object hit prompting method and device, terminal and storage medium

Similar Documents

Publication Publication Date Title
CN110548288B (en) Virtual object hit prompting method and device, terminal and storage medium
WO2022017063A1 (en) Method and apparatus for controlling virtual object to recover attribute value, and terminal and storage medium
CN110732135B (en) Virtual scene display method and device, electronic equipment and storage medium
US20230013014A1 (en) Method and apparatus for using virtual throwing prop, terminal, and storage medium
CN110721468A (en) Interactive property control method, device, terminal and storage medium
WO2021227733A1 (en) Method and apparatus for displaying virtual prop, and device and storage medium
CN112121414B (en) Tracking method and device in virtual scene, electronic equipment and storage medium
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
CN111921198B (en) Control method, device and equipment of virtual prop and computer readable storage medium
US20230072503A1 (en) Display method and apparatus for virtual vehicle, device, and storage medium
WO2022156491A1 (en) Virtual object control method and apparatus, and device, storage medium and program product
US20230052088A1 (en) Masking a function of a virtual object using a trap in a virtual environment
US20230033530A1 (en) Method and apparatus for acquiring position in virtual scene, device, medium and program product
CN112057863A (en) Control method, device and equipment of virtual prop and computer readable storage medium
US20220379209A1 (en) Virtual resource display method and related apparatus
CN113457151A (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN112057864A (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN114146412A (en) Display control method and device in game, electronic equipment and storage medium
CN114307150B (en) Method, device, equipment, medium and program product for interaction between virtual objects
CN115430153A (en) Collision detection method, device, apparatus, medium, and program in virtual environment
CN112156472B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN113769392A (en) State processing method and device for virtual scene, electronic equipment and storage medium
CN113663329B (en) Shooting control method and device for virtual character, electronic equipment and storage medium
CN112843682B (en) Data synchronization method, device, equipment and storage medium
CN112891930B (en) Information display method, device, equipment and storage medium in virtual scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination