CN113398565A - Game control method, device, terminal and storage medium - Google Patents

Game control method, device, terminal and storage medium

Info

Publication number
CN113398565A
Authority
CN
China
Prior art keywords: game, target, event, control, game scene
Prior art date
Legal status
Granted
Application number
CN202110801764.0A
Other languages
Chinese (zh)
Other versions
CN113398565B (en)
Inventor
胡佳胜
胡志鹏
程龙
刘勇成
袁思思
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110801764.0A
Publication of CN113398565A
Application granted
Publication of CN113398565B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F 13/23: Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/525: Controlling the output signals based on the game progress, involving aspects of the displayed game scene: changing parameters of virtual cameras
    • A63F 2300/1075: Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F 2300/308: Details of the user interface (output arrangements for receiving control signals generated by the game device)

Abstract

Embodiments of this application disclose a game control method, apparatus, terminal, and storage medium. A touch terminal provides a graphical user interface that includes a first game scene picture in which a game scene is observed from a first game perspective; the first game scene picture at least partially contains a virtual character, which is the virtual object controlled by the player through the touch terminal, and the graphical user interface displays a view switching control. In response to a touch operation on the view switching control, it is determined whether a target game event currently exists in the game scene; if a target game event exists, a target position is determined according to the target game event; a second game perspective is then determined according to the target position, and the first game scene picture is switched to a second game scene picture in which the game scene is observed from the second game perspective. In the embodiments of this application, the user can bring up the second game scene picture with a simple and efficient operation, so the scheme can improve interaction efficiency.

Description

Game control method, device, terminal and storage medium
Technical Field
The application relates to the field of computers, in particular to a method, a device, a terminal and a storage medium for game control.
Background
A user interface (UI) is the medium for human-computer interaction and information exchange between a system and its user; the UI presents system information in a form people can understand, so that the user can operate the computer conveniently and efficiently and two-way human-computer interaction is achieved. The UI often includes a minimap that indicates the positions of virtual objects in a virtual scene (for example, a game scene or a simulation scene), so as to guide the user to click the minimap and switch to viewing those virtual objects.
However, during a group battle, the identifiers of the virtual objects controlled by the two opposing sides are packed close together on the minimap, and the user may fail to click exactly the identifier intended, so the user's game perspective cannot be quickly switched to the part of the game scene where the group battle is taking place. In addition, this interaction is not user-friendly: while the player is performing other operations, reaching over to click the minimap requires a large operating span and a high operating difficulty, and it does not match the user's operating habits. The current approach is therefore inefficient.
Disclosure of Invention
The embodiments of the application provide a game control method, apparatus, terminal, and storage medium, which can improve interaction efficiency.
An embodiment of the present application provides a game control method, in which a touch terminal provides a graphical user interface, the graphical user interface includes a first game scene picture for observing a game scene from a first game perspective, the first game scene picture at least partially includes a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal. The method includes:
displaying a view switching control on the graphical user interface;
in response to a touch operation on the view switching control, determining whether a target game event currently exists in the game scene;
if the target game event exists, determining a target position according to the target game event; and
determining a second game perspective according to the target position, and switching the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
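For readers who want to trace the logic, the four steps above can be read as a single touch-event handler. The Python sketch below is a purely illustrative, minimal rendering under assumed names (the dataclasses, Scene.find_target_game_event, Camera.look_at); it is not part of the claimed implementation, and the target position is computed as a simple centroid only as an example.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    Position = Tuple[float, float]

    @dataclass
    class VirtualObject:
        name: str
        position: Position

    @dataclass
    class TargetGameEvent:
        participants: List[VirtualObject]

    @dataclass
    class Camera:
        focus: Position = (0.0, 0.0)

        def look_at(self, position: Position) -> None:
            # Moving the camera focus yields the second game scene picture.
            self.focus = position

    @dataclass
    class Scene:
        events: List[TargetGameEvent] = field(default_factory=list)

        def find_target_game_event(self) -> Optional[TargetGameEvent]:
            # Placeholder for the "does a target game event currently exist" check.
            return self.events[0] if self.events else None

    def on_view_switch_touched(scene: Scene, camera: Camera) -> bool:
        """Handler for a touch operation on the view switching control."""
        event = scene.find_target_game_event()
        if event is None:
            return False                                 # keep the first game scene picture
        xs = [o.position[0] for o in event.participants]
        ys = [o.position[1] for o in event.participants]
        target = (sum(xs) / len(xs), sum(ys) / len(ys))  # target position from the event
        camera.look_at(target)                           # second perspective -> second picture
        return True

    scene = Scene(events=[TargetGameEvent([VirtualObject("B", (120.0, 40.0)),
                                           VirtualObject("C", (140.0, 60.0))])])
    camera = Camera()
    if on_view_switch_touched(scene, camera):
        print(camera.focus)   # (130.0, 50.0): the camera now frames the ongoing fight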
An embodiment of the present application further provides a game control apparatus, in which a touch terminal provides a graphical user interface, the graphical user interface includes a first game scene picture for observing a game scene from a first game perspective, the first game scene picture at least partially includes a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal. The apparatus includes:
the control unit is used for displaying the visual field switching control on the graphical user interface;
the target unit is used for responding to the touch operation aiming at the visual field switching control and determining whether a target game event exists in a game scene at present; if the target game event exists, determining a target position according to the target game event;
and the second visual angle unit is used for determining a second game visual angle according to the target position and switching the first game scene picture into a second game scene picture for observing the game scene with the second game visual angle.
In some embodiments, the apparatus is further configured to:
and responding to the end of the touch operation, and switching the second game scene picture to the first game scene picture.
In some embodiments, the target game event includes at least one of: a group battle event, a tower pushing event, a jungling event, and a dragon fighting event.
In some embodiments, the view switching control contains an identifier of a virtual object corresponding to the target game event.
In some embodiments, the view switching control includes an identifier of a representative object of the target game event, where the representative object is one of a plurality of virtual objects corresponding to the target game event.
In some embodiments, a control unit to:
and responding to the view switching starting instruction, and displaying a view switching control on the graphical user interface.
In some embodiments, the graphical user interface further comprises a minimap, the display content of the minimap containing a first thumbnail of the first game scene screen; the target unit is further configured to:
and switching the first thumbnail of the small map into a second thumbnail corresponding to the second game scene picture.
In some embodiments, the graphical user interface further comprises a minimap, the method further comprising:
in response to a zoom-in instruction for the minimap, the minimap is zoomed in, and a view switching-on instruction is generated.
In some embodiments, a control unit to:
changing the skill control displayed on the graphical user interface to a view switching control.
In some embodiments, the step "when the skill control displayed on the graphical user interface is changed to the view switching control" further comprises:
and amplifying the skill control.
In some embodiments, the apparatus is further configured to:
and when the virtual object is detected to be in a non-combat state currently, generating a view switching opening instruction.
In some embodiments, the non-combat state includes at least one of: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists in the preset distance range of the virtual object.
In some embodiments, the target game event is a game event in which at least two virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal participate in the same battle.
In some embodiments, the apparatus is further configured to:
in the process of switching the first game scene picture to a second game scene picture in which the game scene is viewed at a second game perspective, the virtual object is controlled to move in the game scene in response to a movement operation for the virtual object.
In some embodiments, determining whether a target game event currently exists in the game scene comprises:
when the distance between teammates is smaller than a preset value and the teammates are in a combat state, determining that a target game event exists and corresponds to the teammates, where the teammates are virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal; and
when the distance between teammates in the game scene is not smaller than the preset value, or the teammates are in a non-combat state, determining that no target game event currently exists in the game scene.
In some embodiments, the target unit is configured to:
determine the position of each virtual object corresponding to the target game event; and
determine the target position according to the position of each virtual object corresponding to the target game event.
An embodiment of the present application further provides a terminal, which includes a memory and a processor; the memory stores a plurality of instructions, and the processor loads the instructions from the memory to execute the steps in any of the game control methods provided in the embodiments of the present application.
An embodiment of the present application further provides a computer-readable storage medium, which stores a plurality of instructions suitable for being loaded by a processor to execute the steps in any of the game control methods provided in the embodiments of the present application.
The embodiments of this application can display a view switching control on the graphical user interface; determine, in response to a touch operation on the view switching control, whether a target game event currently exists in the game scene; if the target game event exists, determine a target position according to the target game event; and determine a second game perspective according to the target position and switch the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
Because the view switching control is arranged in the graphical user interface, the user no longer needs to click the identifiers of the virtual objects participating in the target game event on the minimap; the target position to be observed can be selected simply by triggering the view switching control in the graphical user interface. The operating difficulty is low, the interaction matches the user's operating habits, and the user can accurately trigger the intended view switching control, making the operation more accurate and simpler. The efficiency with which the user switches the displayed picture is therefore improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1a is a schematic diagram of interaction in the prior art;
FIG. 1b is a schematic diagram of skill application controls in a game control method provided by an embodiment of the present application;
FIG. 1c is a schematic diagram of view switching controls in a game control method provided by an embodiment of the present application;
FIG. 1d is a schematic flowchart of a game control method provided by an embodiment of the present application;
FIG. 1e is another schematic diagram of view switching controls in a game control method provided by an embodiment of the present application;
FIG. 1f is a schematic diagram of control switching in a game control method provided by an embodiment of the present application;
FIG. 2a is a schematic UI diagram of the normal mode when the game control method provided by an embodiment of the present application is applied to a game scene;
FIG. 2b is a schematic UI diagram of the observation mode when the game control method provided by an embodiment of the present application is applied to a game scene;
FIG. 3 is a schematic structural diagram of a game control device provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a game control method, a game control device, a terminal and a storage medium.
The game control device may be specifically integrated in an electronic device, and the electronic device may be a terminal, a server, or the like. The terminal can be a mobile phone, a tablet Computer, an intelligent bluetooth device, a notebook Computer, or a Personal Computer (PC), and the like; the server may be a single server or a server cluster composed of a plurality of servers.
In some embodiments, the game control apparatus may be integrated into a plurality of electronic devices, for example, the game control apparatus may be integrated into a plurality of servers, and the plurality of servers implement the game control method of the present application.
In some embodiments, the server may also be implemented in the form of a terminal.
Referring to fig. 1a, in a current MOBA (Multiplayer Online Battle Arena) game, identifiers of virtual objects may be displayed on a minimap 00, such as the identifier of the player character 01 currently operated by the player, the identifier of character 02 operated by teammate A, the identifier of character 03 operated by teammate B, the identifier of character 04 operated by enemy A, the identifier of character 05 operated by enemy B, the identifier of a game monster 06, the identifier of a defense tower 07, and so on. When teammate A, teammate B, enemy A, and/or enemy B participate in a target game event, a player who wants to observe the scene picture of that event may do so by clicking the identifier of teammate character 02 or teammate character 03 on the minimap 00. The player may likewise switch to viewing the game monster 06 by clicking its identifier, to viewing the defense tower 07 by clicking its identifier, and so on.
Since the above manner of observing a group battle is cumbersome for the player, an embodiment of the present application provides a game control method. Referring to fig. 1b, in one embodiment of this scheme, the electronic device may be a mobile terminal running an electronic game, and a graphical user interface is displayed on the screen of the mobile terminal. The graphical user interface includes a first game scene picture in which the game scene is observed from a first game perspective; the first game scene picture at least partially contains the virtual character 01 and may further include a skill application area 10, which may contain a skill application control 11, a skill application control 12, a skill application control 13, and a skill application control 14. The first game scene picture is obtained by adjusting the perspective to the associated position of the virtual character 01 so as to observe the interaction between the virtual character 01 and the virtual scene, and the virtual character 01 is the virtual object manipulated by the real user.
Then, referring to fig. 1c, if two group battles are currently breaking out in the game, the view switching control 21 corresponding to group battle 1 and the view switching control 22 corresponding to group battle 2 may be displayed in the skill application area 10 of the user operation interface in response to a view switching on instruction. In response to the user's trigger operation on the view switching control 21, a target position corresponding to group battle 1 can be determined according to the positions of the virtual objects 02 and 03 participating in group battle 1, a second game scene picture is displayed, and the virtual objects 02 and 03 in group battle 1 are observed by adjusting the perspective to the target position.
The following are detailed below. The numbers in the following examples are not intended to limit the order of preference of the examples.
In this embodiment, a method for game control is provided, and as shown in fig. 1d, a specific flow of the method for game control may be as follows:
101. and displaying a visual field switching control on the graphical user interface.
The graphical User Interface comprises a User Interface (UI) and a picture of a game scene, wherein the UI is a medium for performing human-computer interaction and information exchange between a system and a User, and the UI can display system information in a human-acceptable form, so that the User can conveniently and efficiently operate a computer to achieve bidirectional human-computer interaction.
The user operation interface can be composed of visual elements such as controls, characters, graphs, images, icons and input boxes. For example, in some embodiments, a minimap, skill application controls, a virtual rocker, a scoring panel, and the like may be included in the user-operated interface.
All the controls described below can be represented in the form of buttons, input boxes, graphics, text, and the like.
The scene of the game can be formed by virtual objects such as virtual characters, buildings, terrains, animals and plants and the like.
The virtual objects in the game scene may include virtual characters, virtual buildings, virtual props, and the like. For example, the virtual characters may include teammate characters operated by teammates, enemy characters operated by enemy users, summoned objects, virtual monsters, virtual NPCs (non-player characters), and the like; the virtual buildings may include virtual defense towers, a virtual base spring, virtual revival points, and the like; the virtual props may include virtual pets, virtual mounts, dropped virtual coin purses, and so on.
The virtual object can be set according to actual requirements. For example, in some embodiments, the virtual object may be a teammate character of the virtual character, that is, a character in the same game lineup as the virtual character; the teammate character may be a virtual object controlled by a real person or a virtual object controlled by a computer AI (Artificial Intelligence), and so on.
The virtual character is played and controlled by the real user; for example, the user can control the virtual character to interact with the virtual scene, such as moving, hunting, or exploring in the virtual scene.
Wherein, the visual angle refers to the visual angle of the lens of the virtual camera, and the virtual camera can capture the game world for the player and present the game world on the screen.
In some embodiments, the perspective may be located at an associated position of the virtual character, where the associated position can be set according to actual requirements. The associated position is a position that moves along with the virtual character, and may be, for example, the eyes of the virtual character, a position above its head, a position above and behind its body, and the like.
In some embodiments, a view toggle control may be displayed at the graphical user interface in response to a view toggle on instruction.
In some embodiments, the graphical user interface further includes a minimap; thus, in response to a zoom-in instruction for the minimap, the minimap may be enlarged and a view switching on instruction may be generated.
In some embodiments, a view switch on command may be generated when the virtual object is detected to be currently in a non-combat state.
In some embodiments, the non-combat state includes at least one of: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists in the preset distance range of the virtual object.
There are various ways to trigger the view switching on instruction. For example, in some embodiments, the instruction is triggered when it is detected that the user taps the back of the mobile terminal several times in succession. In other embodiments, the UI may include a switching control, and the view switching on instruction is triggered when the user is detected to trigger that control by clicking, long-pressing, tapping, double-clicking, multi-clicking, or the like. In still other embodiments, the user may trigger the instruction by shaking the mobile terminal, may turn on view switching in the game settings interface to generate the instruction, or a dedicated view switching on control may be provided in the graphical user interface so that the instruction is generated upon detecting the user's trigger operation on that control.
In other embodiments, the view switching on command may be triggered automatically by the terminal, for example, when the terminal detects that a target game event (such as a group battle) occurs, the view switching on command is generated automatically.
The target game event is a game event in which at least two virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal participate in the same battle.
Thus, in some embodiments, step 101 may comprise the steps of:
displaying a switching control in the skill application area;
and responding to the triggering operation of the user for the switching control, and displaying a visual field switching control on the graphical user interface, wherein the visual field switching control is correlated with the target game event.
The target game event is a type of game event. A game event refers to a special event occurring in the game scene, such as a group battle event, a dragon respawn event, a super minion spawn event, a tower pushing event, a jungling event, a dragon fighting event, and the like.
In some embodiments, the target game event and the view switching control have a corresponding association relationship; in some embodiments, a target game event is associated with a plurality of virtual objects.
The view switching controls displayed on the graphical user interface may be associated with group battle events in a one-to-one correspondence; for example, referring to fig. 1c, group battle 1 may correspond to view switching control 21 and group battle 2 may correspond to view switching control 22.
The target game event is associated with a plurality of virtual objects, for example, referring to fig. 1c, the group battle 1 may be associated with the virtual objects 02, 03 at the same time, and the group battle 2 may be associated with the virtual objects 04, 05 at the same time.
In some embodiments, the association of the view switching control with another target game event may be updated in response to the other target game event.
Another target game event refers to a game event other than a target game event.
For example, when a dragon refresh event occurring in a game scene is monitored, the view switching control corresponding to the group battle 1 may be updated to the view switching control corresponding to the dragon refresh event.
In some embodiments, a target game event is associated with a plurality of virtual objects, and the view switching control can include the identifiers of those virtual objects; for example, the names, avatars, and so on of the virtual objects can be included in the view switching control.
For example, in some embodiments, teammate characters B, C, and D participate in a group battle, and the distance between any two of them in the game scene is smaller than a preset value. In this case, all three characters can be observed regardless of whether the perspective is placed at the associated position of teammate B, teammate C, or teammate D, so the three teammate characters can correspond to the same view switching control, and that control contains the identifiers of all three.
For example, referring to fig. 1c, the identification of virtual object 02 and the identification of virtual object 03 are included in view switching control 21 in combination.
In some embodiments, when it is detected that the distance between two virtual objects in the game scene becomes greater than or equal to the preset value, the view switching control that contains the identifiers of all the virtual objects corresponding to the target game event may be replaced with one view switching control per virtual object, each containing that virtual object's identifier.
For example, referring to fig. 1e, in some embodiments, when the distance between two virtual objects in the same lineup becomes too large, that is, when two virtual objects that were participating in the same group battle have left the battle and the player can no longer observe one of them from the perspective of the other, the view switching control containing the combined virtual object identifier is switched to one view switching control for each of the two virtual objects.
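A minimal sketch of this merge-and-split behaviour, assuming a flat mapping from participant identifiers to 2D positions and a single split distance; the function and field names are illustrative only and not taken from the patent.

    import math
    from typing import Dict, List, Tuple

    Position = Tuple[float, float]

    def controls_for_event(participants: Dict[str, Position],
                           split_distance: float) -> List[List[str]]:
        """Decide whether one combined view switching control is enough.

        While every pair of participants stays within `split_distance`,
        a single control carrying all identifiers is shown; as soon as
        any pair drifts apart, one control per virtual object is shown.
        """
        names = list(participants)
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                if math.dist(participants[a], participants[b]) >= split_distance:
                    return [[n] for n in names]      # split: one control each
        return [names]                               # keep the combined control

    # B and C still fight side by side -> one combined control [['B', 'C']]
    print(controls_for_event({"B": (10.0, 10.0), "C": (12.0, 11.0)}, 50.0))
    # C has left the fight -> split into [['B'], ['C']]
    print(controls_for_event({"B": (10.0, 10.0), "C": (300.0, 11.0)}, 50.0))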
In some embodiments, the view switching control may only contain an identification of a representative object of the target game event, where the representative object is one of a plurality of virtual objects corresponding to the target game event.
Optionally, the representative object is the one with the highest damage output among the plurality of virtual objects corresponding to the target game event; optionally, it is the one that has taken the most damage; optionally, it is the one with the highest healing amount; optionally, it is the one with the highest kill/death ratio score.
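Each of these optional strategies amounts to taking a maximum under a different scoring key. The following hypothetical sketch assumes the per-object statistics are available as plain attributes; the attribute names are stand-ins, not part of the disclosure.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Participant:
        name: str
        damage_dealt: float = 0.0     # "output"
        damage_taken: float = 0.0
        healing_done: float = 0.0
        kills: int = 0
        deaths: int = 0

        def kd_score(self) -> float:
            return self.kills / max(self.deaths, 1)

    def representative(objects: List[Participant],
                       key: Callable[[Participant], float]) -> Participant:
        """Pick the single object whose identifier the view switching control shows."""
        return max(objects, key=key)

    team = [Participant("B", damage_dealt=8200, kills=3, deaths=1),
            Participant("C", damage_dealt=5100, healing_done=4000, deaths=2),
            Participant("D", damage_taken=12000, kills=1, deaths=4)]

    print(representative(team, lambda p: p.damage_dealt).name)   # highest output -> B
    print(representative(team, lambda p: p.damage_taken).name)   # most damage taken -> D
    print(representative(team, lambda p: p.healing_done).name)   # highest healing -> C
    print(representative(team, Participant.kd_score).name)       # best kill/death -> B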
In some embodiments, a view switch control may be displayed on the graphical user interface in response to a computer automatically generated view switch on instruction.
There are various ways in which the computer may automatically generate the view switching on instruction. For example, in some embodiments, the computer may automatically generate a view switching on instruction each time a preset duration elapses. In other embodiments, the computer may automatically generate the view switching on instruction upon detecting a preset game event, where the preset game event may include a group battle event, an event in which the virtual character kills an enemy character, an event in which the virtual character returns to base, an event in which the virtual character teleports, an event in which the virtual character purchases equipment, and the like.
For example, in some embodiments, when it is detected that no enemy character is present near the virtual character, and the virtual character is stationary for a period of time, the computer may automatically generate a view switch on instruction.
For example, in some embodiments, whether a virtual character controlled by a player through a touch terminal is currently in a non-combat state is detected, and if so, a view switching on instruction is generated.
Wherein the non-combat state may include at least one of: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists in the preset distance range of the virtual object.
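The non-combat test is a simple disjunction of the conditions above. A minimal illustrative sketch, assuming character state and enemy positions are directly queryable (all names are hypothetical, and the interpretation of the "non-departure" condition as an undeployed flag is an assumption):

    import math
    from dataclasses import dataclass
    from typing import List, Tuple

    Position = Tuple[float, float]

    @dataclass
    class Character:
        position: Position
        is_dead: bool = False
        is_deployed: bool = True      # assumed reading of the "non-departure state" when False

    def in_non_combat_state(player: Character,
                            enemies: List[Character],
                            safe_radius: float) -> bool:
        """True if a view switching on instruction should be generated."""
        if player.is_dead or not player.is_deployed:
            return True
        # no enemy within the preset distance range of the virtual object
        return all(math.dist(player.position, e.position) > safe_radius
                   for e in enemies)

    hero = Character(position=(100.0, 100.0))
    foes = [Character(position=(900.0, 40.0))]
    if in_non_combat_state(hero, foes, safe_radius=300.0):
        print("generate view switching on instruction")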
In some embodiments, in order to facilitate user operation and improve user experience, a view switching control corresponding to the virtual object may be displayed in a skill application region of the user operation interface.
For example, in some embodiments, a skill control displayed on the graphical user interface may be changed to a view switching control.
In some embodiments, the skill control may also be enlarged when it is changed into the view switching control. The enlarged view switching control makes it easier for the user to clearly see the identifier on the control, so that the switching is accurate.
The shape and size of the skill application area can be set according to actual requirements, for example, the skill application area can be a rectangular area, a circular area, a square area and the like.
In some embodiments, the view switching control contains an identifier of the target object. The view switching control may take the form of a button and may include an identifier of the virtual object, such as the virtual object's name, avatar, or symbol.
In some embodiments, the target game event is a game event in which at least two virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal participate in the same battle.
For example, the target game event may be a group battle, the group battle includes at least two virtual objects participating in the same game battle, and the object identifier of the virtual object may be a character avatar corresponding to the virtual object.
The visual field switching controls can be arbitrarily placed in the skill application area, for example, the visual field switching controls can be arranged and displayed in the skill application area in a dot matrix manner, and the arrangement method can be left alignment, right alignment, upper alignment, lower alignment, middle alignment and the like.
In some embodiments, a skill application control may be included in the skill application area, and may be used to manipulate the virtual character to apply the skill in the virtual scene.
For example, referring to fig. 1b, in some embodiments, the screens displayed by the screen of the mobile terminal may include a user operation interface and a first game scene screen, wherein the user operation interface may include a skill application area 10 therein, and the skill application area 10 may include a skill application control 11, a skill application control 12, a skill application control 13, and a skill application control 14 therein.
When the user triggers the skill application control 11, the virtual character 01 can cast the virtual skill (energy wave) corresponding to the skill application control 11; when the user triggers the skill application control 12, the virtual character 01 can cast the virtual skill (reflective armor) corresponding to the skill application control 12; when the user triggers the skill application control 13, the virtual character 01 can cast the virtual skill (defense stance) corresponding to the skill application control 13; and when the user triggers the skill application control 14, the virtual character 01 can cast the virtual skill (basic attack) corresponding to the skill application control 14.
Thus, in some embodiments, the step of "displaying a view switching control corresponding to the virtual object in the skill application region of the user interface" may comprise the steps of:
(1) determining the control position of the skill application control in the skill application area;
(2) canceling the display of the skill application control in the skill application area;
(3) and displaying the visual field switching control corresponding to the target game event at the control position.
For example, referring to fig. 1b, in some embodiments, the control position of skill application control 11 is (2000, 100), that of skill application control 12 is (2100, 200), that of skill application control 13 is (2200, 100), and that of skill application control 14 is (2300, 300). The skill application controls may be removed from display in the skill application area, the view switching control 21 corresponding to target game event 1 may be displayed at control position (2100, 100), and the view switching control 22 corresponding to target game event 2 may be displayed at control position (2100, 200).
In some embodiments, if the number of target game events is greater than the number of skill application controls, additional view switching controls corresponding to the additional target game events may be additionally displayed at preset positions in the skill application area.
In some embodiments, if the number of target game events is less than the number of skill application controls, the display of redundant skill application controls may be cancelled.
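Steps (1) to (3) above, together with this count-mismatch handling, can be sketched as a simple position-reuse scheme. The coordinates echo the example given above, but the function, parameters, and overflow layout are assumptions for illustration, not the disclosed implementation.

    from typing import Dict, List, Tuple

    Position = Tuple[int, int]

    def layout_view_controls(skill_positions: List[Position],
                             events: List[str],
                             overflow_anchor: Position = (2400, 100),
                             overflow_step: int = 120) -> Dict[str, Position]:
        """Hide the skill application controls and reuse their positions
        for view switching controls, one per target game event.

        Extra events (more events than skill slots) are placed at preset
        positions; extra skill slots simply stay hidden.
        """
        placement: Dict[str, Position] = {}
        for i, event in enumerate(events):
            if i < len(skill_positions):
                placement[event] = skill_positions[i]          # reuse a skill slot
            else:
                extra = i - len(skill_positions)
                placement[event] = (overflow_anchor[0] + extra * overflow_step,
                                    overflow_anchor[1])        # preset extra position
        return placement

    skill_slots = [(2000, 100), (2100, 200), (2200, 100), (2300, 300)]
    print(layout_view_controls(skill_slots, ["group battle 1", "group battle 2"]))
    # {'group battle 1': (2000, 100), 'group battle 2': (2100, 200)}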
In some embodiments, after the step "displaying the view switching control corresponding to the target game event at the control position", the following steps may be further included:
responding to the visual field switching closing instruction, and canceling the display of the visual field switching control in the skill application area;
and displaying the skill application control at the control position.
In some embodiments, the view switch off command may be generated by the same triggering mode of operation as the view switch on command.
For example, in some embodiments, the view switching off instruction may be generated by the finger triggering the switching control again: when the touch operation on the switching control is responded to for the first time, a view switching on instruction is generated so that the view switching controls are displayed in the skill application area; when a touch operation on the switching control is responded to again, a view switching off instruction is generated so that the view switching controls are no longer displayed in the skill application area.
In some embodiments, the view switching off instruction may be generated by a triggering operation different from that of the view switching on instruction. For example, when the switching control is single-clicked, a view switching on instruction is generated so that the view switching controls are displayed in the skill application area; when the switching control is double-clicked, a view switching off instruction is generated so that the view switching controls are removed from the skill application area.
For example, as shown in fig. 1f, the skill application area may contain skill application controls, and the user operation interface may contain a view switching on control (see the figure). When the user clicks the view switching on control, the view switching controls are displayed in the skill application area and a view switching off control (labeled "skill") is displayed in the user operation interface; when the user clicks the view switching off control, the skill application controls are displayed in the skill application area again.
In some embodiments, the view switching on control and/or the view switching off control may be displayed in the skill application area of the user operation interface.
102. In response to a touch operation on the view switching control, determine whether a target game event currently exists in the game scene.
In some embodiments, the target game event comprises at least one of: a group battle event, a tower pushing event, a jungling event, and a dragon fighting event. Optionally, the target game event may be a group battle event.
In some embodiments, the step of determining whether a target game event currently exists in the game scene may comprise the steps of:
when the distance between teammates is smaller than a preset value and the teammates are in a combat state, determining that a target game event exists and corresponds to the teammates, where the teammates are virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal; and
when the distance between teammates in the game scene is not smaller than the preset value, or the teammates are in a non-combat state, determining that no target game event currently exists in the game scene.
For example, when the distance between any two of teammates A, B, and C is less than 10 meters and teammates A, B, and C are in a combat state, it can be determined that a group battle event corresponding to teammates A, B, and C exists in the game scene.
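Following this rule, the existence test reduces to a pairwise distance check over teammates that are currently fighting. A minimal illustrative sketch, with an assumed 10-meter threshold and an assumed data layout:

    import math
    from dataclasses import dataclass
    from itertools import combinations
    from typing import List, Optional, Tuple

    @dataclass
    class Teammate:
        name: str
        position: Tuple[float, float]
        in_combat: bool

    def find_group_battle(teammates: List[Teammate],
                          max_gap: float = 10.0) -> Optional[List[Teammate]]:
        """Return the teammates forming a target game event, or None."""
        fighting = [t for t in teammates if t.in_combat]
        for a, b in combinations(fighting, 2):
            if math.dist(a.position, b.position) < max_gap:
                # a group battle exists; collect everyone fighting near the pair
                return [t for t in fighting
                        if math.dist(t.position, a.position) < max_gap
                        or math.dist(t.position, b.position) < max_gap]
        return None

    mates = [Teammate("A", (5.0, 5.0), True),
             Teammate("B", (8.0, 6.0), True),
             Teammate("C", (9.0, 4.0), True),
             Teammate("D", (200.0, 50.0), False)]
    battle = find_group_battle(mates)
    print([t.name for t in battle] if battle else "no target game event")
    # ['A', 'B', 'C']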
103. If the target game event exists, a target position is determined according to the target game event.
The touch operation can be a single click, touch, double click, long press and the like.
For example, referring to fig. 1c, in response to a trigger operation of the user on the view switching control 21, a position corresponding to the virtual object 02 or the virtual object 03 corresponding to the view switching control 21 may be determined as a target position; in response to a user's trigger operation on the view switching control 22, a position corresponding to the virtual object 04 or the virtual object 05 corresponding to the view switching control 22 may be determined as a target position.
In some embodiments, step 103 may include the steps of:
determining the position of each virtual object corresponding to the target game event;
and determining a target position according to the position of each virtual object corresponding to the target game event.
In some embodiments, a target object may be randomly determined among a plurality of virtual objects corresponding to the target game event, and the position of the target object may be used as the target position.
In some embodiments, a target position may be obtained by performing calculation according to a position of each virtual object corresponding to the target game event, so that all the virtual objects corresponding to the target game event can be seen in the second game scene picture determined according to the target position.
In some embodiments, the target position may be averaged from the position of each virtual object corresponding to the target game event.
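The three options described above (pick one participant at random, fit every participant into the picture, or average the positions) differ only in how the target position is derived. An illustrative sketch, where the fit-all variant returns a bounding-box centre plus a half-extent under an assumed zoom model; none of these names come from the patent.

    import random
    from typing import List, Tuple

    Position = Tuple[float, float]

    def target_random(positions: List[Position]) -> Position:
        """Use the position of one randomly chosen participant."""
        return random.choice(positions)

    def target_average(positions: List[Position]) -> Position:
        """Average the participants' positions."""
        n = len(positions)
        return (sum(p[0] for p in positions) / n,
                sum(p[1] for p in positions) / n)

    def target_fit_all(positions: List[Position]) -> Tuple[Position, float]:
        """Centre of the bounding box plus the half-extent needed so that the
        second game scene picture shows every participant (zoom model assumed)."""
        xs = [p[0] for p in positions]
        ys = [p[1] for p in positions]
        centre = ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)
        half_extent = max(max(xs) - min(xs), max(ys) - min(ys)) / 2
        return centre, half_extent

    fight = [(100.0, 40.0), (130.0, 60.0), (90.0, 75.0)]
    print(target_average(fight))   # (106.66..., 58.33...)
    print(target_fit_all(fight))   # ((110.0, 57.5), 20.0)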
104. And determining a second game visual angle according to the target position, and switching the first game scene picture into a second game scene picture for observing the game scene at the second game visual angle.
Multiple target game events of the same type can occur simultaneously in the game scene; for example, multiple group battle events can take place at the same time. In addition, multiple target game events of different types can also occur simultaneously; for example, a group battle event, a jungling event, and a dragon fighting event can exist in the game scene at the same time.
In some embodiments, each type of target game event may correspond to only one view switching control. For example, when multiple group battle events exist in the game scene at the same time, only one view switching control may be displayed in the graphical user interface: "group battle event"; when a group battle event, a jungling event, and a dragon fighting event exist at the same time, three view switching controls may be displayed: "group battle event", "jungling event", and "dragon fighting event".
Optionally, each time the touch operation on the view switching control of one type of game event is responded to, the perspective can automatically switch so that the game events of that type are observed in turn.
Optionally, the switching order of the observation perspectives can be sorted by group battle intensity from high to low, where the intensity can be determined from group battle parameters of the virtual objects in the group battle event, such as skill cast frequency, number of ultimate skill casts, number of kills, number of deaths, and number of participants.
For example, in some embodiments, the switching order may follow the number of virtual objects participating in each group battle event, from most to fewest. If the game scene simultaneously contains a 4-player group battle, a 7-player group battle, and a 5-player group battle, the first time the player clicks the view switching control the picture switches to observe the 7-player group battle, the second click switches to the 5-player group battle, and the third click switches to the 4-player group battle.
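The tap-to-cycle behaviour and the ordering by group battle size can be sketched as a sorted list plus a cursor. Here the intensity metric is just the participant count, one of the factors named above; the class name and event labels are assumptions for illustration only.

    from typing import List, Tuple

    class EventCycler:
        """Cycle through concurrent group battle events, largest first."""

        def __init__(self, events: List[Tuple[str, int]]) -> None:
            # events: (label, number of participating virtual objects)
            self.order = sorted(events, key=lambda e: e[1], reverse=True)
            self.cursor = 0

        def next_event(self) -> str:
            """Called once per touch on the view switching control."""
            label, _ = self.order[self.cursor]
            self.cursor = (self.cursor + 1) % len(self.order)
            return label

    cycler = EventCycler([("group battle A", 4),
                          ("group battle B", 7),
                          ("group battle C", 5)])
    print(cycler.next_event())   # first tap  -> group battle B (7 players)
    print(cycler.next_event())   # second tap -> group battle C (5 players)
    print(cycler.next_event())   # third tap  -> group battle A (4 players)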
After the picture is switched, the content of the game minimap also changes accordingly. Thus, in some embodiments, the graphical user interface further includes a minimap whose display content contains a first thumbnail of the first game scene picture; accordingly, in step 104, the first thumbnail of the minimap may also be switched to a second thumbnail corresponding to the second game scene picture.
In some embodiments, the minimap may also be enlarged in response to an enlargement instruction for the minimap, and a view switching on instruction is generated.
In some embodiments, step 104 may be to implement the screen switch by canceling the display of the first game scene screen and displaying the second game scene screen.
In some embodiments, the second game scene screen may be switched to the first game scene screen in response to the end of the touch operation.
In some embodiments, a thumbnail viewing area may be included in the user interface, and step 104 may be displaying a second game scene in the thumbnail viewing area.
To allow the user to keep controlling the movement of the player character while observing a teammate character, in some embodiments, during the process of switching the first game scene picture to the second game scene picture in which the game scene is observed from the second game perspective, the virtual object is controlled to move in the game scene in response to a movement operation for the virtual object.
As can be seen from the above, the embodiments of this application can display a view switching control on the graphical user interface; determine, in response to a touch operation on the view switching control, whether a target game event currently exists and determine a target position from the association between the view switching control and the target game event; and determine a second game perspective according to the target position and switch the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
Because the user usually controls the virtual object to cast virtual skills through the skill application area, the view switching control can be arranged in that area; the user does not need to click the identifier of a virtual object on the minimap and can select the target game event to be observed simply by triggering the view switching control in the skill application area. The scheme therefore improves interaction efficiency.
The method described in the above embodiments is further described in detail below.
The interaction scheme provided by the embodiment of the application can be applied to various electronic game scenes. For example, the method of the embodiment of the present application will be described in detail by taking a MOBA-type mobile game as an example.
In the interaction scheme provided by this embodiment, the player can use the left hand to operate the virtual joystick to control the movement of the player character in the game scene, and use the right hand to select a virtual object control in the skill application area, thereby observing the corresponding teammate character. The specific flow of the game control method is as follows:
(I) The player can enable a convenient observation function in the game settings. Referring to fig. 2a, with the convenient observation function enabled, a view switching on control appears in the user interface in the normal mode (see the figure).
(II) The player can control the player character's movement with the left hand and click the view switching on control with the right hand (see the figure); after the click, the game enters the observation mode from the normal mode.
Referring to fig. 2b, in the observation mode the skill buttons (i.e., skill application controls) are changed into avatar buttons of the teammate characters participating in the group battle, together forming a combined avatar button.
In some embodiments, the combined avatar button may be enlarged by a preset factor, for example to 1.5 times the size of the original skill button, so that the avatars are displayed more clearly.
In the observation mode, when the player clicks the view switching off control (labeled "skill"), the game returns from the observation mode to the normal mode.
(III) The player can control the player character's movement with the left hand and press and hold the combined avatar button with the right hand; the picture then switches to the group battle picture so that the player can observe all teammates participating in the group battle.
(IV) When the player releases the long press on the combined avatar button, the picture switches back to the picture of the player character, so that the player can continue controlling the player character (a small sketch of this press-and-release handling follows).
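Steps (III) and (IV) describe a press-and-hold interaction: the picture follows the group battle only while the combined avatar button is held, snaps back to the player character on release, and left-hand movement keeps working throughout. A minimal, purely illustrative sketch of that press/release handling (all names assumed):

    from typing import Tuple

    Position = Tuple[float, float]

    class ObservationCamera:
        """Follows the player normally; follows the group battle while held."""

        def __init__(self, player_position: Position) -> None:
            self.player_position = player_position
            self.focus = player_position
            self.holding = False

        def on_avatar_button_press(self, group_battle_position: Position) -> None:
            self.holding = True
            self.focus = group_battle_position       # switch to the group battle picture

        def on_avatar_button_release(self) -> None:
            self.holding = False
            self.focus = self.player_position        # switch back to the player character

        def on_player_moved(self, new_position: Position) -> None:
            # The left-hand joystick keeps working while observing.
            self.player_position = new_position
            if not self.holding:
                self.focus = new_position

    cam = ObservationCamera(player_position=(10.0, 10.0))
    cam.on_avatar_button_press((250.0, 180.0))
    cam.on_player_moved((12.0, 10.0))        # player keeps moving during observation
    print(cam.focus)                         # (250.0, 180.0): still watching the fight
    cam.on_avatar_button_release()
    print(cam.focus)                         # (12.0, 10.0): back to the player character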
Therefore, in this embodiment, while the player controls the player character's movement, the player can observe the current situation of teammate characters through the right-hand skill keys, which makes observing teammates in a MOBA game faster and more convenient. In addition, the scheme lets the player track and observe different players more easily, so the overall situation of the match can be followed quickly and the teammates' situation observed better. The scheme can therefore improve interaction efficiency.
In order to better implement the method, the embodiment of the present application further provides a game control apparatus, which may be specifically integrated in an electronic device, where the electronic device may be a terminal, a server, or the like. The terminal can be a mobile phone, a tablet computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in the present embodiment, the method of the present embodiment will be described in detail by taking an example in which a game control device is specifically integrated in a mobile terminal.
For example, as shown in fig. 3, the game control apparatus may provide a graphical user interface through a touch terminal, where the graphical user interface includes a first game scene screen for viewing a game scene from a first game perspective, and the first game scene screen at least partially includes a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal, and the apparatus includes:
a control unit 301, configured to display a view switching control on a graphical user interface;
a target unit 302, configured to determine whether a target game event currently exists in a game scene in response to a touch operation for a view switching control; if the target game event exists, determining a target position according to the target game event;
the second view unit 303 is configured to determine a second game view according to the target position, and switch the first game scene picture to a second game scene picture for observing the game scene at the second game view.
In some embodiments, the apparatus is further configured to:
and responding to the end of the touch operation, and switching the second game scene picture to the first game scene picture.
In some embodiments, the apparatus is further configured to:
and updating the incidence relation between the visual field switching control and the target game event in response to the target game event.
In some embodiments, the view switching control contains an identifier of a virtual object corresponding to the target game event.
In some embodiments, the view switching control includes an identifier of a representative object of the target game event, where the representative object is one of a plurality of virtual objects corresponding to the target game event.
In some embodiments, a control unit to:
and responding to the view switching starting instruction, and displaying a view switching control on the graphical user interface.
In some embodiments, the graphical user interface further comprises a minimap, the display content of the minimap containing a first thumbnail of the first game scene screen; the target unit is further configured to:
and switching the first thumbnail of the minimap to a second thumbnail corresponding to the second game scene picture.
In some embodiments, the graphical user interface further comprises a minimap, and the apparatus is further configured to:
in response to a zoom-in instruction for the minimap, the minimap is zoomed in, and a view switching-on instruction is generated.
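A minimal sketch of the two minimap-related embodiments above follows: zooming in on the minimap generates a view switching-on instruction, and the first thumbnail is later replaced by one matching the second game scene picture. The MiniMap class and its method names are assumptions for illustration only.

```python
# Minimal sketch of the minimap behaviour; names are illustrative assumptions.
class MiniMap:
    def __init__(self):
        self.zoomed = False
        self.thumbnail = "first_scene_thumbnail"

    def on_zoom_in(self, emit_instruction):
        self.zoomed = True
        emit_instruction("view_switch_on")      # enables display of the view switching control

    def on_second_view_entered(self):
        # called once the first game scene picture has been switched to the second one
        self.thumbnail = "second_scene_thumbnail"

minimap = MiniMap()
minimap.on_zoom_in(lambda name: print("instruction:", name))
minimap.on_second_view_entered()
```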
In some embodiments, the control unit is configured to:
changing the skill control displayed on the graphical user interface to a view switching control.
In some embodiments, when the skill control displayed on the graphical user interface is changed to the view switching control, the apparatus is further configured to:
enlarge the skill control.
In some embodiments, the apparatus is further configured to:
and when the virtual object is detected to be in a non-combat state currently, generating a view switching opening instruction.
In some embodiments, the non-combat state includes at least one of: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists in the preset distance range of the virtual object.
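A minimal sketch of the non-combat check described above is given below; the attribute names (is_dead, has_departed, position) and the helper distance() are assumptions for illustration only.

```python
# Minimal sketch of detecting the non-combat state; names are illustrative assumptions.
import math
from types import SimpleNamespace

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def is_non_combat(virtual_object, enemies, preset_range=10.0):
    if virtual_object.is_dead:
        return True                    # the virtual object is in a death state
    if not virtual_object.has_departed:
        return True                    # the virtual object is in a non-departure state
    # no enemy within the preset distance range of the virtual object
    return all(distance(virtual_object.position, e.position) > preset_range
               for e in enemies)

hero = SimpleNamespace(is_dead=False, has_departed=True, position=(0.0, 0.0))
enemy = SimpleNamespace(position=(25.0, 0.0))
print(is_non_combat(hero, [enemy]))    # True: no enemy within the preset range
```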
In some embodiments, the target game event is a game event in which at least two virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal participate in the same battle.
In some embodiments, the apparatus is further configured to:
in the process of switching the first game scene picture to a second game scene picture in which the game scene is viewed at a second game perspective, the virtual object is controlled to move in the game scene in response to a movement operation for the virtual object.
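The following is a minimal sketch showing that, while the second game scene picture is displayed, a movement operation still drives the player-controlled virtual object, because the camera focus and the character position are handled independently; the Character and Camera classes are stand-ins assumed for illustration.

```python
# Minimal sketch: the character keeps moving while the camera shows the second view.
class Character:
    def __init__(self):
        self.position = [0.0, 0.0]

    def move(self, dx, dy):
        self.position[0] += dx
        self.position[1] += dy

class Camera:
    def __init__(self):
        self.focus = (0.0, 0.0)

    def look_at(self, point):
        self.focus = point

player = Character()
camera = Camera()
camera.look_at((50.0, 20.0))   # second game perspective, aimed at the target position
player.move(1.0, 0.0)          # the movement operation is still applied to the virtual object
```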
In some embodiments, the control unit is configured to:
when the distance, in the game scene, between at least two virtual objects that are in the same game lineup as the virtual object controlled by the player through the touch terminal is smaller than a preset value and a target game event exists, determining a combined virtual object identifier for representing the at least two virtual objects;
and displaying a view switching control containing the combined virtual object identifier on the graphical user interface;
the target unit is configured to:
in response to the touch operation for the view switching control, if the view switching control on which the touch operation acts contains the combined virtual object identifier, determining that the view switching control is associated with the at least two virtual objects, and determining the position of one of the at least two virtual objects as the target position.
In some embodiments, the apparatus is further configured to:
when it is detected that the distance between the at least two virtual objects in the game scene is greater than or equal to the preset value, switching the view switching control containing the combined virtual object identifier to view switching controls corresponding respectively to the at least two virtual objects, each containing the identifier of the corresponding virtual object.
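A minimal sketch of grouping close teammates under one combined view switching control and splitting them back into per-teammate controls when they move apart is given below; the data layout and the preset value are assumptions for illustration only.

```python
# Minimal sketch: merge or split view switching controls by teammate distance.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def build_view_switch_controls(teammates, preset_value=8.0, event_exists=True):
    """teammates: list of (identifier, position) tuples for same-lineup virtual objects."""
    if len(teammates) >= 2 and event_exists:
        close = all(distance(p1, p2) < preset_value
                    for i, (_, p1) in enumerate(teammates)
                    for _, p2 in teammates[i + 1:])
        if close:
            # one control carrying a combined identifier for all close teammates;
            # the target position is the position of one of them
            return [{"ids": [tid for tid, _ in teammates], "target": teammates[0][1]}]
    # otherwise one control per teammate, each carrying its own identifier
    return [{"ids": [tid], "target": pos} for tid, pos in teammates]

controls = build_view_switch_controls([("ally_1", (3.0, 4.0)), ("ally_2", (5.0, 4.5))])
```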
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
As can be seen from the above, in the game control apparatus of this embodiment, the control unit displays the view switching control on the graphical user interface; the target unit, in response to the touch operation for the view switching control, determines the target position according to the association relationship between the view switching control and the target game event; and the second view unit determines the second game perspective according to the target position and switches the first game scene picture to a second game scene picture in which the game scene is viewed at the second game perspective.
Therefore, the interaction efficiency can be improved.
Correspondingly, the embodiment of the present application further provides a computer device, where the computer device may be a terminal or a server, and the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a virtual machine, a personal computer, or a Personal Digital Assistant (PDA).
As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device 400 according to an embodiment of the present application, where the computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored in the memory 402 and running on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device configurations illustrated in the figures are not meant to be limiting of computer devices and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The processor 401 is a control center of the computer device 400, connects the respective parts of the entire computer device 400 using various interfaces and lines, performs various functions of the computer device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
displaying a view switching control on a graphical user interface;
responding to a touch operation for the view switching control, and determining a target object according to the association relationship between the view switching control and a virtual object;
and determining a second game visual angle according to the position of the target object in the game scene, and switching the first game scene picture into a second game scene picture for observing the game scene at the second game visual angle.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
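One possible way to hold the association relationship between each view switching control and a virtual object, mentioned in the steps above, is a simple mapping from control identifier to object identifier. The dictionary-based representation below is an assumption for illustration only.

```python
# Minimal sketch: resolve the target object from a control-to-object association.
control_to_object = {"view_switch_ally_1": "ally_1", "view_switch_ally_2": "ally_2"}

def resolve_target_object(control_id, objects_by_id):
    object_id = control_to_object.get(control_id)
    return objects_by_id.get(object_id) if object_id is not None else None

objects_by_id = {"ally_1": {"position": (12.0, 7.0)}, "ally_2": {"position": (30.0, 4.0)}}
target = resolve_target_object("view_switch_ally_1", objects_by_id)
```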
Optionally, as shown in fig. 4, the computer device 400 further includes: touch-sensitive display screen 403, radio frequency circuit 404, audio circuit 405, input unit 406 and power 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power source 407. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 4 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 403 may be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used, among other things, to display information entered by or provided to the user and various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel using any suitable object or accessory such as a finger or a stylus) and to generate corresponding operation instructions, which trigger the corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 401, and can receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to realize the input and output functions. In some embodiments, however, the touch panel and the display panel can be implemented as two separate components to perform the input and output functions respectively. That is, the touch display screen 403 may also be used as a part of the input unit 406 to implement an input function.
In the embodiment of the present application, a virtual application is executed by the processor 401 to generate a graphical user interface on the touch display screen 403, where a virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display screen 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 404 may be used for transmitting and receiving radio frequency signals, so as to establish wireless communication with a network device or another computer device and to exchange signals with the network device or the other computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data; the audio data is then processed by the processor 401 and sent, for example, to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earbud jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Optionally, the power source 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, power consumption management, and the like through the power management system. The power supply 407 may also include one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, or any other component.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment can display the view switching control on the graphical user interface; in response to a touch operation for the view switching control, determine the target position according to the association relationship between the view switching control and the target game event; and determine the second game perspective according to the target position and switch the first game scene picture to a second game scene picture in which the game scene is viewed at the second game perspective.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any one of the methods for game control provided by the embodiments of the present application. For example, the computer program may perform the steps of:
displaying a view switching control on a graphical user interface;
responding to touch operation aiming at the visual field switching control, and determining whether a target game event exists in a game scene currently;
if the target game event exists, determining a target position according to the target game event;
and determining a second game visual angle according to the target position, and switching the first game scene picture into a second game scene picture for observing the game scene at the second game visual angle.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer program stored in the storage medium can execute the steps in any of the game control methods provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods; for details, reference may be made to the foregoing embodiments, which are not repeated here.
The game control method, apparatus, storage medium, and computer device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method and the core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (18)

1. A method for controlling a game, wherein a graphical user interface is provided through a touch terminal, the graphical user interface includes a first game scene picture for viewing a game scene from a first game perspective, the first game scene picture at least partially includes a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal, the method includes:
displaying a view switching control on the graphical user interface;
responding to the touch operation aiming at the visual field switching control, and determining whether a target game event exists in a game scene currently;
if the target game event exists, determining a target position according to the target game event;
and determining a second game visual angle according to the target position, and switching the first game scene picture into a second game scene picture for observing a game scene with the second game visual angle.
2. The method of claim 1, wherein the target game event comprises at least one of: a group battle event, a tower pushing event, a jungling event, and a dragon fighting event.
3. The method of claim 1, further comprising:
and responding to the end of the touch operation, and switching the second game scene picture to the first game scene picture.
4. The method of claim 1, wherein the view switching control contains an identification of a virtual object corresponding to the target game event.
5. The method of claim 4, wherein the view switching control includes an identification of a representative object of the target game event, the representative object being one of a plurality of virtual objects corresponding to the target game event.
6. The method of claim 1, wherein displaying a view switching control on the graphical user interface comprises:
and responding to a visual field switching starting instruction, and displaying the visual field switching control on the graphical user interface.
7. The method of claim 1, wherein the graphical user interface further comprises a minimap, the display content of the minimap containing a first thumbnail of the first game scene screen;
if the target game event exists, after a target position is determined according to the target game event, the method further comprises the following steps:
and switching the first thumbnail of the small map to a second thumbnail corresponding to the second game scene picture.
8. The method of claim 6, wherein the graphical user interface further comprises a minimap, the method further comprising:
and responding to a magnification instruction aiming at the small map, magnifying the small map and generating a visual field switching opening instruction.
9. The method of claim 1, wherein displaying a view switching control on the graphical user interface comprises:
and changing the skill control displayed on the graphical user interface into the view switching control.
10. The method of claim 9, wherein when the skill control displayed on the graphical user interface is changed to a view switching control, the method further comprises:
and amplifying the skill control.
11. The method of claim 6, further comprising:
and when the virtual object is detected to be in a non-combat state currently, generating a view switching opening instruction.
12. The method of claim 11, wherein the non-combat state comprises at least one of: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists in the preset distance range of the virtual object.
13. The method of claim 1, wherein the target game event is a game event in which at least two virtual objects in the same game lineup as the virtual object controlled by the player through the touch terminal participate in the same battle.
14. The method of claim 1, further comprising:
and in the process of switching the first game scene picture to a second game scene picture for observing a game scene at the second game visual angle, controlling the virtual object to move in the game scene in response to the movement operation aiming at the virtual object.
15. The method of claim 5, wherein determining whether a target game event currently exists in a game scene comprises:
when the distance between teammates is smaller than a preset value and the teammates are in a fighting state, a target game event exists and corresponds to the teammates, and the teammates are virtual objects which are in the same game formation with the virtual objects controlled by the players through the touch control terminal;
when the distance between the teammates in the game scene is not smaller than a preset value or the teammates are in a non-combat state, a target game event does not exist in the game scene at present;
the responding to the touch operation of the view switching control, and if the target game event exists, determining a target position according to the target game event, including:
determining the position of each virtual object corresponding to the target game event;
and determining a target position according to the position of each virtual object corresponding to the target game event.
16. An apparatus for controlling a game, wherein a graphical user interface is provided through a touch terminal, the graphical user interface includes a first game scene screen for viewing a game scene from a first game perspective, the first game scene screen at least partially includes a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal, the apparatus comprising:
the control unit is used for displaying a visual field switching control on the graphical user interface;
the target unit is used for responding to the touch operation aiming at the visual field switching control and determining whether a target game event exists in a game scene at present; if the target game event exists, determining a target position according to the target game event;
and the second visual angle unit is used for determining a second game visual angle according to the target position and switching the first game scene picture into a second game scene picture for observing a game scene with the second game visual angle.
17. A terminal comprising a processor and a memory, said memory storing a plurality of instructions; the processor loads instructions from the memory to perform the steps of the method of game control according to any one of claims 1 to 15.
18. A computer readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the method of game control according to any of claims 1 to 15.
CN202110801764.0A 2021-07-15 2021-07-15 Game control method, game control device, terminal and storage medium Active CN113398565B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110801764.0A CN113398565B (en) 2021-07-15 2021-07-15 Game control method, game control device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113398565A true CN113398565A (en) 2021-09-17
CN113398565B CN113398565B (en) 2024-02-13

Family

ID=77686501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110801764.0A Active CN113398565B (en) 2021-07-15 2021-07-15 Game control method, game control device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113398565B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140038708A1 (en) * 2012-07-31 2014-02-06 Cbs Interactive Inc. Virtual viewpoint management system
US20180071632A1 (en) * 2016-09-14 2018-03-15 Bandai Namco Entertainment Inc. Game system, server, and information storage medium
CN109331468A (en) * 2018-09-26 2019-02-15 网易(杭州)网络有限公司 Display methods, display device and the display terminal at game visual angle
CN109675311A (en) * 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game
CN109806596A (en) * 2019-03-20 2019-05-28 网易(杭州)网络有限公司 Game picture display methods and device, storage medium, electronic equipment
CN111773711A (en) * 2020-07-27 2020-10-16 网易(杭州)网络有限公司 Game visual angle control method and device, storage medium and electronic device
CN112619137A (en) * 2021-01-06 2021-04-09 网易(杭州)网络有限公司 Game picture switching method and device, electronic equipment and storage medium
CN112791404A (en) * 2021-01-12 2021-05-14 网易(杭州)网络有限公司 Control method and device for virtual object in game and touch terminal

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113694529A (en) * 2021-09-23 2021-11-26 网易(杭州)网络有限公司 Game picture display method and device, storage medium and electronic equipment
CN113827969A (en) * 2021-09-27 2021-12-24 网易(杭州)网络有限公司 Interaction method and device for game objects
CN113797527A (en) * 2021-09-30 2021-12-17 腾讯科技(深圳)有限公司 Game processing method, device, equipment, medium and program product
CN113797527B (en) * 2021-09-30 2023-07-14 腾讯科技(深圳)有限公司 Game processing method, device, equipment, medium and program product
CN114210067A (en) * 2021-12-07 2022-03-22 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN114210067B (en) * 2021-12-07 2023-07-25 腾讯科技(深圳)有限公司 Virtual prop control method and device, storage medium and electronic equipment
WO2024055811A1 (en) * 2022-09-16 2024-03-21 腾讯科技(深圳)有限公司 Message display method and apparatus, device, medium, and program product
WO2024078324A1 (en) * 2022-10-14 2024-04-18 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, and storage medium and electronic device

Also Published As

Publication number Publication date
CN113398565B (en) 2024-02-13

Similar Documents

Publication Publication Date Title
CN113398565B (en) Game control method, game control device, terminal and storage medium
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113633963A (en) Game control method, device, terminal and storage medium
CN113350783B (en) Game live broadcast method and device, computer equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113082709A (en) Information prompting method and device in game, storage medium and computer equipment
WO2023005234A1 (en) Virtual resource delivery control method and apparatus, computer device, and storage medium
CN113398566A (en) Game display control method and device, storage medium and computer equipment
CN112843719A (en) Skill processing method, skill processing device, storage medium and computer equipment
CN112843716A (en) Virtual object prompting and viewing method and device, computer equipment and storage medium
CN113332724B (en) Virtual character control method, device, terminal and storage medium
CN115193064A (en) Virtual object control method and device, storage medium and computer equipment
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN114159789A (en) Game interaction method and device, computer equipment and storage medium
CN114159788A (en) Information processing method, system, mobile terminal and storage medium in game
CN113413595A (en) Information prompting method and device, electronic equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN114504819A (en) Game scene control method and device, computer equipment and storage medium
CN113426115A (en) Game role display method and device and terminal
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN115193062A (en) Game control method, device, storage medium and computer equipment
CN116870472A (en) Game view angle switching method and device, computer equipment and storage medium
CN116421968A (en) Virtual character control method, device, electronic equipment and storage medium
CN117654028A (en) Game display control method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant