CN113633963A - Game control method, device, terminal and storage medium - Google Patents

Game control method, device, terminal and storage medium

Info

Publication number
CN113633963A
Authority
CN
China
Prior art keywords
virtual object
switching control
game
game scene
visual field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110800941.3A
Other languages
Chinese (zh)
Inventor
胡佳胜
胡志鹏
程龙
刘勇成
袁思思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110800941.3A priority Critical patent/CN113633963A/en
Publication of CN113633963A publication Critical patent/CN113633963A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/307 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for displaying an additional window with a view from the top of the game field, e.g. radar screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Abstract

The embodiment of the application discloses a game control method, device, terminal and storage medium. According to the embodiment of the application, a graphical user interface can be provided through a touch terminal; the graphical user interface includes a first game scene picture in which a game scene is observed from a first game perspective, the first game scene picture at least partially displays a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal. In response to a first touch operation on a view switching control, a target object is determined according to the association relation between the view switching control and the virtual object; a second game perspective is then determined according to the position of the target object in the game scene, and the first game scene picture is switched to a second game scene picture in which the game scene is observed from the second game perspective. In the embodiment of the application, the user can bring up the second game scene picture through a convenient and efficient operation, so the scheme can improve interaction efficiency.

Description

Game control method, device, terminal and storage medium
Technical Field
The present application relates to the field of computer technology, and in particular to a game control method, device, terminal and storage medium.
Background
A user interface (UI) is the medium for human-computer interaction and information exchange between a system and its user: it presents system information in a form people can understand, so that the user can operate the computer conveniently and efficiently and two-way human-computer interaction is achieved. The UI often includes a minimap that indicates the positions of virtual objects in a virtual scene (for example, a game scene or a simulation scene), so as to guide the user to switch to observing a virtual object by clicking on the minimap.
However, when the identifiers of virtual objects on the minimap are close to each other, the user may not be able to tap exactly the identifier he or she intends to tap, and therefore may not be able to observe the intended virtual object. In addition, this operation mode is not user-friendly: while the player is performing other operations, reaching over to tap the minimap requires a large operation span, is difficult, and does not match the user's operating habits. As a result, the current interaction mode is inefficient.
Disclosure of Invention
The embodiment of the application provides a game control method, device, terminal and storage medium that can improve interaction efficiency.
An embodiment of the present application provides a game control method, in which a touch terminal provides a graphical user interface, the graphical user interface includes a first game scene picture in which a game scene is observed from a first game perspective, the first game scene picture at least partially displays a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal. The method includes:
displaying a view switching control on the graphical user interface;
in response to a first touch operation on the view switching control, determining a target object according to the association relation between the view switching control and the virtual object;
determining a second game perspective according to the position of the target object in the game scene, and switching the first game scene picture to a second game scene picture in which the game scene is observed from the second game perspective.
An embodiment of the present application further provides a game control apparatus, in which a touch terminal provides a graphical user interface, the graphical user interface includes a first game scene picture in which a game scene is observed from a first game perspective, the first game scene picture at least partially displays a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal. The apparatus includes:
a control unit, configured to display a view switching control on the graphical user interface;
a target unit, configured to determine a target object according to the association relation between the view switching control and the virtual object in response to a first touch operation on the view switching control;
and a second view unit, configured to determine a second game perspective according to the position of the target object in the game scene and switch the first game scene picture to a second game scene picture in which the game scene is observed from the second game perspective.
In some embodiments, the apparatus is further configured to:
and responding to the end of the first touch operation, and switching the second game scene picture to the first game scene picture.
In some embodiments, the apparatus is further configured to:
and updating the incidence relation between the visual field switching control and the virtual object in response to the second touch operation aiming at the visual field switching control.
In some embodiments, the apparatus is further configured to:
and updating the association relation between the visual field switching control and the virtual object in response to the preset game event.
In some embodiments, the view switching control includes a plurality of sub-controls, each corresponding to a respective one of the virtual objects.
In some embodiments, the view toggle control contains an identification of the target object.
In some embodiments, a control unit to:
and responding to the view switching starting instruction, and displaying a view switching control on the graphical user interface.
In some embodiments, the graphical user interface further comprises a minimap, the display content of the minimap containing a first thumbnail of the first game scene screen; the target unit is further configured to:
and switching the first thumbnail of the small map into a second thumbnail corresponding to the second game scene picture.
In some embodiments, the graphical user interface further comprises a minimap, the method further comprising:
in response to a zoom-in instruction for the minimap, the minimap is zoomed in, and a view switching on instruction is generated.
In some embodiments, the control unit is configured to:
change the skill control displayed on the graphical user interface to a view switching control.
In some embodiments, when the skill control displayed on the graphical user interface is changed to the view switching control, the method further comprises:
enlarging the skill control.
In some embodiments, the method further comprises:
detecting whether the virtual object controlled by the player through the touch terminal is currently in a non-combat state, and if so, generating a view switching on instruction.
In some embodiments, the non-combat state includes at least one of the following: the virtual object is dead, the virtual object has not set out, or no enemy is present within a preset distance of the virtual object.
In some embodiments, the target object is a virtual object in the same game play as the virtual object controlled by the player through the touch terminal.
In some embodiments, the method further comprises:
in the process of switching the first game scene picture to a second game scene picture in which the game scene is viewed at a second game perspective, the virtual object is controlled to move in the game scene in response to a movement operation for the virtual object.
In some embodiments, a control unit to:
when the distance between the first virtual object and the second virtual object in the game scene is smaller than a preset value, determining a combined virtual object identifier for representing the first virtual object and the second virtual object;
displaying a visual field switching control containing a combined virtual object identifier on a graphical user interface;
a target unit for:
responding to the touch operation aiming at the visual field switching control, if the visual field switching control acted by the touch operation contains the combined virtual object identifier, determining that the visual field switching control has an incidence relation with the first virtual object and the second virtual object respectively, and determining the first virtual object or the second virtual object as a target object.
In some embodiments, the apparatus is further configured to:
when the fact that the distance between the first virtual object and the second virtual object in the game scene is larger than or equal to a preset value is detected, the visual field switching control containing the combined virtual object identification is switched into a first visual field switching control and a second visual field switching control, the first visual field switching control contains the identification of the first virtual object, and the second visual field switching control contains the identification of the second virtual object.
The embodiment of the application also provides a terminal, which comprises a memory and a control unit, wherein the memory stores a plurality of instructions; the processor loads instructions from the memory to execute the steps of any one of the methods for game control provided by the embodiments of the present application.
Embodiments of the present application further provide a computer-readable storage medium, where a plurality of instructions are stored, where the instructions are suitable for being loaded by a processor to perform any of the steps in the method for controlling a game provided in the embodiments of the present application.
The embodiment of the application can display a view switching control on the graphical user interface; determine a target object according to the association relation between the view switching control and the virtual object in response to a first touch operation on the view switching control; and determine a second game perspective according to the position of the target object in the game scene, switching the first game scene picture to a second game scene picture in which the game scene is observed from the second game perspective.
Because the view switching control is arranged in the graphical user interface, the user does not need to click the identifier of a virtual object in the minimap; the virtual object to be observed can be selected directly by triggering the view switching control in the graphical user interface. This operation is easy, matches the user's operating habits, and allows the user to accurately trigger the view switching control he or she intends to trigger, so the operation is more accurate and simpler. Therefore, the efficiency with which the picture the user wants is displayed is improved.
Drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1a is an interaction diagram of the prior art;
FIG. 1b is a schematic diagram of skill application controls in the game control method provided by an embodiment of the present application;
FIG. 1c is a schematic diagram of view switching controls in the game control method provided by an embodiment of the present application;
FIG. 1d is a schematic flow chart of the game control method provided by an embodiment of the present application;
FIG. 1e is a schematic switching diagram of the game control method provided by an embodiment of the present application;
FIG. 1f is a schematic diagram of a view switching control corresponding to an object group in the game control method provided by an embodiment of the present application;
FIG. 2a is a schematic UI diagram of the normal mode when the game control method provided by an embodiment of the present application is applied to a game scene;
FIG. 2b is a schematic UI diagram of the observation mode when the game control method provided by an embodiment of the present application is applied to a game scene;
FIG. 3 is a schematic structural diagram of a game control apparatus provided by an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a game control method, a game control device, a terminal and a storage medium.
The game control device may be specifically integrated in an electronic device, and the electronic device may be a terminal, a server, or the like. The terminal may be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, a personal computer (PC), or the like; the server may be a single server or a server cluster composed of a plurality of servers.
In some embodiments, the game control apparatus may be integrated into a plurality of electronic devices, for example, the game control apparatus may be integrated into a plurality of servers, and the plurality of servers implement the game control method of the present application.
In some embodiments, the server may also be implemented in the form of a terminal.
Referring to fig. 1a, in current MOBA (Multiplayer Online Battle Arena) electronic games, identifiers of virtual objects may be displayed on a minimap 00, such as the identifier of the player character 01 currently operated by the player, the identifier of a character 02 operated by a teammate, the identifier of a game monster 03, the identifier of a defense tower 04, and so on. The player can switch to a picture observing the teammate character 02 by clicking the identifier of the teammate character 02 on the minimap 00, switch to a picture observing the game monster 03 by clicking the identifier of the game monster 03, switch to a picture observing the defense tower 04 by clicking the identifier of the defense tower 04, and so on.
Because observing teammates in the above manner is currently cumbersome, this embodiment proposes a game control method. Referring to fig. 1b, the electronic device may be a mobile terminal in which an electronic game runs. A graphical user interface may be displayed on the screen of the mobile terminal; the graphical user interface includes a first game scene picture in which the game scene is observed from a first game perspective, and the first game scene picture at least partially displays a virtual character 01. The graphical user interface may also include a skill application area 10, and the skill application area 10 may include skill application control 11, skill application control 12, skill application control 13, and skill application control 14. The first game scene picture is a picture in which the interaction between the virtual character 01 and the virtual scene is observed by moving the perspective to the associated position of the virtual character 01, and the virtual character 01 is a virtual object controlled by the user in person.
Then, referring to fig. 1c, in response to a view switching on instruction, a view switching control 21 corresponding to virtual object 01, a view switching control 22 corresponding to virtual object 02, a view switching control 23 corresponding to virtual object 03, and a view switching control 24 corresponding to virtual object 04 may be displayed in the skill application area 10 of the user operation interface. Suppose that, in response to the user's trigger operation on the view switching control 22, the target object 02 is determined among the virtual objects; a second game scene picture is then displayed. The second game scene picture is a picture in which the interaction between the target object 02 and the virtual scene is observed by moving the perspective to the associated position of the target object, that is, a picture in which the game scene is observed from the game perspective corresponding to the target object 02 (that is, the second game perspective).
Detailed descriptions are given below. The numbering of the following embodiments is not intended to limit their order of preference.
In this embodiment, a method for game control is provided, and as shown in fig. 1d, a specific flow of the method for game control may be as follows:
101. Display a view switching control on the graphical user interface.
The graphical user interface includes a user interface (UI) and a picture of the game scene. The UI is the medium for human-computer interaction and information exchange between the system and the user; it presents system information in a form people can understand, so that the user can operate the computer conveniently and efficiently and two-way human-computer interaction is achieved.
The user operation interface may be composed of visual elements such as controls, text, graphics, images, icons, and input boxes. For example, in some embodiments, the user operation interface may include a minimap, skill application controls, a virtual joystick, a scoring panel, and the like.
All the controls described below can be represented in the form of buttons, input boxes, graphics, text, and the like.
The game scene may be composed of virtual objects such as virtual characters, buildings, terrain, animals, and plants.
The virtual objects in the game scene may include virtual characters, virtual buildings, virtual props, and the like. For example, the virtual characters may include teammate characters operated by teammates, enemy characters operated by opposing users, summoned units, virtual monsters, virtual NPCs (non-player characters), and so on; the virtual buildings may include virtual defense towers, virtual springs (fountains), virtual revival points, and so on; the virtual props may include virtual pets, virtual mounts, dropped virtual coin pouches, and so on.
The virtual object can be set according to actual requirements. For example, in some embodiments, the virtual object may be set as a teammate character of the virtual character, that is, a character in the same game play as the virtual character. The teammate character may be a virtual object controlled by a real person, a virtual object controlled by computer AI (artificial intelligence), and so on.
The virtual character is played by the user and can be controlled by the user in person; for example, the user can control the virtual character to interact with the virtual scene, such as moving, hunting, and scouting in the virtual scene.
The game perspective refers to the viewing angle of the virtual camera lens; the virtual camera captures the game world for the player and presents it on the screen.
In some embodiments, the perspective may be located at an associated position of the virtual character. The associated position can be set according to actual requirements; it refers to a position that moves along with the virtual character, such as the virtual character's eyes, a point above its head, or a point above and behind its back. In some embodiments, a view switching control may be displayed on the graphical user interface in response to a view switching on instruction.
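As a rough illustration of the relationship between the perspective and the associated position, the following Python sketch anchors a virtual camera to an offset that moves with the character. The names (Vector3, VirtualCamera, follow_offset) are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class Vector3:
    x: float
    y: float
    z: float

@dataclass
class VirtualCamera:
    follow_offset: Vector3  # offset from the character, e.g. a point above its head

    def anchor_to(self, character_position: Vector3) -> Vector3:
        # The associated position moves together with the character,
        # so the camera anchor is simply character position plus offset.
        return Vector3(character_position.x + self.follow_offset.x,
                       character_position.y + self.follow_offset.y,
                       character_position.z + self.follow_offset.z)
```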
There are various ways to trigger the view switching on instruction. For example, in some embodiments, a view switching on instruction is triggered when it is detected that the user taps the back of the mobile terminal several times in succession. For example, in some embodiments, the UI may include a switching control, and a view switching on instruction may be triggered when it is detected that the user triggers the switching control by clicking, long-pressing, tapping, double-clicking, multi-clicking, or the like. For example, in some embodiments, a view switching on instruction may be triggered when it is detected that the user shakes the mobile terminal. The user may also enable view switching in the game settings interface to generate the view switching on instruction; alternatively, a view switching on control may be provided in the graphical user interface, and the view switching on instruction is generated by detecting the user's trigger operation on that control.
Thus, in some embodiments, step 101 may comprise the steps of:
displaying a switching control in the skill application area;
and responding to the triggering operation of the user for the switching control, and displaying a visual field switching control on the graphical user interface, wherein the visual field switching control is associated with the virtual object.
In some embodiments, the virtual object and the view switching control have a corresponding association relationship therebetween; in some embodiments, multiple virtual objects may be associated with the same view-switching control.
In some embodiments, the association relationship of the view switching control with the virtual object may be updated in response to a second touch operation for the view switching control.
The second touch operation may be double-click, single-click, touch, long-press, and the like.
In some embodiments, the association of the view switching control with the virtual object may be updated in response to a preset game event.
In some embodiments, the step of "displaying a view switching control on a graphical user interface" may comprise the steps of:
when the distance between the first virtual object and the second virtual object in the game scene is smaller than a preset value, determining a combined virtual object identifier for representing the first virtual object and the second virtual object;
displaying a view switching control containing the combined virtual object identifier on the graphical user interface.
In some embodiments, the step of "determining a target virtual object according to the association relation between the view switching control and the virtual object in response to the first touch operation on the view switching control" may include the following step:
in response to a touch operation on the view switching control, if the view switching control acted on by the touch operation contains the combined virtual object identifier, determining that the view switching control has an association relation with each of the first virtual object and the second virtual object, and determining the first virtual object or the second virtual object as the target virtual object.
In some embodiments, when it is detected that the distance between the first virtual object and the second virtual object in the game scene is greater than or equal to the preset value, the view switching control containing the combined virtual object identifier may be switched to a first view switching control and a second view switching control, where the first view switching control contains the identifier of the first virtual object and the second view switching control contains the identifier of the second virtual object.
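The merge-and-split behavior described above can be pictured with a short Python sketch. It is only a sketch under assumed types: the ui object and its show_combined_control / show_control methods are hypothetical, and the preset value is an arbitrary placeholder.

```python
import math

MERGE_DISTANCE = 8.0  # assumed preset value in game-scene units

def distance(a, b):
    return math.dist((a.x, a.y), (b.x, b.y))

def update_switch_controls(first_obj, second_obj, ui):
    if distance(first_obj.position, second_obj.position) < MERGE_DISTANCE:
        # Below the preset value: one control carrying a combined identifier
        # (e.g. both avatars), associated with both virtual objects.
        ui.show_combined_control(identifiers=[first_obj.avatar, second_obj.avatar],
                                 associated_objects=[first_obj, second_obj])
    else:
        # At or above the preset value: split back into separate controls,
        # each associated with a single virtual object.
        ui.show_control(identifier=first_obj.avatar, associated_object=first_obj)
        ui.show_control(identifier=second_obj.avatar, associated_object=second_obj)
```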
In some embodiments, a view switch control may be displayed on the graphical user interface in response to a computer automatically generated view switch on instruction.
There are various ways in which the computer may automatically generate the view switching on instruction. For example, in some embodiments, the computer may automatically generate a view switching on instruction every time a preset duration elapses. For example, in some embodiments, the computer may automatically generate a view switching on instruction upon detecting the occurrence of a preset game event, where the preset game event may include a team fight event, an event in which a virtual character kills an enemy character, an event in which a virtual character returns to base, an event in which a virtual character teleports, an event in which a virtual character purchases equipment, and the like.
For example, in some embodiments, when it is detected that no enemy character is present near the virtual character, and the virtual character is stationary for a period of time, the computer may automatically generate a view switch on instruction.
For example, in some embodiments, whether a virtual character controlled by a player through a touch terminal is currently in a non-combat state is detected, and if so, a view switching on instruction is generated.
The non-combat state may include at least one of the following: the virtual object is dead, the virtual object has not set out, or no enemy is present within a preset distance of the virtual object.
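A minimal Python sketch of this check might look as follows, assuming a hypothetical game-state and player API and an arbitrary preset distance; the actual conditions and values are implementation choices.

```python
SAFE_DISTANCE = 15.0  # assumed preset distance range

def is_non_combat(player, game_state) -> bool:
    # Non-combat if any of the listed conditions holds.
    no_nearby_enemy = all(game_state.distance(player, enemy) > SAFE_DISTANCE
                          for enemy in game_state.enemies)
    return player.is_dead or not player.has_set_out or no_nearby_enemy

def maybe_open_view_switch(player, game_state, ui):
    if is_non_combat(player, game_state):
        ui.dispatch("VIEW_SWITCH_ON")  # generate the view switching on instruction
```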
In some embodiments, in order to facilitate user operation and improve user experience, a view switching control corresponding to the virtual object may be displayed in a skill application region of the user operation interface.
For example, in some embodiments, a skill control displayed on the graphical user interface may be changed to a view switching control.
In some embodiments, the skill control may also be enlarged when it is changed to a view switching control. The enlarged view switching control makes it easier for the user to clearly see the identifier on the control, enabling accurate switching.
The shape and size of the skill application area can be set according to actual requirements, for example, the skill application area can be a rectangular area, a circular area, a square area and the like.
In some embodiments, the view switching control contains an identifier of the target object. The view switching control may be represented in the form of a button, and may include an identifier of a virtual object, such as the virtual object's name, avatar, or symbol.
In some embodiments, the target object is a virtual object in the same game play as a virtual character controlled by the player through the touch terminal.
For example, in some embodiments, the target object may be a teammate character, the teammate character being a virtual object in the same game play as the virtual character, and the object identifier of the virtual object may include the character avatar of the teammate character.
The view switching controls can be placed anywhere in the skill application area; for example, they can be arranged and displayed in the skill application area in a grid, and the arrangement can be left-aligned, right-aligned, top-aligned, bottom-aligned, center-aligned, and so on.
In some embodiments, a skill application control may be included in the skill application area, and may be used to manipulate the virtual character to apply the skill in the virtual scene.
For example, referring to fig. 1b, in some embodiments, the screen of the mobile terminal may display a user operation interface and a first game scene picture, where the user operation interface may include a skill application area 10, and the skill application area 10 may include skill application control 11, skill application control 12, skill application control 13, and skill application control 14.
When the user triggers skill application control 11, the virtual character 01 applies the virtual skill (energy wave) corresponding to skill application control 11; when the user triggers skill application control 12, the virtual character 01 applies the virtual skill (reflect armor) corresponding to skill application control 12; when the user triggers skill application control 13, the virtual character 01 applies the virtual skill (defense stance) corresponding to skill application control 13; when the user triggers skill application control 14, the virtual character 01 applies the virtual skill (basic attack) corresponding to skill application control 14.
Thus, in some embodiments, the step of "displaying a view switching control corresponding to the virtual object in the skill application area of the user operation interface" may comprise the following steps:
(1) determining the control position of the skill application control in the skill application area;
(2) canceling the display of the skill application control in the skill application area;
(3) displaying the view switching control corresponding to the virtual object at the control position.
For example, referring to fig. 1b, in some embodiments, the control position of skill application control 11 is (2000, 100), the control position of skill application control 12 is (2100, 200), the control position of skill application control 13 is (2200, 100), and the control position of skill application control 14 is (2300, 300); the skill application controls can be dismissed from display in the skill application area, and the view switching control 21 corresponding to the virtual object 01 is displayed at the control position (2000, 100), the view switching control 22 corresponding to the virtual object 02 is displayed at the control position (2100, 200), the view switching control 23 corresponding to the virtual object 03 is displayed at the control position (2200, 100), and the view switching control 24 corresponding to the virtual object 04 is displayed at the control position (2300, 300).
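One possible reading of steps (1)-(3), together with the reverse steps (3.1)-(3.2) described further below, is sketched here in Python. The ui object and its hide / show / show_view_switch_control methods are assumptions introduced only for illustration.

```python
def enter_observation_mode(skill_controls, virtual_objects, ui):
    positions = [control.position for control in skill_controls]  # (1) control positions
    for control in skill_controls:
        ui.hide(control)                                          # (2) cancel skill control display
    view_controls = []
    for position, obj in zip(positions, virtual_objects):
        # (3) show a view switching control for a virtual object at that position.
        # Extra virtual objects would need preset positions or a shared control (see below).
        view_controls.append(ui.show_view_switch_control(position=position,
                                                         identifier=obj.avatar,
                                                         associated_object=obj))
    return view_controls

def exit_observation_mode(view_controls, skill_controls, ui):
    for control in view_controls:
        ui.hide(control)       # (3.1) cancel display of the view switching controls
    for control in skill_controls:
        ui.show(control)       # (3.2) show the skill application controls again
```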
In some embodiments, if the number of virtual objects is greater than the number of skill application controls, additional view switching controls corresponding to the extra virtual objects may be displayed at preset positions in the skill application area, or two or more virtual objects may be made to correspond to the same view switching control.
In some embodiments, if the number of virtual objects is less than the number of skill application controls, the display of redundant skill application controls may be cancelled.
In some embodiments, after step "(3) displaying the view switching control corresponding to the virtual object at the control position", the method may further include the following steps:
(3.1) in response to a view switching off instruction, canceling the display of the view switching control in the skill application area;
(3.2) displaying the skill application control at the control position.
In some embodiments, the view switching off instruction may be generated by the same trigger operation as the view switching on instruction.
For example, in some embodiments, the view switching off instruction may be generated by triggering the same control again. For example, when a touch operation on the control is detected for the first time, a view switching on instruction is generated, so that the view switching controls are displayed in the skill application area; when a touch operation on the control is detected again, a view switching off instruction is generated, so that the view switching controls are no longer displayed in the skill application area.
In some embodiments, the view switching off instruction may be generated by a trigger operation different from the one that generates the view switching on instruction. For example, when the control is single-clicked, a view switching on instruction is generated, so that the view switching controls are displayed in the skill application area; when the control is double-clicked, a view switching off instruction is generated, so that the view switching controls are no longer displayed in the skill application area.
For example, as shown in fig. 1e, the skill application area may include skill application controls, and the user operation interface may include a view switching on control. When the user clicks the view switching on control, the view switching controls are displayed in the skill application area and a view switching off control (labeled "Skill") is displayed in the user operation interface; when the user clicks the view switching off control, the skill application controls are displayed in the skill application area again.
In some embodiments, the view switching on control and/or the view switching off control may be displayed in the skill application area of the user operation interface.
In some embodiments, a plurality of virtual objects may gather together. For example, during a team fight, teammate character B, teammate character C, and teammate character D may stand close together; in that case, observing the position of any one of them also brings the others into view, so the three teammate characters can correspond to the same view switching control.
Thus, in some embodiments, the view switching control includes a plurality of sub-controls, each corresponding to a respective one of the virtual objects.
In some embodiments, when a view switching control corresponds to multiple virtual objects, the view switching control may include the identifiers of those virtual objects, for example, their names, avatars, and the like.
For example, referring to fig. 1f, when the view switching control 30 corresponds to the virtual object 01 and the virtual object 03, the view switching control 30 may include an avatar of the virtual object 01, and an avatar of the virtual object 03.
When a plurality of virtual objects that had gathered together disperse again, the view switching control shared by those teammate characters can be split back into one view switching control per teammate character.
Therefore, in some embodiments, a plurality of virtual objects may correspond to one view switching control, and step "(3) displaying the view switching control corresponding to the virtual object at the control position" may include the following steps:
(1) determining a position of each virtual object in the virtual scene;
(2) grouping all the virtual objects based on the positions of the virtual objects to obtain at least one object group;
(3) displaying the view switching control corresponding to each object group in the skill application area of the user operation interface.
For example, virtual object 01, virtual object 02, and virtual object 03 may be grouped according to the relative distances between them to obtain at least one object group; the view switching control corresponding to each object group is then displayed in the skill application area of the user operation interface.
For example, in some embodiments, the step of "(2) grouping all the virtual objects based on their positions to obtain at least one object group" may comprise the steps of:
calculating a relative distance between the virtual objects based on the positions of the virtual objects;
and clustering the virtual objects based on the relative distance between the virtual objects to obtain at least one object group.
The clustering method has various types, and can be designed according to the actual application scenario, and therefore, the details are not described herein.
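As one illustrative choice (the embodiment deliberately leaves the clustering method open), the following Python sketch groups virtual objects greedily by relative distance; the threshold value is an assumption.

```python
import math

GROUP_DISTANCE = 10.0  # assumed threshold in game-scene units

def group_virtual_objects(objects):
    groups = []
    for obj in objects:
        placed = False
        for group in groups:
            # Join an existing group if any member is within the threshold.
            if any(math.dist((obj.x, obj.y), (member.x, member.y)) < GROUP_DISTANCE
                   for member in group):
                group.append(obj)
                placed = True
                break
        if not placed:
            groups.append([obj])  # start a new object group
    return groups
```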
In some embodiments, the view switching control corresponding to the object group may include an object group identifier, which may include an object identifier of each virtual object in the object group. For example, the object identification may be avatar information of the virtual object.
In some embodiments, the view toggle control for the object group may be larger than the view toggle control for the single virtual object, for example, in some embodiments, the view toggle control for the object group may be X times larger than the view toggle control for the single virtual object, where X is a positive number greater than 1.
102. In response to a first touch operation on the view switching control, determine a target object according to the association relation between the view switching control and the virtual object.
The first touch operation can be a single-click operation, a touch operation, a double-click operation, a long-press operation and the like.
For example, referring to fig. 1c, in response to a trigger operation of the user for the view switching control 21, the virtual object 01 corresponding to the view switching control 21 may be determined as a target object; in response to a triggering operation of the user on the view switching control 22, the virtual object 02 corresponding to the view switching control 22 may be determined as a target object; in response to the triggering operation of the user on the view switching control 23, the virtual object 03 corresponding to the view switching control 23 may be determined as a target object; in response to a user's trigger operation on the view switching control 24, the virtual object 04 corresponding to the view switching control 24 may be determined as a target object.
In some embodiments, when the view switching control corresponding to the object group is triggered, one of the virtual objects in the object group may be randomly selected as the target object.
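A minimal sketch of this random selection, with a hypothetical camera.focus_on call standing in for the perspective switch:

```python
import random

def on_group_control_triggered(object_group, camera):
    target = random.choice(object_group)  # pick one group member at random as the target object
    camera.focus_on(target.position)      # move the perspective to the target's position
    return target
```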
103. Determine a second game perspective according to the position of the target object in the game scene, and switch the first game scene picture to a second game scene picture in which the game scene is observed from the second game perspective.
After the picture is switched, the content of the game minimap also changes accordingly. Therefore, in some embodiments, the graphical user interface further includes a minimap, and the display content of the minimap contains a first thumbnail of the first game scene picture.
After the step of "determining a target object according to the association relation between the view switching control and the virtual object in response to the first touch operation on the view switching control", the method may further include:
switching the first thumbnail of the minimap to a second thumbnail corresponding to the second game scene picture.
In some embodiments, the minimap may also be enlarged in response to an enlargement instruction for the minimap, and a view switching on instruction is generated.
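A minimal Python sketch of the minimap behavior described in the last two paragraphs, assuming a hypothetical minimap/ui API; the zoom factor and instruction name are placeholders.

```python
def on_minimap_zoom_in(minimap, ui):
    minimap.set_scale(minimap.scale * 1.5)  # enlarge the minimap (factor assumed)
    ui.dispatch("VIEW_SWITCH_ON")           # also generate a view switching on instruction

def on_scene_switched(minimap, second_scene_thumbnail):
    # Swap the first thumbnail for one corresponding to the second game scene picture.
    minimap.set_thumbnail(second_scene_thumbnail)
```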
In some embodiments, step 103 may implement the picture switch by canceling the display of the first game scene picture and displaying the second game scene picture.
In some embodiments, the second game scene picture may be switched back to the first game scene picture in response to the end of the first touch operation.
In some embodiments, a thumbnail viewing area may be included in the user operation interface, and step 103 may display the second game scene picture in the thumbnail viewing area.
To allow the user to observe a teammate character while controlling the movement of his or her own character, in some embodiments, during the period in which the first game scene picture is switched to the second game scene picture in which the game scene is observed from the second game perspective, the virtual character is controlled to move in the game scene in response to a movement operation on the virtual character.
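Putting steps 101-103 together, the following Python sketch shows one possible event flow: pressing a view switching control shows the second game scene picture, movement input keeps working, and releasing the control restores the first picture. The engine-style objects (camera, ui, player_character) and their methods are assumptions for illustration only.

```python
def on_view_switch_touch_begin(control, camera, ui):
    target = control.associated_object                 # association relation -> target object
    second_view = camera.view_from(target.position)    # second game perspective
    ui.show_scene(second_view)                         # second game scene picture

def on_move_joystick(player_character, direction):
    # Movement keeps working while the second game scene picture is shown.
    player_character.move(direction)

def on_view_switch_touch_end(camera, player_character, ui):
    first_view = camera.view_from(player_character.position)
    ui.show_scene(first_view)                          # back to the first game scene picture
```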
As can be seen from the above, the embodiment of the application can display a view switching control on the graphical user interface; determine a target object according to the association relation between the view switching control and the virtual object in response to a first touch operation on the view switching control; and determine a second game perspective according to the position of the target object in the game scene, switching the first game scene picture to a second game scene picture in which the game scene is observed from the second game perspective.
Users usually control the virtual character to release skills through the skill application area, so the view switching controls can be arranged in that area. The user then does not need to click the identifier of a virtual object in the minimap; the virtual object to be observed can be selected directly by triggering a view switching control in the skill application area. This operation is easy, has a small operation span, matches the user's habits well, and allows the user to accurately trigger the view switching control he or she intends to trigger, so the operation is more accurate and simpler. Therefore, this scheme improves interaction efficiency.
The method described in the above embodiments is further described in detail below.
The interaction scheme provided by the embodiment of the application can be applied to various electronic game scenes. For example, the method of the embodiment of the present application will be described in detail by taking a MOBA-type mobile game as an example.
In the interaction scheme provided by the embodiment of the application, the player can operate the virtual joystick with the left hand to control the movement of the player character in the game scene, and select a view switching control in the skill application area with the right hand, so as to observe the corresponding teammate character. The specific flow of the game control method is as follows:
(I) The player can enable the convenient observation function in the game settings. Referring to fig. 2a, with the convenient observation function enabled, a view switching on control appears in the user interface.
In the normal mode, the view switching on control appears in the user interface.
(II) The player can control the player character with the left hand and click the view switching on control with the right hand; after the click, the game enters the observation mode from the normal mode. Referring to fig. 2b, in the observation mode the skill buttons (i.e., the skill application controls) become avatar buttons of the teammate characters.
In the observation mode, when the player clicks the view switching off control (labeled "Skill"), the game returns from the observation mode to the normal mode.
(III) The player can control the movement of the player character with the left hand and click the avatar button of any teammate character with the right hand to observe that teammate character.
(IV) If some teammates are fighting a team battle together, their avatar buttons are merged into one combined avatar button.
In some embodiments, the combined avatar button may be enlarged by a preset factor, for example to 1.5 times the size of the original skill button, so that the avatars are displayed more clearly.
In some embodiments, when the player clicks this combined avatar button, one of the teammate characters may be randomly selected for observation.
(V) If a teammate character in the combined avatar button leaves the observed view during observation, that teammate character's avatar button is separated from the combined avatar button, and the combined button is split back into individual avatar buttons.
Therefore, in the embodiment of the application, while controlling the player character's movement, the player can observe the current situation of a teammate character through the buttons under the right hand, which makes observing teammates in a MOBA game faster and more convenient. In addition, this scheme lets the player track and observe different players more conveniently, quickly follow the situation on the field, and better observe the state of teammates. Therefore, the scheme can improve interaction efficiency.
In order to better implement the method, the embodiment of the present application further provides a game control apparatus, which may be specifically integrated in an electronic device, where the electronic device may be a terminal, a server, or the like. The terminal can be a mobile phone, a tablet computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in this embodiment, the method of the embodiment of the present application will be described in detail by taking as an example a game control apparatus specifically integrated in a mobile terminal.
For example, as shown in fig. 3, the game control apparatus may provide a graphical user interface through a touch terminal, where the graphical user interface includes a first game scene picture in which a game scene is observed from a first game perspective, the first game scene picture at least partially displays a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal. The apparatus includes:
a control unit 301, configured to display a view switching control on a graphical user interface;
the target unit 302 is configured to determine a target object according to an association relationship between the view switching control and the virtual object in response to a first touch operation for the view switching control;
the second view unit 303 is configured to determine a second game view according to the position of the target object in the game scene, and switch the first game scene picture to a second game scene picture for observing the game scene at the second game view.
In some embodiments, the apparatus is further configured to:
and responding to the end of the first touch operation, and switching the second game scene picture to the first game scene picture.
In some embodiments, the apparatus is further configured to:
and updating the incidence relation between the visual field switching control and the virtual object in response to the second touch operation aiming at the visual field switching control.
In some embodiments, the apparatus is further configured to:
and updating the association relation between the visual field switching control and the virtual object in response to the preset game event.
In some embodiments, the view switching control includes a plurality of sub-controls, each corresponding to a respective one of the virtual objects.
In some embodiments, the view toggle control contains an identification of the target object.
In some embodiments, a control unit to:
and responding to the view switching starting instruction, and displaying a view switching control on the graphical user interface.
In some embodiments, the graphical user interface further comprises a minimap, the display content of the minimap containing a first thumbnail of the first game scene screen; the target unit is further configured to:
and switching the first thumbnail of the small map into a second thumbnail corresponding to the second game scene picture.
In some embodiments, the graphical user interface further includes a minimap, and the apparatus is further configured to:
zoom in the minimap and generate a view switching enabling instruction in response to a zoom-in instruction for the minimap, as sketched below.
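A loose sketch of the two minimap embodiments above, assuming the minimap simply stores which thumbnail it shows and whether it is zoomed in; the class and instruction names are assumptions, not the patent's API.

```python
# Hypothetical sketch: the minimap thumbnail follows the active scene picture,
# and zooming the minimap in also generates a view switching enabling instruction.
class Minimap:
    def __init__(self) -> None:
        self.thumbnail = "first_scene_thumbnail"
        self.zoomed_in = False

    def switch_thumbnail(self, second_scene_thumbnail: str) -> None:
        """After the view switches, display the thumbnail of the second scene picture."""
        self.thumbnail = second_scene_thumbnail

    def zoom_in(self) -> str:
        """Zoom in and emit a (hypothetical) view switching enabling instruction."""
        self.zoomed_in = True
        return "view_switching_enabling_instruction"

minimap = Minimap()
print(minimap.zoom_in())  # the enabling instruction is generated
minimap.switch_thumbnail("second_scene_thumbnail")
print(minimap.thumbnail)  # second_scene_thumbnail
```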
In some embodiments, the control unit is configured to:
change a skill control displayed on the graphical user interface into the view switching control.
In some embodiments, when the skill control displayed on the graphical user interface is changed into the view switching control, the apparatus is further configured to:
enlarge the skill control.
In some embodiments, the apparatus is further configured to:
detect whether the virtual object controlled by the player through the touch terminal is currently in a non-combat state, and if so, generate a view switching enabling instruction.
In some embodiments, the non-combat state includes at least one of the following: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists within a preset distance range of the virtual object.
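A hedged sketch of this check, assuming 2-D positions and a plain distance test; the dictionary keys and the name of the generated instruction are illustrative assumptions.

```python
import math

# Hypothetical sketch of the non-combat check: dead, not deployed, or no enemy
# within the preset distance range.
def is_non_combat(virtual_object: dict, enemies: list, preset_distance: float) -> bool:
    if virtual_object.get("dead"):
        return True
    if not virtual_object.get("deployed", True):  # "non-departure" state
        return True
    return all(math.dist(virtual_object["position"], e["position"]) > preset_distance
               for e in enemies)

def maybe_enable_view_switching(virtual_object: dict, enemies: list,
                                preset_distance: float = 10.0):
    """Generate the (hypothetical) enabling instruction only in a non-combat state."""
    if is_non_combat(virtual_object, enemies, preset_distance):
        return "view_switching_enabling_instruction"
    return None

player = {"dead": False, "deployed": True, "position": (0.0, 0.0)}
print(maybe_enable_view_switching(player, [{"position": (25.0, 3.0)}]))  # enabled
print(maybe_enable_view_switching(player, [{"position": (2.0, 1.0)}]))   # None
```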
In some embodiments, the target object is a virtual object in the same game match as the virtual object controlled by the player through the touch terminal.
In some embodiments, the apparatus is further configured to:
control the virtual object to move in the game scene in response to a movement operation for the virtual object while the first game scene picture is switched to the second game scene picture for observing the game scene from the second game perspective.
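A small sketch of this behaviour, assuming the spectating state only affects which perspective the camera uses while movement input keeps driving the player's own virtual object; the names are illustrative.

```python
# Hypothetical sketch: the movement operation still moves the controlled virtual
# object even while the game scene is being observed from the second perspective.
def apply_movement(player_position: tuple, move_vector: tuple, spectating: bool):
    x, y = player_position
    dx, dy = move_vector
    camera = "second_perspective" if spectating else "first_perspective"
    return (x + dx, y + dy), camera  # the avatar moves; only the camera differs

print(apply_movement((0.0, 0.0), (1.5, -0.5), spectating=True))
# ((1.5, -0.5), 'second_perspective')
```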
In some embodiments, the control unit is configured to:
determine, when the distance between a first virtual object and a second virtual object in the game scene is smaller than a preset value, a combined virtual object identifier for representing the first virtual object and the second virtual object;
display a view switching control containing the combined virtual object identifier on the graphical user interface;
the target unit is configured to:
in response to a touch operation on the view switching control, if the view switching control acted on by the touch operation contains the combined virtual object identifier, determine that the view switching control is associated with both the first virtual object and the second virtual object, and determine the first virtual object or the second virtual object as the target object.
In some embodiments, the apparatus is further configured to:
when it is detected that the distance between the first virtual object and the second virtual object in the game scene is greater than or equal to the preset value, split the view switching control containing the combined virtual object identifier into a first view switching control and a second view switching control, where the first view switching control contains the identifier of the first virtual object and the second view switching control contains the identifier of the second virtual object.
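The merge-and-split behaviour around the preset distance can be sketched as follows, again with invented names and simple 2-D positions; touching a combined control would then resolve to either of the two associated objects as the target.

```python
import math

# Hypothetical sketch: two nearby virtual objects share one view switching
# control with a combined identifier; once they separate beyond the preset
# value, the control splits back into two individual controls.
def build_view_switching_controls(first: dict, second: dict, preset_value: float) -> list:
    distance = math.dist(first["position"], second["position"])
    if distance < preset_value:
        # one control associated with both objects via a combined identifier
        return [{"ids": [first["id"], second["id"]], "combined": True}]
    return [{"ids": [first["id"]], "combined": False},
            {"ids": [second["id"]], "combined": False}]

a = {"id": "teammate_A", "position": (0.0, 0.0)}
b = {"id": "teammate_B", "position": (3.0, 4.0)}   # distance 5.0
print(build_view_switching_controls(a, b, preset_value=10.0))  # one combined control
print(build_view_switching_controls(a, b, preset_value=2.0))   # two separate controls
```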
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
As can be seen from the above, in the game control apparatus of the present embodiment, the control unit displays the view switching control on the graphical user interface; the target unit, in response to a first touch operation on the view switching control, determines a target object according to the association relationship between the view switching control and the virtual object; and the second view unit determines a second game perspective according to the position of the target object in the game scene and switches the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
Therefore, the interaction efficiency can be improved.
Correspondingly, the embodiment of the present application further provides a computer device, where the computer device may be a terminal or a server, and the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a virtual machine, a personal computer, or a Personal Digital Assistant (PDA).
As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device 400 according to an embodiment of the present application, where the computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored in the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device configuration illustrated in the figures does not constitute a limitation of computer devices, and more or fewer components than those illustrated may be included, some components may be combined, or a different arrangement of components may be used.
The processor 401 is a control center of the computer device 400, connects the respective parts of the entire computer device 400 using various interfaces and lines, performs various functions of the computer device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
displaying a view switching control on a graphical user interface;
responding to a first touch operation on the view switching control, and determining a target object according to the association relationship between the view switching control and the virtual object;
and determining a second game perspective according to the position of the target object in the game scene, and switching the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 4, the computer device 400 further includes: a touch display screen 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 4 does not constitute a limitation of computer devices, and more or fewer components than those illustrated may be included, some components may be combined, or a different arrangement of components may be used.
The touch display screen 403 may be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used, among other things, to display information entered by or provided to the user and the various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations of the user on or near the touch panel using any suitable object or accessory such as a finger or a stylus pen), and to generate corresponding operation instructions, according to which corresponding programs are executed. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401, and can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel, and when the touch panel detects a touch operation on or near it, the touch panel transmits the touch operation to the processor 401 to determine the type of the touch event, and then the processor 401 provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to realize the input and output functions. However, in some embodiments, the touch panel and the display panel can be implemented as two separate components to perform the input and output functions. That is, the touch display screen 403 may also be used as a part of the input unit 406 to implement an input function.
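The touch pipeline described in the paragraph above (touch detection device, touch controller, processor, display panel) can be sketched roughly as follows; the coordinate format, the screen-area boundary, and the class names are assumptions made only for this example.

```python
# Hypothetical sketch of the touch pipeline: the detection device reports a raw
# signal, the touch controller converts it into touch point coordinates, and the
# processor maps the coordinates to a visual output on the display panel.
class TouchController:
    def to_coordinates(self, raw_signal: dict) -> tuple:
        """Convert the raw touch signal into touch point coordinates."""
        return raw_signal["x"], raw_signal["y"]

class Processor:
    SKILL_AREA_Y = 900.0  # illustrative boundary of the skill/control area

    def handle_touch(self, point: tuple) -> str:
        """Determine the type of the touch event and the corresponding visual output."""
        _, y = point
        if y > self.SKILL_AREA_Y:
            return "render_view_switching_control"
        return "render_default_ui"

controller, processor = TouchController(), Processor()
point = controller.to_coordinates({"x": 512.0, "y": 960.0})
print(processor.handle_touch(point))  # render_view_switching_control
```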
In the embodiment of the present application, a virtual application is executed by the processor 401 to generate a graphical user interface on the touch display screen 403, where a virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display screen 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 404 may be used for transmitting and receiving radio frequency signals so as to establish wireless communication with a network device or another computer device, and for exchanging signals with the network device or the other computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data; the audio data is then processed by the processor 401 and sent, for example, to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earbud jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Optionally, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system. The power supply 407 may also include one or more DC or AC power sources, recharging systems, power failure detection circuits, power converters or inverters, power status indicators, or any other such components.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment may display a view switching control on the graphical user interface; respond to a first touch operation on the view switching control, and determine a target object according to the association relationship between the view switching control and the virtual object; and determine a second game perspective according to the position of the target object in the game scene, and switch the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any one of the methods for game control provided by the embodiments of the present application. For example, the computer program may perform the steps of:
displaying a view switching control on a graphical user interface;
responding to a first touch operation on the view switching control, and determining a target object according to the association relationship between the view switching control and the virtual object;
and determining a second game perspective according to the position of the target object in the game scene, and switching the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
The storage medium may include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disc, or the like.
Since the computer program stored in the storage medium can execute the steps in any of the game control methods provided in the embodiments of the present application, the beneficial effects that can be achieved by any of the game control methods provided in the embodiments of the present application can also be achieved; for details, reference may be made to the foregoing embodiments, which are not described herein again.
The game control method, apparatus, storage medium, and computer device provided in the embodiments of the present application are described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the descriptions of the foregoing embodiments are only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (20)

1. A method for controlling a game, wherein a graphical user interface is provided through a touch terminal, the graphical user interface includes a first game scene picture for observing a game scene at a first game perspective, the first game scene picture at least partially includes a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal, the method includes:
displaying a view switching control on the graphical user interface;
responding to a first touch operation on the view switching control, and determining a target object according to an association relationship between the view switching control and the virtual object;
and determining a second game perspective according to the position of the target object in the game scene, and switching the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
2. The method of claim 1, further comprising:
switching the second game scene picture to the first game scene picture in response to the end of the first touch operation.
3. The method of claim 1, further comprising:
updating the association relationship between the view switching control and the virtual object in response to a second touch operation on the view switching control.
4. The method of claim 1, further comprising:
updating the association relationship between the view switching control and the virtual object in response to a preset game event.
5. The method of claim 1, wherein the view switching control comprises a plurality of sub-controls, and each of the sub-controls corresponds to a respective virtual object.
6. The method of claim 1, wherein the view switching control contains an identifier of the target object.
7. The method of claim 1, wherein displaying a view switching control on the graphical user interface comprises:
displaying the view switching control on the graphical user interface in response to a view switching enabling instruction.
8. The method of claim 1, wherein the graphical user interface further comprises a minimap, and the display content of the minimap contains a first thumbnail of the first game scene picture;
after the responding to the first touch operation on the view switching control and determining the target object according to the association relationship between the view switching control and the virtual object, the method further comprises:
switching the first thumbnail of the minimap to a second thumbnail corresponding to the second game scene picture.
9. The method of claim 7, wherein the graphical user interface further comprises a minimap, the method further comprising:
zooming in the minimap and generating a view switching enabling instruction in response to a zoom-in instruction for the minimap.
10. The method of claim 1, wherein displaying a view switching control on the graphical user interface comprises:
changing a skill control displayed on the graphical user interface into the view switching control.
11. The method of claim 10, wherein when the skill control displayed on the graphical user interface is changed to a view switching control, the method further comprises:
enlarging the skill control.
12. The method of claim 7, further comprising:
detecting whether the virtual object controlled by the player through the touch terminal is currently in a non-combat state, and if so, generating a view switching enabling instruction.
13. The method of claim 12, wherein the non-combat state comprises at least one of the following: the virtual object is in a death state, the virtual object is in a non-departure state, and no enemy exists within a preset distance range of the virtual object.
14. The method of claim 1, wherein the target object is a virtual object in the same game match as the virtual object controlled by the player through the touch terminal.
15. The method of claim 1, further comprising:
in the process of switching the first game scene picture to the second game scene picture for observing the game scene from the second game perspective, controlling the virtual object to move in the game scene in response to a movement operation for the virtual object.
16. The method of claim 6, wherein displaying a view switching control on the graphical user interface comprises:
when the distance between a first virtual object and a second virtual object in the game scene is smaller than a preset value, determining a combined virtual object identifier for representing the first virtual object and the second virtual object;
displaying a view switching control containing the combined virtual object identifier on the graphical user interface;
wherein the responding to the first touch operation on the view switching control and determining the target object according to the association relationship between the view switching control and the virtual object comprises:
in response to the touch operation on the view switching control, if the view switching control acted on by the touch operation contains the combined virtual object identifier, determining that the view switching control is associated with the first virtual object and the second virtual object respectively, and determining the first virtual object or the second virtual object as the target object.
17. The method of claim 16, further comprising:
when it is detected that the distance between the first virtual object and the second virtual object in the game scene is greater than or equal to the preset value, splitting the view switching control containing the combined virtual object identifier into a first view switching control and a second view switching control, wherein the first view switching control contains the identifier of the first virtual object, and the second view switching control contains the identifier of the second virtual object.
18. An apparatus for controlling a game, wherein a graphical user interface is provided through a touch terminal, the graphical user interface comprises a first game scene picture for observing a game scene from a first game perspective, the first game scene picture at least partially comprises a virtual character, and the virtual character is a virtual object controlled by a player through the touch terminal, the apparatus comprising:
a control unit, used for displaying a view switching control on the graphical user interface;
a target unit, used for responding to a first touch operation on the view switching control and determining a target object according to an association relationship between the view switching control and the virtual object;
and a second view unit, used for determining a second game perspective according to the position of the target object in the game scene and switching the first game scene picture to a second game scene picture for observing the game scene from the second game perspective.
19. A terminal comprising a processor and a memory, said memory storing a plurality of instructions; the processor loads instructions from the memory to perform the steps of the method of game control according to any one of claims 1 to 17.
20. A computer readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the method of game control according to any of claims 1 to 17.
CN202110800941.3A 2021-07-15 2021-07-15 Game control method, device, terminal and storage medium Pending CN113633963A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110800941.3A CN113633963A (en) 2021-07-15 2021-07-15 Game control method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113633963A true CN113633963A (en) 2021-11-12

Family ID: 78417424

Country Status (1): CN — CN113633963A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105159687A (en) * 2015-09-29 2015-12-16 腾讯科技(深圳)有限公司 Information processing method, terminal and computer storage medium
CN105597310A (en) * 2015-12-24 2016-05-25 网易(杭州)网络有限公司 Game control method and device
US20190118078A1 (en) * 2017-10-23 2019-04-25 Netease (Hangzhou) Network Co.,Ltd. Information Processing Method and Apparatus, Storage Medium, and Electronic Device
CN109331468A (en) * 2018-09-26 2019-02-15 网易(杭州)网络有限公司 Display methods, display device and the display terminal at game visual angle
CN111773711A (en) * 2020-07-27 2020-10-16 网易(杭州)网络有限公司 Game visual angle control method and device, storage medium and electronic device
CN112619137A (en) * 2021-01-06 2021-04-09 网易(杭州)网络有限公司 Game picture switching method and device, electronic equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023236602A1 (en) * 2022-06-08 2023-12-14 网易(杭州)网络有限公司 Display control method and device for virtual object, and storage medium and electronic device
WO2024078324A1 (en) * 2022-10-14 2024-04-18 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, and storage medium and electronic device
CN115779446A (en) * 2023-01-29 2023-03-14 深圳市人马互动科技有限公司 Data processing method based on game task and related product

Similar Documents

Publication Publication Date Title
CN113398565B (en) Game control method, game control device, terminal and storage medium
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113633963A (en) Game control method, device, terminal and storage medium
CN113350793B (en) Interface element setting method and device, electronic equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113101657B (en) Game interface element control method, game interface element control device, computer equipment and storage medium
CN115040873A (en) Game grouping processing method and device, computer equipment and storage medium
CN112843719A (en) Skill processing method, skill processing device, storage medium and computer equipment
CN113332716A (en) Virtual article processing method and device, computer equipment and storage medium
CN113332724B (en) Virtual character control method, device, terminal and storage medium
CN115193064A (en) Virtual object control method and device, storage medium and computer equipment
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN115193043A (en) Game information sending method and device, computer equipment and storage medium
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN114159788A (en) Information processing method, system, mobile terminal and storage medium in game
CN113426115A (en) Game role display method and device and terminal
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN113867873A (en) Page display method and device, computer equipment and storage medium
CN113413595A (en) Information prompting method and device, electronic equipment and storage medium
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN117919705A (en) Game control method, game control device, computer equipment and storage medium
CN115193062A (en) Game control method, device, storage medium and computer equipment
CN115430145A (en) Target position interaction method and device, electronic equipment and readable storage medium
CN116474367A (en) Virtual lens control method and device, storage medium and computer equipment
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination