CN116271811A - Game operation method and device and electronic equipment - Google Patents

Game operation method and device and electronic equipment

Info

Publication number
CN116271811A
Authority
CN
China
Prior art keywords
virtual
game
state
user interface
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310195645.4A
Other languages
Chinese (zh)
Inventor
陶欣怡
林�智
刘勇成
胡志鹏
袁思思
程龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202310195645.4A
Publication of CN116271811A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a game operation method, a game operation device, and an electronic device. A graphical user interface is provided by a terminal device, the graphical user interface comprising at least part of a game scene, the part of the game scene comprising a first virtual character. The method comprises: displaying a first game screen on the graphical user interface, the first game screen comprising the first virtual character in a first state displayed in a first display mode, where the first state represents a state in which a preset virtual skill has not been released; and, in response to a trigger condition being met, displaying a second game screen on the graphical user interface, the second game screen comprising the first virtual character in a second state displayed in a second display mode, where the second state represents a state in which the preset virtual skill is released. The first display mode displays the character model of the first virtual character in the first state at a first size, and the second display mode displays the character model of the first virtual character in the second state at a second, larger size. The method can reduce the misoperation rate.

Description

Game operation method and device and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for game operation, and an electronic device.
Background
In a game, players are typically divided into at least two camps. Players in different camps compete against each other in the game scene by controlling their selected virtual characters so as to achieve game objectives, while players in the same camp can cooperate with each other in the game scene by controlling their selected virtual characters so as to defeat enemies.
At present, in a game scene involving a multi-player melee, it is often difficult to tell whether the game character releasing a virtual skill is a teammate or an enemy. In particular, when a teammate and an enemy have selected the same game character, a teammate releasing a virtual skill may be mistaken for an enemy and attacked, resulting in a high misoperation rate.
Therefore, there is a need for a game operation method that can reduce the misoperation rate.
Disclosure of Invention
The application provides a game operation method, a game operation device and electronic equipment.
In a first aspect, a method for game operation is provided, where a graphical user interface is provided by a terminal device, where the graphical user interface includes at least a part of a game scene, and the part of the game scene includes at least one first virtual character, and the method includes: displaying a first game screen on the graphical user interface, wherein the first game screen comprises: the first virtual character in a first state is displayed in a first display mode, wherein the first state represents a state in which a preset virtual skill is not released; in response to the trigger condition being met, displaying a second game screen on the graphical user interface, wherein the second game screen comprises: the first virtual character in a second state displayed in a second display mode, wherein the second state represents a state of releasing the preset virtual skills; the first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size.
A second aspect of the embodiments of the present application provides an apparatus for game operations, where a graphical user interface is provided by a terminal device, where the graphical user interface includes at least a part of a game scene, where the part of the game scene includes at least one first virtual character, and the apparatus includes a display unit, where the display unit is configured to: displaying a first game screen on the graphical user interface, wherein the first game screen comprises: the first virtual character in a first state is displayed in a first display mode, wherein the first state represents a state in which a preset virtual skill is not released; the display unit is further configured to: in response to the trigger condition being met, displaying a second game screen on the graphical user interface, wherein the second game screen comprises: the first virtual character in the second state displayed in a second display mode, wherein the second state represents a state of releasing the preset virtual skills; the first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size.
A third aspect of the embodiments of the present application further provides an electronic device, including: a processor; and a memory for storing a data processing program, where the electronic device, after being powered on, executes the program via the processor so as to perform the method described above.
A fourth aspect of the embodiments of the present application further provides a computer readable storage medium having one or more computer instructions stored thereon, where the instructions are executed by a processor to implement a method according to any one of the above-mentioned aspects.
It should be understood that the description in this section is not intended to identify key or critical features of the disclosed embodiments of the application, nor is it intended to be used to limit the scope of the disclosed embodiments of the application. Other features of the present disclosure will become apparent from the following specification.
According to the technical scheme of the game operation method, a graphical user interface is provided through terminal equipment, the graphical user interface comprises at least part of game scenes, and the part of game scenes comprise at least one first virtual character, and the method comprises the following steps: displaying a first game screen on the graphical user interface, wherein the first game screen comprises: the first virtual character is displayed in a first display mode and is in a first state, wherein the first state represents a state in which the preset virtual skills are not released; in response to the trigger condition being met, displaying a second game screen on the graphical user interface, wherein the second game screen comprises: the first virtual character in a second state displayed in a second display mode, wherein the second state represents a state of releasing preset virtual skills, and the first virtual character in the first state displayed in the first display mode is as follows: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size. 
In the above game operation method, a first game screen is displayed on the graphical user interface, where the first game screen includes: a first virtual character in a first state displayed in a first display mode; and, when a preset trigger condition is met, a second game screen is displayed on the graphical user interface, where the second game screen includes: the first virtual character in a second state displayed in a second display mode. The first state represents a state in which the preset virtual skill has not been released, and the second state represents a state in which the preset virtual skill is released. That is, in the above implementation, when the state of the first virtual character changes, the display mode used to display the first virtual character is updated in the graphical user interface (i.e., the first display mode is updated to the second display mode), where the size of the character model of the first virtual character displayed in the second display mode is larger than the size of the character model displayed in the first display mode. In other words, the second display mode is a highlighting mode compared with the first display mode, so that the first virtual character whose state has changed can be easily recognized, and the misoperation rate can be reduced.
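The state-driven display-mode switch described above can be sketched as follows. This is an illustration only, not part of the disclosed embodiments: the class names, the concrete scale values, and the `display_mode_for` function are all hypothetical.

```python
from dataclasses import dataclass
from enum import Enum


class SkillState(Enum):
    NOT_RELEASED = 1   # first state: preset virtual skill not yet released
    RELEASED = 2       # second state: preset virtual skill released


@dataclass(frozen=True)
class DisplayMode:
    model_scale: float  # size at which the character model is rendered


# Hypothetical values: the second display mode renders the character model
# at a larger size than the first, so a character releasing its skill
# stands out on the game screen.
FIRST_DISPLAY_MODE = DisplayMode(model_scale=1.0)
SECOND_DISPLAY_MODE = DisplayMode(model_scale=1.5)


def display_mode_for(state: SkillState) -> DisplayMode:
    """Select the display mode from the character's current skill state."""
    if state is SkillState.RELEASED:
        return SECOND_DISPLAY_MODE
    return FIRST_DISPLAY_MODE
```

When the trigger condition is met, the character's state changes to `RELEASED`, and the next frame is rendered with the larger model scale; the enlarged model is what makes the state change easy to recognize.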
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments of the present application will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an application scenario suitable for a method of game operation provided in an embodiment of the present application.
Fig. 2 is a schematic diagram of a graphical user interface provided by the terminal device 101 in the application scenario shown in fig. 1 described above.
Fig. 3 is a schematic diagram of a graphical user interface provided by the terminal device 102 in the application scenario shown in fig. 1 described above.
Fig. 4 is a schematic diagram of a method of game operation provided in an embodiment of the present application.
FIG. 5A is a schematic illustration of a graphical user interface of the method of game play described above with respect to FIG. 4.
FIG. 5B is a schematic illustration of another graphical user interface of the method of game play described above with respect to FIG. 4.
FIG. 6 is a schematic diagram of another method of game play provided in an embodiment of the present application.
FIG. 7 is a schematic diagram of a graphical user interface of a method of game play described above with respect to FIG. 6.
Fig. 8 is a schematic diagram of a method of further game operation provided in an embodiment of the present application.
FIG. 9 is a schematic diagram of a graphical user interface of a method of game play described above with respect to FIG. 8.
Fig. 10 is a schematic structural view of a game operation device according to an embodiment of the present application.
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present application, the present application is clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. This application is not intended to be limited to the details shown and described or to the particular embodiments disclosed; rather, it is intended to cover all embodiments falling within the scope of the appended claims.
It should be noted that the terms "first," "second," "third," and the like in the claims, specification, and drawings herein are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. The data so used may be interchanged where appropriate to facilitate the embodiments of the present application described herein, and may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and their variants are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The various triggering events disclosed in the specification can be preset through a system, or can be set in real time after receiving an operation instruction of a user in the process of program operation. It will be appreciated that different trigger events will trigger execution of different functions accordingly.
For ease of understanding, technical terms that may be involved in embodiments of the present application will first be briefly described.
A graphical user interface is an interface display format through which a person communicates with a computer. It allows a user to manipulate icons, marks, or menu options on a screen using an input device such as a mouse or a keyboard, or by performing touch operations on the touch screen of a touch terminal, in order to select a command, start a program, or perform some other task.
A virtual scene is a scene that an application program displays (or provides) when running on a terminal or a server. Optionally, the virtual scene is a simulation of the real world, a semi-simulated, semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be a two-dimensional virtual scene or a three-dimensional virtual scene, and the virtual environment can be sky, land, ocean, and the like, where the land includes environmental elements such as deserts and cities. The virtual scene is the scene in which the complete game logic of user-controlled virtual characters runs.
A virtual character refers to a character in a virtual environment. It may be a character manipulated by a player, including but not limited to at least one of a virtual person, a virtual animal, and a cartoon character, or it may be a non-player character (NPC) not manipulated by a player. Optionally, when the virtual environment is a three-dimensional virtual environment, the virtual characters are three-dimensional virtual models, each having its own shape and volume in the three-dimensional virtual environment and occupying part of its space. Optionally, the virtual character is a three-dimensional character constructed based on three-dimensional human-skeleton technology, which presents different external appearances by wearing different skins. In some implementations, the virtual character may also be implemented using a 2.5-dimensional or 2-dimensional model, which is not limited by the embodiments of the present application. There may be multiple virtual characters in the virtual scene; each is either a character manipulated by a player (i.e., controlled through an input device) or an artificial intelligence (AI) trained to fight in the virtual environment. Optionally, the virtual character is a game character competing in the game scene. Optionally, the number of virtual characters in the game-scene fight is preset, or is dynamically determined according to the number of terminal devices joining the virtual fight, which is not limited in the embodiments of the present application. In one possible implementation, a user can control a virtual character to move in the virtual scene, e.g., to run, jump, or crawl, and can also control the virtual character to fight other virtual characters using virtual skills, virtual props, and the like provided by the application.
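A minimal data sketch of a virtual character as characterized above. The field and method names are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass


@dataclass
class VirtualCharacter:
    name: str
    player_controlled: bool           # player-manipulated vs. NPC / trained AI
    model_dimensions: float = 3.0     # 3-D, 2.5-D, or 2-D model
    skin: str = "default"             # different skins give different looks
    position: tuple = (0.0, 0.0, 0.0)

    def move_to(self, new_position: tuple) -> None:
        """User-controlled movement in the virtual scene (run, jump, crawl, ...)."""
        self.position = new_position
```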
The game interface is an interface corresponding to the application program provided or displayed through the graphical user interface, and the interface comprises a UI interface and a game picture for the player to interact. In alternative embodiments, game controls (e.g., skill controls, movement controls, functionality controls, etc.), indication identifiers (e.g., direction indication identifiers, character indication identifiers, etc.), information presentation areas (e.g., number of clicks, time of play, etc.), or game setting controls (e.g., system settings, stores, gold coins, etc.) may be included in the UI interface. In an alternative embodiment, the game screen may be a display screen corresponding to the virtual scene displayed by the terminal device, may be a game interface for viewing a virtual task in the game, or may be a parameter configuration interface in a game preparation stage, and the display screen corresponding to the virtual scene may include virtual objects such as a game character, an NPC character, and an AI character for executing game logic in the virtual scene.
The third-person perspective is the viewing angle from which the user observes the game screen. From the third-person perspective, all elements in the scene can be observed, and no dizziness is produced. The user can also anticipate, from the elements presented in the view (i.e., the picture), the operation that may take place in the next step. In this application, the third-person perspective is the perspective of the user or current player; perspectives in which the visual effect is the same or similar are treated in the same way.
The first-person perspective is a viewing angle that simulates human (user) vision in the picture, and gives a strong sense of immersion. The first-person perspective is highly subjective: the user can only observe the scene through the eyes of the character holding the perspective, i.e., the virtual character.
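The difference between the two perspectives can be illustrated with a rough camera-placement sketch. The vector arithmetic and parameter values here are generic illustrations, not taken from the patent:

```python
# First-person: the camera sits at the character's eye position.
# Third-person: the camera is pulled back and raised behind the character
# so that the scene (and the character itself) stays visible.

def first_person_camera(character_pos, eye_height=1.7):
    x, y, z = character_pos
    return (x, y + eye_height, z)  # camera at the character's eyes


def third_person_camera(character_pos, distance=6.0, height=3.0):
    x, y, z = character_pos
    return (x, y + height, z - distance)  # camera behind and above
```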
As in the present exemplary embodiment, the virtual character can be manipulated in various ways. For example, the manipulation may be a finger operation on the touch interface, i.e., one of touching, clicking, double-clicking, panning, pressing, sliding, and the like, or a combination of two or more of these, for example sliding the finger while pressing.
The response actions corresponding to the same operation may differ: for example, the display screen may be enlarged or reduced by a single-click operation in one case and by a double-click operation in another. The foregoing is by way of example only and is not to be construed as limiting in any way.
A single click is typically used as a quick selection, for example to choose an option, control the game character to perform an action, or quickly place an item.
A double click is typically used to confirm a selection or to open or close the current window, and may also be used to equip an item or guide the movement of the game character.
A long press is typically used to confirm the selection of an item or an action.
Dragging is a common operation that helps a user drag a certain "game element" in the game, such as a play object or an object in the scene, from one place to another.
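The gesture-to-action mapping sketched in the paragraphs above might look like the following. The concrete bindings are illustrative only; as the text notes, a given game may bind the same gesture to different responses:

```python
# Illustrative gesture dispatch table; the bindings vary per game.
GESTURE_ACTIONS = {
    "click": "select_option",      # quick selection / quick item placement
    "double_click": "confirm",     # confirm selection, open/close window
    "long_press": "confirm_item",  # confirm choice of an item or action
    "drag": "move_element",        # move a game element between places
}


def handle_gesture(gesture: str) -> str:
    """Resolve a touch gesture to the action it triggers, if any is bound."""
    return GESTURE_ACTIONS.get(gesture, "ignored")
```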
Hereinafter, application scenarios of the method of game operation and the method of game operation applied to the embodiments of the present application will be described in detail with reference to the accompanying drawings. It is to be understood that the following embodiments and features thereof may be combined with each other without conflict between the embodiments provided in the present application. In addition, the sequence of steps in the method embodiments described below is only an example and is not strictly limited.
First, application scenarios of a method applicable to game operations of the embodiments of the present application will be described with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an application scenario suitable for a method of game operation provided in an embodiment of the present application. By way of example, the application scenario illustrated in fig. 1 includes a plurality of terminal devices (i.e., terminal device 101 and terminal device 102), a server 103, and a network 104. Wherein any one of a plurality of terminal apparatuses (e.g., terminal apparatus 101 or terminal apparatus 102) communicates with and data interacts with server 103 via network 104.
The terminal devices (i.e., the terminal device 101 and the terminal device 102) are clients installed with application programs, and the running result of an application program can be displayed to the user through a graphical user interface (UI) provided by the touch display of the terminal device. The application program installed in the terminal device may be one that needs to be downloaded and installed, or an installation-free, click-to-run application, which is not limited in the embodiments of the present application. The type of the terminal device is not particularly limited; it may be a mobile terminal, for example, but not limited to, any one of the following: smartphones, tablet computers, gaming machines, palmtop computers (personal digital assistants, PDA), etc., which can interact with a user through input devices such as keyboards, virtual keyboards, touch pads, touch screens, and voice-controlled devices. The operating system of the mobile terminal may include Android, iOS, Windows Phone, Windows, etc., and can generally support the running of various games.
The application installed in the terminal device may be any application that can provide a virtual environment in which a virtual object substituted for and operated by a user is active. By way of example, the application may be a game application such as a massively multiplayer online role-playing game (MMORPG), a massively multiplayer online (MMO) game, a massively multiplayer online simulation (MMOS) game, a third-person shooter (TPS) game, or a multiplayer warfare survival game, among others. Of course, other types of applications besides game applications may also present virtual objects to users and provide corresponding functionality to those virtual objects. That is, the application installed in the terminal device may be a type of application other than a game application. For example, such other applications may be, but are not limited to, any of the following: virtual reality (VR) applications, augmented reality (AR) applications, three-dimensional map applications, military simulation applications, social applications, or interactive entertainment applications. In addition, different applications provide virtual objects in different forms with different corresponding functionality, which can be configured in advance according to actual requirements; this is not limited in the embodiments of the present application.
In some implementations, the application is an application developed based on a three-dimensional virtual environment engine, for example, the virtual environment engine is a Unity engine, and the virtual environment engine can construct a three-dimensional virtual environment, virtual objects, virtual props and the like, so as to bring more immersive game experience to the user. The virtual environment is a scene that is displayed (or provided) when a client of an application program (e.g., a game application program) runs on a terminal device, and the virtual environment refers to a scene that is created for a virtual object to perform an activity (such as a game competition), such as a virtual house, a virtual island, a virtual map, a virtual building, and the like. The virtual environment may be a simulation environment for the real world, a semi-simulation and semi-imaginary environment, or a pure imaginary environment. The virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in the embodiment of the present application. The virtual object may be a virtual character controlled by a user account in an application program, or may be a virtual character controlled by a computer program in the application program. Taking an application program as a game application program as an example, the virtual object can be a game character controlled by a user account in the game application program, or can be a game monster controlled by a computer program in the game application program. The virtual object may be in the form of a character, which may be an animal, a cartoon, or other form, and embodiments of the present application are not limited in this regard. The virtual object may be displayed in a three-dimensional form or a two-dimensional form, which is not limited in the embodiment of the present application. 
Alternatively, when the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional stereoscopic model created based on an animated skeleton technique. Each virtual object has its own shape and volume in the three-dimensional virtual environment, occupying a portion of the space in the three-dimensional virtual environment.
In some implementations, the application program is a game application program, and after the terminal device runs the game application program, the touch display of the terminal device obtains a corresponding graphical user interface through rendering, where the touch display includes a touch panel and a display panel. The touch panel may collect touch or non-touch operations by the user on or near it (for example, see fig. 2, which shows operations of the user's finger 1013 on or near the touch panel in the graphical user interface 1012) and generate preset operation instructions. The touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and touch gesture, detects the signals produced by the touch operation, and transmits these signals to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can handle, sends that information to the processor of the terminal device, and can receive and execute commands sent by the processor. The touch display or a general display may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. It will be appreciated that in the above implementation, the graphical user interface displayed by the display of the terminal device may be used to display a partial or complete game screen, which is not particularly limited in the embodiments of the present application.
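The touch-signal path described above (touch detection device, then touch controller, then processor) can be sketched as a small pipeline. All class and method names here are hypothetical illustrations of the description, not part of the disclosed embodiments:

```python
from dataclasses import dataclass


@dataclass
class RawTouch:
    x: float
    y: float
    gesture: str  # e.g. "press", "slide"


class TouchController:
    """Converts raw touch signals into processor-ready information."""

    def convert(self, raw: RawTouch) -> dict:
        return {"position": (raw.x, raw.y), "gesture": raw.gesture}


class Processor:
    """Executes the command derived from the touch information."""

    def execute(self, command: dict) -> str:
        # In a real device this would update the game state / GUI.
        return f"{command['gesture']} at {command['position']}"


def handle_touch(raw: RawTouch) -> str:
    # detection device -> touch controller -> processor, as in the description
    controller, processor = TouchController(), Processor()
    return processor.execute(controller.convert(raw))
```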
Optionally, in other implementations, the application program is a game application program, and after the terminal device runs the game application program, the general display of the terminal device obtains a corresponding graphical user interface through rendering, where the mouse cursor may control a virtual character in a game screen displayed by the graphical user interface obtained by rendering of the general display. (for example, referring to fig. 3, the mouse cursor 1022 displayed in the graphical user interface 1021 shown in fig. 3 may control the movement direction of the virtual character, etc., wherein the mouse cursor 1022 may be controlled by a hardware mouse to which the terminal device 102 is connected).
The server 103 provides background services for applications in the terminal device. For example, the server 103 may be a background server of the application program described above. The server 103 may be a single server, a server cluster formed by a plurality of servers, or a cloud computing service center. Alternatively, the server 103 may provide background services for applications in a plurality of terminal devices at the same time.
The network 104 may be a wired network or a wireless network, which is not particularly limited in this application. The wireless or wired network uses standard communication techniques and/or protocols. The network is typically the internet, but may be any network, including, but not limited to, a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a mobile network, a wired or wireless network, a private network, a virtual private network, or any combination thereof.
It should be understood that the application scenario shown in fig. 1 and the graphical user interface of the terminal device 101 shown in fig. 2 are merely schematic and not limiting in any way. For example, the application scenario illustrated in fig. 1 may also include a greater number of terminal devices or servers 103. As another example, the terminal device shown in fig. 1 may also display a plurality of graphical user interfaces via a display.
Next, a method of game operations provided in the embodiments of the present application will be described with reference to fig. 4 to 9.
Fig. 4 is a schematic diagram of a method of game operation provided in an embodiment of the present application. As shown in fig. 4, the method of game operation includes S410 and S420. Next, S410 and S420 are described in detail.
S410, displaying a first game screen on a graphical user interface, where the first game screen includes: a first virtual character in a first state displayed in a first display mode, where the first state represents a state in which a preset virtual skill has not been released.
S420, in response to a trigger condition being satisfied, displaying a second game screen on the graphical user interface, where the second game screen includes: the first virtual character in a second state displayed in a second display mode, where the second state represents a state in which the preset virtual skill has been released. The first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size.
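The two-step display logic of S410 and S420 can be sketched in code. This is an illustrative sketch only, not part of the claimed embodiments; the identifiers (`CharacterState`, `display_size`) and the concrete scale values are assumptions.

```python
from enum import Enum

class CharacterState(Enum):
    SKILL_NOT_RELEASED = 1   # first state: preset virtual skill not released
    SKILL_RELEASED = 2       # second state: preset virtual skill released

FIRST_SIZE = 1.0   # assumed scale factor for the first display mode
SECOND_SIZE = 1.5  # assumed enlarged scale factor for the second display mode

def display_size(state: CharacterState) -> float:
    """Return the scale at which the character model is rendered."""
    if state is CharacterState.SKILL_RELEASED:
        return SECOND_SIZE   # second game screen: enlarged character model
    return FIRST_SIZE        # first game screen: normal-sized character model
```

When the character's state switches, the renderer would simply re-query `display_size` to obtain the new model scale.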
According to the game operation method, a graphical user interface is provided through a terminal device; the graphical user interface includes at least part of a game scene, and that part of the game scene includes at least one first virtual character. The content included in the partial game scene is not particularly limited. Optionally, in other implementations, the part of the game scene may further include content besides the at least one virtual character, which may be, but is not limited to, any of the following: other virtual characters, virtual props, virtual plants, or virtual buildings. It will be appreciated that the embodiments of the present application do not specifically limit the device type of the terminal device that provides the graphical user interface. For example, the terminal device may be, but is not limited to, any one of the following electronic devices: a smart phone, a tablet computer, or a palm computer. For example, in some implementations, the terminal device described in the embodiments of the present application may be the terminal device 101 shown in fig. 1, and the graphical user interface provided by the terminal device 101 may be the graphical user interface shown in fig. 2. As another example, in some implementations, the terminal device may be the terminal device 102 shown in fig. 1, and the graphical user interface provided by the terminal device 102 may be the graphical user interface shown in fig. 3.
In the embodiments of the present application, the device that controls the first virtual character is not particularly limited. In some implementations, the first virtual character is a character controlled by the terminal device. Alternatively, in other implementations, the first virtual character is a teammate of a second virtual character, where the second virtual character is a character controlled by the terminal device. In this implementation, when the trigger condition is satisfied, the first virtual character, displayed in the first display mode in the graphical user interface as a teammate of the second virtual character, is updated, and after updating, the first virtual character in the second state is displayed in the second display mode. Optionally, in still other implementations, the first virtual character is an enemy of a second virtual character, where the second virtual character is a character controlled by the terminal device. The first game screen and the second game screen may or may not display the second virtual character, which is not specifically limited in the embodiments of the present application. In this implementation, when the trigger condition is satisfied, the first virtual character, displayed in the first display mode in the graphical user interface as an enemy of the second virtual character, is updated, and after updating, the first virtual character in the second state is displayed in the second display mode. This implementation makes enemies easy to distinguish, which can reduce the misoperation rate.
Based on the above implementation, optionally, in some implementations, the part of the game scene further includes the second virtual character, and the game character configured for the first virtual character is the same as the game character configured for the second virtual character, where the first game screen further includes: the second virtual character in the first state displayed in the first display mode; and the second game screen further includes: the second virtual character in the first state displayed in the first display mode. In this implementation, the second virtual character is configured with the same game character as the first virtual character; when the trigger condition is satisfied, the first virtual character, displayed in the first display mode in the graphical user interface as an enemy of the second virtual character, is updated, and after updating, the first virtual character in the second state is displayed in the second display mode. This implementation makes it easy to distinguish enemies from teammates, which can reduce the misoperation rate.
In the embodiments of the present application, the above trigger condition is not particularly limited. In some implementations, satisfying the trigger condition includes: satisfying a first trigger condition, where the first trigger condition includes the first virtual object in the first state being switched to the first virtual object in the second state. It will be understood that in this implementation, S420 is performed after S410, and when the first trigger condition is satisfied, the terminal device automatically performs the operation of displaying the second game screen on the graphical user interface. That is, in this implementation, once the first virtual object in the first state has switched to the second state, the terminal device can display the second game screen on the graphical user interface without any additional operation being performed.
Optionally, in other implementations, satisfying the trigger condition further includes: satisfying a second trigger condition, where the second trigger condition includes: a preset trigger operation being detected. It will be appreciated that in such an implementation, both the first trigger condition and the second trigger condition need to be satisfied. The first time at which the first trigger condition is satisfied and the second time at which the second trigger condition is satisfied are not particularly limited. In some implementations, the first time precedes the second time. That is, in this implementation, performing S420 may specifically be: displaying the second game screen on the graphical user interface when the first trigger condition is satisfied at the first time and the second trigger condition is satisfied at the second time. Alternatively, in other implementations, the first time and the second time are the same time. That is, performing S420 may specifically be: displaying the second game screen on the graphical user interface when the first trigger condition and the second trigger condition are satisfied at the same time. In the embodiments of the present application, the preset trigger operation described in the foregoing implementation is not specifically limited and may be set according to the application scenario and actual requirements.
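The combination of the first and second trigger conditions described above can be sketched as a single predicate. This is an illustrative sketch; the function and parameter names are assumptions, and the optional second condition is modeled with a flag.

```python
def should_show_second_screen(state_switched: bool,
                              trigger_op_detected: bool,
                              require_operation: bool) -> bool:
    """Decide whether the second game screen should be displayed (S420)."""
    # First trigger condition: the first virtual object switched from the
    # first state to the second state.
    if not state_switched:
        return False
    # Second trigger condition (only in implementations that require it):
    # a preset trigger operation was also detected, at the same moment as,
    # or after, the state switch.
    if require_operation and not trigger_op_detected:
        return False
    return True
```

With `require_operation=False` this models the fully automatic variant; with `require_operation=True` it models the variant that also waits for the preset trigger operation.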
Next, two different implementations of the preset trigger operation provided in the embodiments of the present application are described: implementation one and implementation two.

Implementation one:
In implementation one, the preset trigger operation is a first gesture operation or a first key operation. In this implementation, S420 may specifically be: displaying the second game screen on the graphical user interface when the first trigger condition is satisfied and the first gesture operation or the first key operation is detected. The first gesture operation is not particularly limited and may be set according to the actual scene and requirements. For example, the first gesture operation may be, but is not limited to, a zoom-in gesture operation. As another example, the first gesture operation may be, but is not limited to, one or more click operations. The first key operation is likewise not particularly limited and may be set according to the actual application. For example, the first key operation may be, but is not limited to, clicking the middle mouse button and scrolling up. As another example, the first key operation may be, but is not limited to, double-clicking the right mouse button. As a further example, the first key operation may be a hardware key-combination operation; for example, when the terminal device is a mobile phone, the key combination may be, but is not limited to, a power key plus a volume key, and when the terminal device is a desktop computer, the key combination may be, but is not limited to, a Control key plus a letter key on the keyboard, or a keyboard shortcut.
Optionally, in some implementations, after the second game screen is displayed on the graphical user interface in implementation one, the method further includes: when the preset trigger operation is the first key operation, in response to switching from detecting the first key operation to no longer detecting it, displaying the first game screen on the graphical user interface; or, when the preset trigger operation is the first gesture operation, in response to switching from detecting the first gesture operation to detecting a second gesture operation, displaying the first game screen on the graphical user interface, where the second gesture operation is different from the first gesture operation. The two gesture operations are not particularly limited. For example, in some implementations, the first gesture operation is a zoom-in gesture operation and the second gesture operation is a zoom-out gesture operation. As another example, the first gesture operation is a right-slide gesture operation and the second gesture operation is a left-slide gesture operation.
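The press/release and gesture-pair behavior of implementation one can be sketched as a small state transition over input events. The event names below are assumptions introduced for illustration only.

```python
def screen_for_event(current_screen: str, event: str) -> str:
    """Return which game screen to display after an input event.

    Screens: "first" (normal display) and "second" (enlarged display).
    Events: "first_key_down"/"first_key_up" model pressing and releasing the
    first key operation; "first_gesture"/"second_gesture" model the paired
    gesture operations (e.g. zoom-in vs. zoom-out).
    """
    if current_screen == "first":
        if event in ("first_key_down", "first_gesture"):
            return "second"   # trigger detected: switch to the second screen
    elif current_screen == "second":
        if event in ("first_key_up", "second_gesture"):
            return "first"    # trigger ended or reversed: back to the first screen
    return current_screen     # all other events leave the screen unchanged
```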
Next, the other implementation of the preset trigger operation, implementation two, is described.

Implementation two:
In implementation two, the preset trigger operation is a first sliding operation, and displaying the second game screen on the graphical user interface in response to the trigger condition being satisfied includes: in response to the first trigger condition being satisfied, displaying a third game screen on the graphical user interface, where the third game screen includes the first virtual object in the second state and a first control; and, in response to a first sliding operation on the first control, displaying the second game screen on the graphical user interface, where the second game screen further includes the first control, and the first sliding operation is a sliding operation on the first control in a first direction. Next, taking the first game screen shown in (1) in fig. 5B as an example, the third game screen and the second game screen described in implementation two are illustrated: (2) in fig. 5B shows the third game screen; (3) in fig. 5B shows the user performing the first sliding operation on the first control; and (4) in fig. 5B shows the second game screen.
Optionally, after the second game screen is displayed on the graphical user interface in implementation two, the method further includes: in response to a second sliding operation on the first control, displaying the first game screen on the graphical user interface, where the second sliding operation is a sliding operation on the first control in a second direction, the second direction and the first direction being two different directions. In the embodiments of the present application, the first direction and the second direction are not particularly limited. For example, they may be opposite horizontal directions. As another example, they may be two mutually perpendicular directions. As a further example, the angle between the first direction and the second direction may be greater than zero and less than 90 degrees, or greater than 90 and less than 180 degrees. Illustratively, continuing with the example of fig. 5B above, "displaying the first game screen on the graphical user interface in response to the second sliding operation on the first control" may refer to the graphical user interfaces shown in (5) and (6) in fig. 5B.
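One plausible way to distinguish the first and second sliding operations is to classify the drag vector's angle against two reference directions. The sketch below assumes opposite horizontal directions and a ±45-degree tolerance; both choices are illustrative assumptions, not part of the embodiments.

```python
import math

def classify_slide(dx: float, dy: float) -> str:
    """Classify a drag vector (dx, dy) on the first control.

    Returns "first" for a slide toward the first direction (rightward here),
    "second" for a slide toward the second direction (leftward here), and
    "ignored" for anything outside both tolerance cones.
    """
    angle = math.degrees(math.atan2(dy, dx)) % 360  # 0..360 degrees
    if angle <= 45 or angle >= 315:
        return "first"    # within 45 degrees of the first direction
    if 135 <= angle <= 225:
        return "second"   # within 45 degrees of the opposite direction
    return "ignored"
```

For perpendicular or oblique direction pairs, only the two reference angles and the tolerance would change.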
In the embodiments of the present application, the preset virtual skill is not specifically limited and may be set according to actual requirements. In some implementations, the first virtual character in the game has one virtual skill, and in such implementations the preset virtual skill is that one virtual skill. Optionally, in other implementations, the first virtual character in the game has a plurality of virtual skills corresponding to a plurality of attack intensities, where it is understood that the stronger the attack intensity of a virtual skill, the greater the injury to the enemy. In this implementation, the preset virtual skill is the virtual skill of the first virtual character with the highest attack intensity among the plurality of virtual skills; this highest-intensity virtual skill is also referred to as the first virtual character's ultimate skill. That is, the first virtual character releasing the virtual skill with the highest attack intensity may also be referred to as the first virtual character casting its ultimate. Any virtual skill possessed by the first virtual character is not particularly limited. For example, one virtual skill may freeze enemies. As another example, one virtual skill may burn enemies. As a further example, one virtual skill may cause the character's body to grow larger instantly while a gust of wind instantly pushes away enemies on both sides of a passage.
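Selecting the preset virtual skill when a character has several skills (i.e., picking the skill with the highest attack intensity, the ultimate) can be sketched as follows; the field names are assumptions.

```python
def preset_skill(skills: list[dict]) -> dict:
    """Pick the preset virtual skill from a character's skill list.

    With a single skill, that skill is the preset virtual skill; with
    several, the one with the highest attack intensity (the ultimate) is.
    """
    if len(skills) == 1:
        return skills[0]
    return max(skills, key=lambda s: s["attack_intensity"])
```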
In the embodiments of the present application, the first state represents a state in which the preset virtual skill has not been released, and the second state represents a state in which it has been released. Whether the first virtual character in the first state releases virtual skills other than the preset virtual skill is not particularly limited. That is, although the first virtual character in the first state has not released the preset virtual skill, it may release other virtual skills. Likewise, whether the first virtual character in the second state releases virtual skills other than the preset virtual skill is not particularly limited; the first virtual character in the second state may also release other virtual skills.
In the embodiments of the present application, the first display mode and the second display mode are different. Specifically, the first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; and the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, where the second size is larger than the first size. Compared with the first display mode, the second display mode is a highlighted mode. In the above implementation, by displaying the first virtual character that has released the preset virtual skill at an enlarged size (i.e., the second size), that character can be easily recognized, and the game misoperation rate can be reduced. The first game screen and the second game screen provided in the embodiments of the present application are illustrated below. Illustratively, (1) in fig. 5A shows a first game screen in which the character model of the first virtual character in the first state is displayed at the first size; (2) in fig. 5A shows a second game screen in which the character model of the first virtual character in the second state is displayed at the second size. It will be appreciated that in (1) in fig. 5A, only the first virtual character is displayed in the game screen. By way of further example, (3) in fig. 5A shows a first game screen including a first virtual character and a second virtual character, where the character model of the first virtual character in the first state is displayed at the first size and the character model of the second virtual character, which has not released the preset virtual skill, is also displayed at the first size; (4) in fig. 5A shows a second game screen in which the character model of the first virtual character in the second state is displayed at the second size while the character model of the second virtual character, which has not released the preset virtual skill, remains displayed at the first size. It will be appreciated that (3) and (4) in fig. 5A show a plurality of virtual characters displayed in the game screen, including the first virtual character and the second virtual character.
It should be understood that the game operation method shown in fig. 4 is merely illustrative and does not limit the game operation methods provided in the embodiments of the present application. For example, the first game screen may also display more content, which may be, but is not limited to, other virtual characters or virtual props. As another example, the graphical user interface may display a greater number of game screens at the same time, which may include, but are not limited to, the first game screen and the second game screen. As a further example, in the above implementation, the first virtual character in the first state displayed in the first display mode may be replaced with: a character model of the first virtual character in the first state displayed at a first saturation; and the first virtual character in the second state displayed in the second display mode may be replaced with: a character model of the first virtual character in the second state displayed at a second saturation, where the second saturation is higher than the first saturation; in this implementation, the second saturation is a highlighted mode compared to the first saturation.
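The saturation-based alternative mentioned above can be sketched with an HSV round-trip using Python's standard colorsys module; the saturation boost factor is an assumption.

```python
import colorsys

def highlight_color(rgb: tuple, released: bool) -> tuple:
    """Return the model's color, boosting saturation in the released state.

    rgb is an (r, g, b) tuple with components in [0, 1]. When the preset
    virtual skill has been released, saturation is raised by an assumed
    factor of 1.5 (clamped to 1.0); otherwise the color is unchanged.
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    s2 = min(1.0, s * 1.5) if released else s  # second saturation > first
    return colorsys.hsv_to_rgb(h, s2, v)
```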
In the embodiments of the present application, a first game screen is displayed on a graphical user interface, where the first game screen includes: a first virtual character in a first state displayed in a first display mode; and, when a preset trigger condition is satisfied, a second game screen is displayed on the graphical user interface, where the second game screen includes: the first virtual character in a second state displayed in a second display mode; the first state represents a state in which the preset virtual skill has not been released, and the second state represents a state in which it has been released. That is, in the above implementation, when the state of the first virtual character changes, the display mode used for the first virtual character is updated in the graphical user interface (i.e., the first display mode is updated to the second display mode). Since the character model displayed in the second display mode is larger than the character model displayed in the first display mode, the second display mode is a highlighted mode compared with the first display mode; the first virtual character whose state has changed can therefore be easily recognized and distinguished, and the misoperation rate can be reduced.
Next, another game operation method provided in the embodiments of the present application is described with reference to fig. 6. It will be appreciated that the game operation method described in fig. 6 is a specific example of the game operation method described in fig. 4 above. Specifically, the terminal device 102 depicted in fig. 6 is a specific example of the terminal device described in fig. 4; the graphical user interface 1 depicted in fig. 6 is a specific example of the graphical user interface described in fig. 4; and the ultimate skill described in fig. 6 is a specific example of the preset virtual skill described in fig. 4.
FIG. 6 is a schematic diagram of another game operation method provided in an embodiment of the present application. It should be understood that the example of fig. 6 is merely intended to help those skilled in the art understand the embodiments, and is not intended to limit them to the specific values or scenarios illustrated. Various equivalent modifications and variations will be apparent to those skilled in the art from the example of fig. 6 given below, and such modifications and variations are intended to fall within the scope of the embodiments of the present application. By way of example, the execution subject of the game operation method in fig. 6 is the terminal device 102 shown in fig. 1 above. As shown in fig. 6, the game operation method includes S610 to S650, which are described in detail next.
S610, displaying a graphical user interface 1 on the terminal device 102, where the graphical user interface 1 displays a game screen of a multi-player battle including a virtual character A, a teammate A of the virtual character A, and an enemy A that has not released a skill, displayed at size 1.
The virtual character A is a game character controlled by the terminal device 102. The type of the terminal device is not particularly limited and can be selected according to actual requirements. By way of example, the terminal device 102 may be, but is not limited to, the terminal device 102 in the application scenario illustrated in fig. 1 above. For example, the terminal device 102 may be a personal computer.
The graphical user interface 1 displays a game screen of a multi-player battle including the virtual character A, the teammate A of the virtual character A, and the enemy A, which has not released a skill, displayed at size 1. Whether the multi-player battle game screen includes other virtual objects is not particularly limited. For example, the game screen may include a virtual plant. For example, referring to (1) in fig. 7, the game screen displayed by the graphical user interface 1 includes: a virtual plant, the virtual character A, the enemy A displayed at size 1, and the teammate A of the virtual character A.
Any one of the virtual character A, the enemy A, and the teammate A included in the game screen displayed by the graphical user interface 1 has at least one virtual skill. A virtual character in the game screen can fight other virtual characters using its particular virtual skills. The virtual skills possessed by any one virtual character are not particularly limited. For example, a virtual character may have a virtual skill that freezes enemies. As another example, a virtual character may have a virtual skill by which its body instantly grows larger and a gust of wind instantly pushes away enemies on both sides of a passage. It will be appreciated that when a virtual character has a plurality of virtual skills, the one with the greatest killing power is also referred to as that character's ultimate skill. That is, the virtual character releasing the virtual skill with the greatest killing power among the plurality of virtual skills is also referred to as that character casting its ultimate.
The game characters selected for the virtual character A, the teammate A of the virtual character A, and the enemy A are not particularly limited and can be configured according to actual needs and scenes. For example, virtual character A is game character 1, teammate A is game character 2, and enemy A is game character 3. In some game battle scenarios, the game character configured for a virtual character in the player's team may also be the same as the game character configured for a virtual character in the opposing team. For example, virtual character A is game character 1, teammate A is game character 2, and enemy A is also game character 2.
S620, in response to the enemy A in the multi-player battle game screen displayed by the graphical user interface 1 being in the state of releasing its ultimate skill, determining whether the distance 1 between the enemy A and the teammate A is smaller than a preset threshold.
The purpose of S620 is to determine whether there is a safe distance between the teammate A and the enemy A that is releasing its ultimate skill. When the distance between the teammate A and the enemy A is a safe distance, the ultimate released by the enemy A cannot injure the teammate A. It will be understood that before S620 is performed, the enemy A is in a state of not having released the virtual skill.
The value of the preset threshold may be set according to an actual application scenario, which is not specifically limited in the embodiment of the present application. For example, the preset threshold may be, but is not limited to, 5 meters or 10 meters.
Determining whether the distance 1 between the enemy A and the teammate A is smaller than the preset threshold, in response to the enemy A in the game screen displayed by the graphical user interface 1 being in the state of releasing its ultimate skill, includes: if the distance 1 is determined to be less than or equal to the preset threshold, performing S630 after S620; or, if the distance 1 is determined to be greater than the preset threshold, performing S650 after S620.
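The S620 branch (distance check against the preset threshold, then S630 or S650) can be sketched as follows; the threshold value and the position representation are assumptions.

```python
import math

PRESET_THRESHOLD = 5.0  # assumed safe-distance threshold, e.g. 5 in-game meters

def next_step(enemy_pos: tuple, teammate_pos: tuple) -> str:
    """Decide the step after S620 once enemy A is releasing its ultimate.

    Returns "S630" (enlarge enemy A's model) when the distance is within the
    threshold, and "S650" otherwise.
    """
    distance = math.dist(enemy_pos, teammate_pos)  # Euclidean distance
    return "S630" if distance <= PRESET_THRESHOLD else "S650"
```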
Next, a specific example of the graphical user interface 1 described in S620 is given. Continuing with the example in which the graphical user interface 1 described in S610 is the graphical user interface shown in (1) in fig. 7, (2) in fig. 7 shows a specific example of the graphical user interface 1 described in S620. Referring to (2) in fig. 7, the game screen displayed by the graphical user interface 1 includes: a virtual plant, the virtual character A, the teammate A of the virtual character A, and the enemy A in the state of releasing its ultimate skill, where the distance between the enemy A and the teammate A is distance 1.
S630, in response to a trigger operation A on the mouse of the terminal device 102, updating the enemy A displayed at size 1 in the graphical user interface 1 to the enemy A displayed at size 2, where size 2 is larger than size 1.
After S620 is performed, S630 is performed, namely: when the distance 1 between the enemy A and the teammate A of the virtual character A is smaller than the preset threshold and the enemy A is in the state of releasing its ultimate skill, performing the trigger operation A causes the enemy A to be displayed at an enlarged size (i.e., size 2) in the graphical user interface 1. In this implementation, the enemy A that is releasing its ultimate is highlighted, so that the player controlling the virtual character A can quickly distinguish enemies from teammates, which in turn can reduce the misoperation rate.
Next, a specific implementation of the graphical user interface 1 described in S630 above is described by way of example. Continuing with the example in which the graphical user interface 1 described in S610 above is the graphical user interface shown in (1) of fig. 7 and the graphical user interface 1 described in S620 above is the graphical user interface shown in (2) of fig. 7, a specific example of the graphical user interface 1 described in S630 above is shown in (3) of fig. 7. Referring to (3) in fig. 7, the game screen displayed by the graphical user interface 1 includes: the virtual plant, the virtual character A, the teammate A of the virtual character A, and the enemy A in the state of releasing the ultimate skill displayed at the size 2, wherein the distance between the enemy A and the teammate A is the distance 1.
S640, in response to the ending of the trigger operation A for the mouse of the terminal device 101, the enemy A displayed at the size 2 in the graphical user interface 1 is updated to the enemy A displayed at the size 1.
Performing S640 above updates the enemy A displayed at the enlarged size 2 on the graphical user interface 1 back to the enemy A displayed at the original size 1.
In the embodiment of the present application, the trigger operation A is not particularly limited. For example, the trigger operation A may be: pressing the middle mouse button and scrolling upward. As another example, the trigger operation A may be: double-clicking the right mouse button. As yet another example, the trigger operation A may be: pressing the middle mouse button and scrolling downward.
Alternatively, before S640 above is performed, an operation of ending the trigger operation A for the mouse of the terminal device 101 may also be performed. Here, the timing of ending the trigger operation A for the mouse of the terminal device 101 is not particularly limited. For example, after the enemy A has been in the state of releasing the ultimate skill and displayed at the size 2 for a period of time, the trigger operation A for the mouse of the terminal device 101 may be ended. As another example, in the case where the enemy A switches from the state of releasing the ultimate skill to the state of not releasing any virtual skill, the trigger operation A for the mouse of the terminal device 101 may be ended.
S650, in response to no teammate of the virtual character A existing within the preset range of the enemy A in the state of releasing the ultimate skill, displaying the enemy A at the size 1 in the graphical user interface 1, wherein the teammates of the virtual character A include the teammate A.
After S620 is performed, S650 is performed, namely: in the case where the enemy A displayed in the graphical user interface 1 is in the state of releasing the ultimate skill, if the distance 1 between the enemy A and the teammate A of the virtual character A is greater than the preset threshold, it is not necessary to display the enemy A at the size 2 in the graphical user interface 1.
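The decision flow of S610 to S650 can be summarized in a short sketch. The following is a hedged illustration only: the function, the threshold value, and the size constants are hypothetical names chosen for this example, not identifiers from this application. The display size of the enemy A is chosen from the state of its ultimate skill and its distance to the teammate A:

```python
import math

# Illustrative constants (assumed values, not taken from this application).
PRESET_THRESHOLD = 5.0     # preset distance threshold, in game units
SIZE_1, SIZE_2 = 1.0, 1.5  # original size 1 and enlarged size 2

def distance(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def display_size(releasing_ultimate, enemy_pos, teammate_pos):
    """Return the size at which enemy A should be rendered (S630 / S650)."""
    if releasing_ultimate and distance(enemy_pos, teammate_pos) <= PRESET_THRESHOLD:
        return SIZE_2  # S630: highlight the enemy at the enlarged size 2
    return SIZE_1      # S650: keep the original size 1

print(display_size(True, (0, 0), (3, 4)))   # distance 5.0 <= threshold -> 1.5
print(display_size(True, (0, 0), (6, 8)))   # distance 10.0 > threshold -> 1.0
print(display_size(False, (0, 0), (1, 1)))  # ultimate not released -> 1.0
```

Ending the trigger operation A (S640) would simply restore `SIZE_1`, which the last branch already models.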
It should be understood that the method of game operation shown in fig. 6 is merely illustrative, and does not constitute any limitation on the method of game operation provided in the embodiments of the present application. It will be appreciated that the method described in fig. 6 above displays the enemy A at the enlarged size 2 in the case where the enemy A is releasing the ultimate skill and the distance between the enemy A and the teammate A is less than or equal to the preset threshold. For example, in other implementations, the enemy A may be displayed at the size 2 while releasing the ultimate skill only when the game character configured by the enemy A and the game character configured by the teammate A of the virtual character A are the same game character. As another example, in other implementations, S620 may not be performed after S610 described above is performed, i.e., S630 may be performed directly after S610; in such implementations, the operation of S630 is performed as long as the enemy A is in the state of releasing the ultimate skill. As yet another example, in other implementations, S640 may not be performed after S630 is performed.
In the embodiment of the present application, when the enemy A displayed on the graphical user interface 1 provided by the terminal device 101 is in the state of releasing the ultimate skill and the distance between the enemy A and the teammate A of the virtual character A is less than or equal to the preset threshold, the enemy A can be displayed on the graphical user interface 1 at the enlarged size 2, so that the enemy A can be easily identified and the misoperation rate can be reduced. Specifically, in the case where the game characters configured by the enemy A and the teammate A are the same game character, it is often unclear in a multi-player melee whether it is a teammate or an enemy that is releasing the ultimate skill; by displaying the enemy in enlarged form, the graphical user interface 1 makes the enemy easy to distinguish from teammates, so that the error rate can be reduced.
Next, another method of game operation provided in an embodiment of the present application will be described with reference to fig. 8. It will be appreciated that the method of game operation described in fig. 8 is one specific example of the method of game operation described in fig. 4 above.
Fig. 8 is a schematic diagram of another method of game operation provided in an embodiment of the present application. It should be appreciated that the example of fig. 8 is merely intended to help those skilled in the art understand the present embodiments, and is not intended to limit the present embodiments to the specific values or particular scenarios illustrated. Various equivalent modifications and variations will be apparent to those skilled in the art from the example of fig. 8 given below, and such modifications and variations are intended to fall within the scope of the embodiments of the present application. By way of example, the description of fig. 8 below takes the terminal device 101 shown in fig. 1 above as the execution subject of the method of game operation. As shown in fig. 8, the method of game operation includes S810 to S830. Next, S810 to S830 are described in detail.
S810, a graphical user interface #1 is displayed at the terminal device 101, wherein the graphical user interface #1 displays the virtual plant, the virtual object 1, the teammate 1 of the virtual object 1, and the enemy 1 in the state of not releasing the ultimate skill, displayed at the size #1.
The virtual object 1 is a game character controlled by the terminal device 101. The terminal device 101 supports a touch function, that is, a user may perform a preset gesture operation on the game screen displayed on the graphical user interface #1 provided by the terminal device 101, so that the execution result corresponding to the preset gesture is displayed on the graphical user interface #1. The type of the terminal device 101 is not particularly limited. For example, the terminal device 101 may be, but is not limited to, a smart phone or a tablet computer supporting the touch function.
The game character configured by the teammate 1 of the virtual object 1 and the game character configured by the enemy 1 are the same game character. The same game character is not particularly limited and may be set according to actual application scenarios and demands. Illustratively, the same game character may be, but is not limited to: a rabbit or a monkey.
The enemy 1 may have a plurality of different virtual skills, among which the ultimate skill of the enemy 1 is the virtual skill with the greatest power against other virtual characters. The specific virtual skill corresponding to the ultimate skill of the enemy 1 is not limited. For example, in some implementations, the enemy 1 has a virtual skill 1 and a virtual skill 2, where the virtual skill 1 freezes other virtual characters for 2 seconds and the virtual skill 2 freezes other virtual characters for 10 seconds; in this case, the virtual skill 2 may serve as the ultimate skill of the enemy 1.
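For illustration, the selection of the ultimate skill in the example above can be sketched as follows. The skill names and the use of freeze duration as the measure of power are assumptions drawn from this example only:

```python
# Hypothetical sketch: pick the most powerful of enemy 1's virtual skills as
# its ultimate skill, using the freeze duration from the example as the measure.
skills = {"virtual_skill_1": 2, "virtual_skill_2": 10}  # freeze duration in seconds

# The skill with the greatest effect against other virtual characters serves
# as the ultimate skill.
ultimate = max(skills, key=skills.get)
print(ultimate)  # -> virtual_skill_2
```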
Next, the graphical user interface #1 described in S810 above is described by way of example. By way of example, the graphical user interface #1 shown in (1) of fig. 9 displays: the virtual plant, the virtual object 1, the teammate 1 (i.e., the teammate of the virtual object 1), and the enemy 1 in the state of not releasing the ultimate skill, displayed at the size #1.
S820, in response to the enemy 1 switching from the state of not releasing the ultimate skill to the state of releasing the ultimate skill, and in response to a zoom-in gesture operation performed on the graphical user interface #1 provided by the terminal device 101, the graphical user interface #1 displays the enemy 1 in the state of releasing the ultimate skill at the size #2, wherein the size #2 is larger than the size #1.
S820 above is performed, that is, in the case where the enemy 1 is in the state of releasing the ultimate skill, the user interacts with the graphical user interface #1 provided by the terminal device 101 so that the enemy 1 releasing the ultimate skill is displayed at the enlarged size #2. The enemy 1 releasing the ultimate skill is thus easily recognized, and the error rate can be reduced.
Next, the gesture operation described in S820 above is described by way of example. By way of example, the graphical user interface #1 shown in (2) of fig. 9 displays: the virtual plant, the virtual object 1, the teammate 1 (i.e., the teammate of the virtual object 1), and the enemy 1 in the state of not releasing the ultimate skill, displayed at the size #1, where the user's finger performs the zoom-in gesture operation on the graphical user interface #1. The graphical user interface #1 described in S820 above is then shown by way of example in (3) of fig. 9, which displays: the virtual plant, the virtual object 1, the teammate 1, and the enemy 1 in the state of releasing the ultimate skill, displayed at the size #2.
Alternatively, the zoom-in gesture operation described in S820 above may be replaced with another gesture operation, where the other gesture operation controls the graphical user interface #1 to display the enemy 1 in the state of releasing the ultimate skill at the size #2. The other gesture operation is not particularly limited and may be set according to actual application scenarios and demands. For example, the other gesture operation may be, but is not limited to: double-tapping the enemy 1 with a finger.
S830, in response to a trigger operation on a control button displayed by the graphical user interface #1, the graphical user interface #1 displays the enemy 1 in the state of not releasing the ultimate skill at the size #1.
By performing S830 above, in the case where the enemy 1 is no longer releasing the ultimate skill, the enemy 1 can be displayed in the graphical user interface #1 at the original size #1, so as to avoid misleading the player and degrading the game experience.
Next, the graphical user interface #1 described in S830 above is described by way of example. By way of example, the graphical user interface #1 shown in (3) of fig. 9 shows the user's finger performing the trigger operation on the control button displayed by the graphical user interface #1. Thereafter, the content displayed by the graphical user interface #1 is shown in (4) of fig. 9, i.e., the enemy 1 in the state of not releasing the ultimate skill is displayed at the size #1.
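The touch interaction of S820 and S830 can be sketched as a simple input-to-size mapping. All names below (the event labels, the boolean state, and the size constants) are illustrative assumptions rather than identifiers from this application:

```python
# Hedged sketch of the interaction in fig. 8: a zoom-in gesture enlarges an
# enemy that is releasing its ultimate skill (S820), and the control button
# restores the original size once the skill is no longer released (S830).
SIZE_1, SIZE_2 = 1.0, 1.5  # original size #1 and enlarged size #2 (assumed)

def on_input(event, releasing_ultimate, current_size):
    """Return the enemy's new display size for a recognized input event."""
    if event == "zoom_in_gesture" and releasing_ultimate:
        return SIZE_2        # S820: show the enemy at the enlarged size #2
    if event == "control_button" and not releasing_ultimate:
        return SIZE_1        # S830: restore the original size #1
    return current_size      # other inputs leave the display unchanged

print(on_input("zoom_in_gesture", True, SIZE_1))  # -> 1.5
print(on_input("control_button", False, SIZE_2))  # -> 1.0
print(on_input("double_tap", False, SIZE_1))      # -> 1.0 (unchanged)
```

The alternative double-tap gesture mentioned above would simply be another event label mapped to the same `SIZE_2` branch.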
It should be understood that the method of game operation described in fig. 8 is merely illustrative, and does not constitute any limitation on the method of game operation provided in the embodiments of the present application. For example, in some implementations, only S810 and S820 described above may be performed. As another example, in some implementations, a description indicating that the enemy 1 is releasing the ultimate skill may instead be presented in a pop-up text box in the graphical user interface #1; that is, in such implementations, the content presented in the pop-up text box replaces the above-described scheme of displaying the enemy 1 at the size #2.
In the embodiment of the present application, in the case where the enemy 1 displayed on the graphical user interface #1 provided by the terminal device 101 is releasing the ultimate skill, the user interacts with the graphical user interface #1 so that the graphical user interface #1 displays the enemy 1 at the enlarged size #2. In this way, even in the case where the game character configured by the teammate of the virtual object 1 and the game character configured by the enemy 1 are the same game character, it is easy to distinguish friend from foe, and the error rate can be reduced.
In the above, the application scenarios to which the method of game operation provided in the present application is applicable, and the method of game operation itself, are described in detail with reference to fig. 1 to 9. Next, an apparatus and an electronic device for game operation provided in the present application are described with reference to fig. 10 and 11. It should be appreciated that the above method of game operation corresponds to the apparatus and electronic device of game operation below; details not described below can be found in the relevant description of the method embodiments above.
Fig. 10 is a schematic structural diagram of a game operation apparatus according to an embodiment of the present application. Referring to the game operation apparatus shown in fig. 10, a graphical user interface is provided through a terminal device, the graphical user interface including at least a part of a game scene, and the part of the game scene including at least one first virtual character; the apparatus includes a display unit 1001. Next, the functions of the display unit 1001 are described in detail.
The display unit 1001 is configured to: displaying a first game screen on the graphical user interface, wherein the first game screen comprises: the first virtual character in a first state is displayed in a first display mode, wherein the first state represents a state in which a preset virtual skill is not released; the display unit 1001 is further configured to: in response to the trigger condition being met, displaying a second game screen on the graphical user interface, wherein the second game screen comprises: the first virtual character in a second state displayed in a second display mode, wherein the second state represents a state of releasing the preset virtual skills; the first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size.
Optionally, in some implementations, the meeting the trigger condition includes: a first trigger condition is satisfied, wherein the first trigger condition includes the first virtual object in the first state switching to the first virtual object in the second state.
Optionally, in other implementations, the meeting the triggering condition further includes: a second trigger condition is satisfied, wherein the second trigger condition includes: a preset trigger operation is detected.
Optionally, in other implementations, the preset trigger operation is a first gesture operation or a first key operation.
Optionally, in other implementations, after the graphical user interface displays the second game screen, the method further includes: in response to switching from detecting the first key operation to not detecting the first key operation under the condition that the preset trigger operation is the first key operation, displaying the first game screen on the graphical user interface; or when the preset trigger operation is the first gesture operation, the first game screen is displayed on the graphical user interface in response to switching from the detection of the first gesture operation to the detection of a second gesture operation, wherein the second gesture operation is different from the first gesture operation.
Optionally, in other implementations, the preset trigger operation is a first sliding operation, and the displaying, in response to the trigger condition being met, the second game screen on the graphical user interface includes: in response to the first trigger condition being met, displaying a third game screen on the graphical user interface, wherein the third game screen comprises: the first virtual object in the second state and a first control; and displaying the second game screen on the graphical user interface in response to the first sliding operation for the first control, wherein the second game screen further comprises the first control, and the first sliding operation is a sliding operation for a first direction of the first control.
Optionally, in other implementations, after the graphical user interface displays the second game screen, the method further includes: and displaying the first game screen on the graphical user interface in response to a second sliding operation for the first control, wherein the second sliding operation is a sliding operation for a second direction of the first control, and the second direction and the first direction are two different directions.
Optionally, in other implementations, the first virtual role is a role controlled by the terminal device.
Optionally, in other implementations, the first virtual character is a teammate of a second virtual character, where the second virtual character is a character controlled by the terminal device.
Optionally, in other implementations, the first virtual character is an enemy of a second virtual character, where the second virtual character is a character controlled by the terminal device.
Optionally, in other implementations, the portion of the game scene further includes the second avatar, the game character configured by the first avatar is the same as the game character configured by the second avatar, wherein the first game screen further includes: the second virtual character in the first state displayed in the first display mode; and, the second game screen further includes: and displaying the second virtual character in the first state in the first display mode.
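The sliding interaction described in the optional implementations above can be sketched as follows. The direction labels and size values are assumptions chosen for illustration, not identifiers from this application:

```python
# Hypothetical sketch of the first-control sliding logic: sliding in the first
# direction switches to the second game screen (character model at the second,
# larger size), and sliding in the second, different direction switches back
# to the first game screen (character model at the first size).
FIRST_SIZE, SECOND_SIZE = 1.0, 1.5  # assumed first and second display sizes

def on_slide(direction, current_size):
    """Map a sliding direction on the first control to a display size."""
    if direction == "first":
        return SECOND_SIZE   # first sliding operation -> second game screen
    if direction == "second":
        return FIRST_SIZE    # second sliding operation -> first game screen
    return current_size      # any other input leaves the screen unchanged

print(on_slide("first", FIRST_SIZE))    # -> 1.5
print(on_slide("second", SECOND_SIZE))  # -> 1.0
```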
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 11, the electronic device includes a memory 1101, a processor 1102, a communication interface 1103 and a communication bus 1104. The memory 1101, the processor 1102, and the communication interface 1103 are communicatively connected to one another through the communication bus 1104.
The memory 1101 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM). The memory 1101 may store programs; when the programs stored in the memory 1101 are executed by the processor 1102, the processor 1102 and the communication interface 1103 are used to perform the steps of the method of game operation of the embodiments of the present application.
The processor 1102 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), a graphics processing unit (GPU), or one or more integrated circuits for executing associated programs, so as to perform the functions required by the units of the game operation apparatus of the embodiments of the present application or to perform the steps of the method of game operation of the embodiments of the present application.
The processor 1102 may also be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the method of game operation provided herein may be performed by integrated logic circuitry in hardware or by instructions in the form of software in the processor 1102. The processor 1102 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 1101; the processor 1102 reads the information in the memory 1101 and, in combination with its hardware, performs the functions required by the units included in the game operation apparatus of the method embodiments of the present application, or performs the method of game operation of the method embodiments of the present application.
The communication interface 1103 enables communication between the device shown in fig. 11 and other devices or communication networks using a transceiver means such as, but not limited to, a transceiver.
A communication bus 1104 may include a path to transfer information between the various components of the device shown in fig. 11 (e.g., memory 1101, processor 1102, communication interface 1103).
The embodiment of the application provides a computer readable storage medium, which comprises computer instructions, and the computer instructions are used for realizing the technical scheme of any game operation method in the embodiment of the application when being executed by a processor.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored on a computer readable medium, including several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include a volatile memory in a computer-readable medium, such as a random access memory (RAM), and/or a non-volatile memory, such as a read-only memory (ROM) or a flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage media, or any other non-transmission media that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
While the preferred embodiment has been described, it is not intended to limit the invention thereto, and any person skilled in the art may make variations and modifications without departing from the spirit and scope of the present invention, so that the scope of the present invention shall be defined by the claims of the present application.

Claims (14)

1. A method of game play, characterized in that a graphical user interface is provided by a terminal device, said graphical user interface comprising at least a part of a game scene, said part of a game scene comprising at least one first virtual character, said method comprising:
displaying a first game screen on the graphical user interface, wherein the first game screen comprises: the first virtual character in a first state is displayed in a first display mode, wherein the first state represents a state in which a preset virtual skill is not released;
In response to the trigger condition being met, displaying a second game screen on the graphical user interface, wherein the second game screen comprises: the first virtual character in a second state displayed in a second display mode, wherein the second state represents a state of releasing the preset virtual skills;
the first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size.
2. The method according to claim 1, wherein
the meeting the triggering condition comprises: a first trigger condition is satisfied, wherein the first trigger condition includes the first virtual object in the first state switching to the first virtual object in the second state.
3. The method according to claim 2, wherein
the meeting the triggering condition further includes: a second trigger condition is satisfied, wherein the second trigger condition includes: a preset trigger operation is detected.
4. A method according to claim 3, wherein the preset trigger operation is a first gesture operation or a first key operation.
5. The method of claim 4, wherein after the graphical user interface displays a second game screen, the method further comprises:
in response to switching from detecting the first key operation to not detecting the first key operation under the condition that the preset trigger operation is the first key operation, displaying the first game screen on the graphical user interface; or,
and when the preset trigger operation is the first gesture operation, responding to the detection of the first gesture operation to the detection of a second gesture operation, and displaying the first game picture on the graphical user interface, wherein the second gesture operation is different from the first gesture operation.
6. A method according to claim 3, wherein the preset trigger operation is a first sliding operation, and the displaying the second game screen on the graphical user interface in response to the trigger condition being satisfied comprises:
in response to the first trigger condition being met, displaying a third game screen on the graphical user interface, wherein the third game screen comprises: the first virtual object in the second state and a first control;
And displaying the second game screen on the graphical user interface in response to the first sliding operation for the first control, wherein the second game screen further comprises the first control, and the first sliding operation is a sliding operation for a first direction of the first control.
7. The method of claim 6, wherein after the graphical user interface displays a second game screen, the method further comprises:
and displaying the first game screen on the graphical user interface in response to a second sliding operation for the first control, wherein the second sliding operation is a sliding operation for a second direction of the first control, and the second direction and the first direction are two different directions.
8. A method according to any one of claims 1 to 7, characterized in that the first virtual character is a character controlled by the terminal device.
9. The method according to any one of claims 1 to 7, wherein,
the first virtual character is a teammate of a second virtual character, wherein the second virtual character is a character controlled by the terminal device.
10. The method according to any one of claims 1 to 7, wherein,
the first virtual character is an enemy of a second virtual character, wherein the second virtual character is a character controlled by the terminal device.
11. The method of claim 10, wherein the part of the game scene further comprises the second virtual character, and the game character configured by the first virtual character and the game character configured by the second virtual character are the same,
wherein, the first game picture further includes: the second virtual character in the first state displayed in the first display mode; and, the second game screen further includes: and displaying the second virtual character in the first state in the first display mode.
12. An apparatus for game operation, characterized in that a graphical user interface is provided by means of a terminal device, said graphical user interface comprising at least a part of a game scene, said part of a game scene comprising at least one first virtual character, said apparatus comprising a display unit,
the display unit is used for: displaying a first game screen on the graphical user interface, wherein the first game screen comprises: the first virtual character in a first state is displayed in a first display mode, wherein the first state represents a state in which a preset virtual skill is not released;
The display unit is further configured to: in response to the trigger condition being met, displaying a second game screen on the graphical user interface, wherein the second game screen comprises: the first virtual character in the second state displayed in a second display mode, wherein the second state represents a state of releasing the preset virtual skills;
the first virtual character in the first state displayed in the first display mode is: a character model of the first virtual character in the first state displayed at a first size; the first virtual character in the second state displayed in the second display mode is: a character model of the first virtual character in the second state displayed at a second size, the second size being larger than the first size.
13. An electronic device, comprising: a memory and a processor, wherein the memory is coupled to the processor;
the memory is used for storing one or more computer instructions;
the processor is configured to execute the one or more computer instructions to implement the method of any of claims 1 to 11.
14. A computer readable storage medium having stored thereon one or more computer instructions executable by a processor to implement the method of any of claims 1 to 11.
CN202310195645.4A 2023-02-22 2023-02-22 Game operation method and device and electronic equipment Pending CN116271811A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310195645.4A CN116271811A (en) 2023-02-22 2023-02-22 Game operation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310195645.4A CN116271811A (en) 2023-02-22 2023-02-22 Game operation method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN116271811A true CN116271811A (en) 2023-06-23

Family

ID=86837190

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310195645.4A Pending CN116271811A (en) 2023-02-22 2023-02-22 Game operation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116271811A (en)

Similar Documents

Publication Publication Date Title
WO2022151946A1 (en) Virtual character control method and apparatus, and electronic device, computer-readable storage medium and computer program product
JP7390400B2 (en) Virtual object control method, device, terminal and computer program thereof
CN113398601B (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
JP2022526456A (en) Virtual object control method and its devices, computer devices and programs
CN112416196B (en) Virtual object control method, device, equipment and computer readable storage medium
JP2024524734A (en) Live match broadcast display method and device, computer device, and computer program
WO2022257653A1 (en) Virtual prop display method and apparatus, electronic device and storage medium
CN111905363B (en) Virtual object control method, device, terminal and storage medium
CN113262481B (en) Interaction method, device, equipment and storage medium in game
CN110801629B (en) Method, device, terminal and medium for displaying virtual object life value prompt graph
TWI831074B (en) Information processing methods, devices, equipments, computer-readable storage mediums, and computer program products in virtual scene
CN112691366B (en) Virtual prop display method, device, equipment and medium
US20160170589A1 (en) Smart ping system
WO2022193838A1 (en) Game settlement interface display method and apparatus, device and medium
CN114377396A (en) Game data processing method and device, electronic equipment and storage medium
CN114272617A (en) Virtual resource processing method, device, equipment and storage medium in virtual scene
KR20230042517A (en) Contact information display method, apparatus and electronic device, computer-readable storage medium, and computer program product
JP2023164687A (en) Virtual object control method and apparatus, and computer device and storage medium
WO2023024880A1 (en) Method and apparatus for expression displaying in virtual scenario, and device and medium
CN115645923A (en) Game interaction method and device, terminal equipment and computer-readable storage medium
CN115708956A (en) Game picture updating method and device, computer equipment and medium
EP3984608A1 (en) Method and apparatus for controlling virtual object, and terminal and storage medium
CN116271811A (en) Game operation method and device and electronic equipment
CN116920374A (en) Virtual object display method and device, storage medium and electronic equipment
CN116212386A (en) Method and device for picking up virtual article in game, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination