CN118079393A - Game interaction method and device, electronic equipment and readable storage medium - Google Patents
- Publication number: CN118079393A
- Application number: CN202410058996.5A
- Authority: CN (China)
- Prior art keywords: virtual, game, virtual sphere, virtual object, sphere
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A63F13/577—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game, using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
- A63F13/812—Ball games, e.g. soccer or baseball
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
- A63F2300/8011—Ball
Abstract
The application discloses a game interaction method, a game interaction device, an electronic device, and a computer-readable storage medium. A graphical user interface is provided through a terminal device and displays at least part of a game scene, the game scene including a first virtual object controlled by the terminal device and a virtual sphere. The method comprises the following steps: displaying a target control through the graphical user interface; in response to a triggering operation on the target control, determining a target direction in which the virtual sphere is to move; and, in response to a game event in which the first virtual object collides with the virtual sphere, controlling the virtual sphere to move in the target direction. The method reduces the difficulty of determining the launch direction of the virtual sphere, improves the precision with which the player controls the launch angle of the virtual sphere, and improves the player's game experience.
Description
Technical Field
The present application relates to the field of computers, and in particular, to a game interaction method, apparatus, electronic device, and computer readable storage medium.
Background
In the related art, in a racing football game, a football is launched immediately upon being struck by a racing car, and the launch angle is generally determined by the angle between the car and the ball at the moment of impact; that is, the launch direction depends on that angle. Because car impacts are largely random, the launch angle is highly random and uncontrollable, making the launch direction difficult for a player to control in a match. This raises the operational threshold of the play mode and degrades the player's game experience.
It should be noted that the information disclosed in the background section above is intended only to enhance understanding of the background of the application, and may therefore include information that does not constitute prior art already known to a person of ordinary skill in the art.
Disclosure of Invention
In view of the above, the present application provides a game interaction method, apparatus, electronic device, and computer-readable storage medium that improve the precision with which a player controls the launch angle of a virtual sphere in a game and improve the player's game experience.
In a first aspect, an embodiment of the present application provides a game interaction method, where a graphical user interface is provided through a terminal device and displays at least part of a game scene, the game scene including a first virtual object controlled by the terminal device and a virtual sphere. The method comprises the following steps:
displaying a target control through the graphical user interface;
in response to a triggering operation on the target control, determining a target direction in which the virtual sphere is to move; and
in response to a game event in which the first virtual object collides with the virtual sphere, controlling the virtual sphere to move in the target direction.
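The three claimed steps can be sketched in code as follows (a minimal illustration only; the class and method names are hypothetical and do not come from the patent). The key point is that the target control stores a pending direction, and the sphere adopts that direction only when the collision event arrives.

```python
# Hypothetical sketch of the claimed flow: a target control records the
# player's chosen direction, and the ball adopts it on collision.

class VirtualSphere:
    def __init__(self):
        self.velocity = (0.0, 0.0)
        self.pending_direction = None  # set by the target control

    def set_target_direction(self, direction):
        # Step 2: triggering the target control stores the target direction.
        self.pending_direction = direction

    def on_collision(self, speed=10.0):
        # Step 3: on the collision event, move along the stored direction.
        if self.pending_direction is not None:
            dx, dy = self.pending_direction
            self.velocity = (dx * speed, dy * speed)

ball = VirtualSphere()
ball.set_target_direction((0.0, 1.0))  # player triggers the control
ball.on_collision()                    # first virtual object hits the ball
```

Decoupling direction selection from the physics of the impact in this way is what removes the randomness described in the background section.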
In a second aspect, an embodiment of the present application provides a game interaction device, where a graphical user interface is provided through a terminal device and displays at least part of a game scene, the game scene including a first virtual object controlled by the terminal device and a virtual sphere. The device comprises a display unit, a determining unit, and a control unit:
the display unit is configured to display a target control through the graphical user interface;
the determining unit is configured to determine, in response to a triggering operation on the target control, a target direction in which the virtual sphere is to move;
the control unit is configured to control, in response to a game event in which the first virtual object collides with the virtual sphere, the virtual sphere to move in the target direction.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a processor; and
a memory for storing a data processing program, wherein after the electronic device is powered on, the processor executes the program to perform the method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a data processing program that, when executed by a processor, performs the method of the first aspect.
According to the game interaction method provided by the application, a graphical user interface is provided through a terminal device and displays at least part of a game scene, the game scene including a first virtual object controlled by the terminal device and a virtual sphere. A target control is displayed through the graphical user interface; the target control is used to control the movement direction of the virtual sphere. In response to a triggering operation on the target control, a target direction in which the virtual sphere is to move is determined. After the target direction is determined, when the player controls the first virtual object to collide with the virtual sphere, the virtual sphere may be controlled to move in the target direction in response to that collision event.
In this way, the movement direction of the virtual sphere is set through the target control: after the first virtual object controlled by the player collides with the virtual sphere, the sphere moves in the target direction the player has chosen. The player can therefore control the movement direction of the virtual sphere precisely, the difficulty of controlling the angle at which the first virtual object strikes the virtual sphere is reduced, and the player's game experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an exemplary system for implementing a game interaction method according to an embodiment of the present application;
FIG. 2 is a flowchart of an embodiment of a game interaction method according to the present application;
FIG. 3 is a schematic diagram of some graphical user interfaces provided by embodiments of the present application;
FIG. 4 is a schematic diagram of an interface for displaying direction identifiers according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a comparison of active and inactive target display controls provided by an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a display of an example target control according to an embodiment of the present application;
FIG. 7 is an interface diagram of a graphical user interface of another example game interaction method provided by an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a game interaction device according to an embodiment of the present application;
FIG. 9 is a block diagram of an electronic device for implementing a game interaction method according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the present application may be embodied in many forms other than those described herein, and a person skilled in the art may make similar generalizations without departing from its spirit; the present application is therefore not limited to the specific embodiments disclosed below.
It should be noted that the terms "first," "second," "third," and the like in the claims, description, and drawings of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. The data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and their variants are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship. "Comprising A, B, and/or C" means comprising any one, any two, or all three of A, B, and C.
It should be understood that in the embodiments of the present application, "B corresponding to A" means that B is associated with A, and B may be determined from A. Determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
Before explaining the embodiments of the present application in detail, some related arts will be first described.
In the related art, in a racing football game, the launch angle of the football is generally determined by the geometry of the impact between the racing car and the ball: the launch direction depends on the impact point, the impact area, and so on, and is calculated from this information at the moment of impact. For example, when the car strikes the ball while the ball is directly in front of the car's nose, the ball is launched straight ahead; if the ball is to the front left of the nose, it is launched to the front left. The ball's direction therefore depends on the direction from which the player drives the car into it. Because racing cars generally move fast, where a car strikes the ball is largely random, so the launch angle is highly random and uncontrollable. A player who wants to launch the ball in an intended direction must have a high level of operating skill and a strong sense of spatial structure in order to judge the striking angle accurately and switch operations quickly. The entry threshold for new players is high, the learning cost is high, and the player's game experience is affected.
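For contrast, the related-art behaviour described above can be sketched as follows (an illustrative reconstruction, not code from any actual game): the launch direction is computed purely from the impact geometry, which is why it inherits the randomness of the collision.

```python
import math

# Related-art sketch: the launch direction is the unit vector from the
# impact point to the ball's centre, i.e. it depends only on where the
# car happens to strike the ball.

def related_art_launch_direction(ball_center, impact_point):
    dx = ball_center[0] - impact_point[0]
    dy = ball_center[1] - impact_point[1]
    length = math.hypot(dx, dy)
    return (dx / length, dy / length)  # unit vector away from the impact

# A strike on the ball's left side launches it to the right.
direction = related_art_launch_direction((0.0, 0.0), (-1.0, 0.0))
```

Because `impact_point` is effectively random at racing speeds, `direction` is random too; the claimed method replaces this computation with the player's explicit choice.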
In addition, in such racing games, the football is launched immediately after being struck by the car, leaving the player little reaction time. The launch direction is difficult to control, the operational threshold of the play mode is high, and the player's game experience is affected.
Based on the above problems, embodiments of the present application provide a game interaction method, apparatus, electronic device, and computer readable storage medium.
The game interaction method provided by the embodiments of the present application may be executed by an electronic device, which may be a terminal, a server, or other equipment. The terminal may be a smart phone, a tablet computer, a notebook computer, or a similar terminal device. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data, and artificial intelligence platforms.
In an alternative embodiment, the terminal device stores the game application and the game virtual scene when the game interaction method is run on the terminal device. The terminal device interacts with the player through a graphical user interface. The manner in which the terminal device provides the graphical user interface to the player may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection.
In an alternative embodiment, when the method runs on a server, the game interaction method may be implemented and executed based on a cloud gaming system. A cloud gaming system is a game modality based on cloud computing and comprises a server and client devices. The body that runs the game application is separated from the body that presents the game picture: storage and execution of the game interaction method are completed on the server, while the game picture is presented at the client. The client mainly receives and sends game data and presents the game picture; for example, it may be a display device with a data transmission function near the player side, such as a mobile terminal, a television, a computer, a palmtop computer, a personal digital assistant, or a head-mounted display device, but the device that processes the game data is the cloud server. When playing, the player operates the client to send an instruction to the server; the server controls the game to run according to the instruction, encodes and compresses data such as the game pictures, and returns the data to the client over the network; finally, the client decodes and outputs the game pictures.
It should be noted that, in the embodiment of the present application, the execution body of the game interaction method may be a terminal device or a server, where the terminal device may be a local terminal device or a client device in the foregoing cloud game. The embodiment of the application does not limit the type of the execution body.
By way of example, in connection with the above description, FIG. 1 illustrates a gaming system 1000 for implementing a game interaction method according to an embodiment of the present application. The gaming system 1000 may include at least one terminal 1001, at least one server 1002, at least one database 1003, and a network. A terminal 1001 held by a player may connect to the servers of different games through the network. A terminal is any device whose computing hardware can support the software applications needed to run a game.
The terminal 1001 includes a processor and a display screen for presenting a game screen and receiving player operations on it. The game screen may include part of a virtual game scene, which is the virtual world in which virtual characters are active. The processor is configured to run the game, generate the game screen, respond to operations, and control the display of the game screen on the display screen. When the player operates on the game screen through the display screen, the received operation instruction may control content local to the terminal or content on the remote server 1002.
In addition, when the game system 1000 includes a plurality of terminals, servers, and networks, different terminals may be connected to each other through different networks and different servers. The network may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, or a 2G, 3G, 4G, or 5G network. Different terminals may also connect to other terminals or to a server using their own Bluetooth network or hotspot network. The gaming system 1000 may further include multiple databases coupled to different servers, and information related to the game may be continuously stored in the databases as different players play a multiplayer game online.
It should be noted that in the embodiments of the present application, when the same virtual game runs on a plurality of terminal devices, data interaction between them may be implemented through a server of the virtual game. Thus, "terminal device 1 sends data to terminal device 2" can be understood as: terminal device 1 sends the data to the server of the virtual game, which forwards it to terminal device 2. Likewise, "terminal device 1 receives data sent by terminal device 2" can be understood as: terminal device 1 receives, from the server, data that terminal device 2 sent to the server. Alternatively, there may be no game server, and terminal device 1 may transmit game data directly to terminal device 2.
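The server-relay convention described above can be sketched as follows (hypothetical minimal classes; the patent does not specify any API): the sender never delivers data directly, the server does.

```python
# Hypothetical sketch of server-mediated data interaction between terminals.

class Terminal:
    def __init__(self, name):
        self.name = name
        self.inbox = []  # data received from the server

    def receive(self, data):
        self.inbox.append(data)

class GameServer:
    def __init__(self):
        self.terminals = {}

    def register(self, terminal):
        self.terminals[terminal.name] = terminal

    def relay(self, sender, recipient, data):
        # The server, not the sender, delivers the data to the recipient.
        self.terminals[recipient].receive(data)

server = GameServer()
t1, t2 = Terminal("t1"), Terminal("t2")
server.register(t1)
server.register(t2)
server.relay("t1", "t2", {"event": "ball_hit"})  # "t1 sends data to t2"
```

In the serverless variant mentioned at the end of the paragraph, `t1` would call `t2.receive(...)` directly instead of going through `relay`.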
It should be noted that the game system shown in FIG. 1 is only an example. The game system 1000 described in the embodiments of the present application serves to explain the technical solutions more clearly and does not limit them; a person skilled in the art will appreciate that, as game systems evolve and new service scenarios emerge, the technical solutions provided herein are equally applicable to similar technical problems.
It should be noted that the triggering operations appearing in the following detailed description of the game interaction method may be implemented by the player through a finger or through a medium such as a mouse, a keyboard, or a stylus. Which medium is used may be determined by the type of electronic device. For example, when the electronic device is a touch-screen device such as a mobile phone, tablet, or gaming machine, the player may operate on the touch screen with any suitable object or accessory such as a finger or stylus. When the terminal device is a non-touch-screen device such as a desktop or notebook computer, the player may operate through an external device such as a mouse or keyboard.
The technical scheme of the application is described in detail through specific examples. It should be noted that the following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
In the embodiments of the present application, a graphical user interface is provided through a terminal and displays at least part of a game scene of a virtual game, the game scene including a first virtual object controlled by the terminal and a virtual sphere.
The virtual game in the present application may be any game that includes a function or play mode of striking a ball through a controlled virtual object. It may be a single-player game or a multiplayer online competitive game; the application is not particularly limited in this regard.
The virtual sphere may be a virtual object that is spherical, or a virtual object whose body is spherical (for example, a virtual character whose body is a spherical model with model details such as hands, feet, and ears).
The game scene above can be understood as a simulation of a real-world environment in the game; it may be a half-simulated, half-fictional virtual environment or a purely fictional one. The game scene may be a two-dimensional, 2.5-dimensional, or three-dimensional virtual scene, and generally comprises a plurality of scene elements, that is, the individual elements required to construct the scene. In this embodiment, the game scene is a game environment in which a player controls a virtual object to strike a virtual sphere, for example a simulated football pitch.
It can be understood that the first virtual object is a game character controlled in the game by the player of the terminal device. The first virtual object may represent a player character and may be implemented as a three-dimensional or two-dimensional virtual model; this embodiment is not particularly limited. The first virtual object includes, but is not limited to, at least one of a virtual character, a virtual animal, or a virtual machine. Examples of virtual machines include racing cars, airships, and the like.
It will be appreciated that a game scene may include one or more first virtual objects; embodiments of the application are not particularly limited. In one embodiment, the same player may control virtual objects under multiple different accounts. For example, player A has two game accounts in a game, corresponding to virtual object 1 and virtual object 2 respectively; if player A logs in to the two accounts on two game terminals to play the same match, two first virtual objects exist in the game scene at the same time.
FIG. 2 is a flowchart of an example of a game interaction method according to an embodiment of the present application. It should be noted that the steps shown may be performed in a logical order different from that in the flowchart. The method may include the following steps S210 to S230.
Step S210: display a target control through the graphical user interface.
In the embodiment of the present application, the target control carries game logic for controlling the virtual sphere; for example, triggering the target control can control the movement direction and movement speed of the virtual sphere. The shape of the target control may be circular, rectangular, hexagonal, or any other shape; this embodiment is not limited thereto.
It should be noted that in a game interface, the various interface contents to be displayed are usually presented in layers. A layer is a layered display mechanism in the graphical user interface: different interface contents may be placed on different layers, giving better control over showing and hiding each of them. For example, the game scene may be placed on a scene layer, which is mainly used to render the game world and the objects in it, including characters, environments, and dynamic elements. The various UI (User Interface) elements are placed on a UI layer; UI elements may include, but are not limited to, operational controls, information panels (such as chat interfaces and character status panels), and game thumbnail maps. Since UI elements are the interface elements through which a player interacts with the game, the UI layer is typically displayed at a level higher than the scene layer of the game scene. That is, the display level of each UI element is higher than the display level of the scene picture. In this embodiment, "scene picture" and "game scene" express the same meaning.
It will be appreciated that the target control described above is displayed in a UI layer whose display level is higher than that of the game scene.
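The layer ordering described above can be sketched as follows (layer names and numeric values are illustrative assumptions, not from the patent): elements are drawn in ascending layer order, so the UI layer is painted after, and therefore on top of, the scene layer.

```python
# Illustrative layer values: higher value = drawn later = displayed on top.
SCENE_LAYER = 0
UI_LAYER = 1

elements = [
    ("target_control", UI_LAYER),    # UI element
    ("game_scene", SCENE_LAYER),     # scene picture
    ("minimap", UI_LAYER),           # UI element
]

# Render order: lower layers first, so UI elements are painted over the scene.
render_order = [name for name, layer in sorted(elements, key=lambda e: e[1])]
```

Python's `sorted` is stable, so UI elements on the same layer keep their relative order while the scene layer is always drawn first.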
The display timing of the target control is not particularly limited in this embodiment; the target control may be displayed in a resident manner or only at specific times. In an alternative embodiment, the target control is resident in the graphical user interface, and switching the game scene does not affect its display. Resident display means that the target control remains visible on the graphical user interface at all times and is not automatically hidden or closed when the player performs other operations.
Step S220: in response to a triggering operation on the target control, determining the target direction in which the virtual sphere is to be moved.
The target direction may be understood as the direction in which the virtual sphere will move in its subsequent movement.
A triggering operation on the target control may be understood as an operation that initiates the game logic corresponding to the target control. In this embodiment, the triggering operation on the target control may include, but is not limited to, a shortcut-key operation, a voice command operation, a spatial gesture recognition operation, or a touch screen operation. The touch screen operation may include a pressing operation, a clicking operation, or a sliding operation, among others. This is not particularly limited.
It can be understood that a corresponding control operation area may be preset in the graphical user interface for the target control, and the triggering operation on the target control is an operation within that control operation area. It should be noted that the control operation area corresponding to the target control may be the display area of the target control in the graphical user interface, or may be a specific area preset in the graphical user interface. For example, the lower-left portion or the left half of the graphical user interface may be set as the control operation area corresponding to the target control, and when the player applies a triggering operation to the lower-left portion of the graphical user interface, the game logic corresponding to the target control is triggered. In this case, even if the operation position of the player does not exactly coincide with the display area of the target control, the game logic corresponding to the target control can still be triggered as long as the operation position is located within the specific area corresponding to the target control.
When the control operation area corresponding to the target control is a specific area in the graphical user interface, even if the player does not accurately touch the current display area of the target control, the target control can move to follow the player's touch finger as long as the touch position is within that specific area.
In some alternative embodiments, the target direction in which the virtual sphere is to be moved is associated with the triggering operation on the target control. Specifically, the target direction is associated with the contact position of the triggering operation within the control operation area of the target control. For example, a corresponding preset direction may be set in advance for each position within the control operation area of the target control; e.g., the direction of the ray from the center of the target control to the contact position is determined as the preset direction corresponding to that contact position. Thus, when the player applies a triggering operation to the target control, the preset direction corresponding to the triggering operation can be queried and taken as the target direction.
As shown in table 1, table 1 shows an example of correspondence between each position point in the control operation area of the target control and the preset direction.
Table 1 example of correspondence between each location point of the target control and a preset direction
| Position point corresponding to target control | Preset direction |
| --- | --- |
| (x1, y1) | First preset direction |
| (x2, y2) | Second preset direction |
| …… | …… |
| (xn, yn) | Nth preset direction |
As shown in Table 1, a different preset direction is set for each position point in the control operation area of the target control; for example, for the position point with coordinates (x1, y1), the corresponding preset direction is the first preset direction. When the touch point of the player's triggering operation is the position point with coordinates (x1, y1), the first preset direction is determined as the target direction. For example, when the first preset direction is the left direction, the target direction is the left direction starting from the virtual sphere.
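As an illustration only (the function and variable names here are hypothetical, not part of the application), the ray-based mapping described above can be sketched as follows: the preset direction for a contact point is taken as the unit vector of the ray from the control center to that point.

```python
import math

def preset_direction(control_center, contact_point):
    """Return the preset direction for a contact point as the unit
    vector of the ray from the control center to that point."""
    dx = contact_point[0] - control_center[0]
    dy = contact_point[1] - control_center[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return None  # contact exactly on the center: no direction yet
    return (dx / length, dy / length)

# A contact to the left of the center yields the left direction.
direction = preset_direction((100, 100), (60, 100))
```

In this sketch the lookup of Table 1 is replaced by a direct computation, which is equivalent when every position point's preset direction is defined as the ray from the control center to that point.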
Step S230: in response to a game event in which the first virtual object impacts the virtual sphere, the virtual sphere is controlled to move in a target direction.
It will be appreciated that, as previously described, the first virtual object is a player-controlled game character, and thus, in a virtual game, a player may control the first virtual object to move in a game scene.
The game event in which the first virtual object collides with the virtual sphere can be understood as follows: the player controls the first virtual object to move to the position of the virtual sphere, so that contact occurs between the first virtual object and the virtual sphere.
When the first virtual object is detected to collide with the virtual sphere, the virtual sphere may be controlled to move in the above-determined target direction.
It is noted that the game event in which the first virtual object collides with the virtual sphere may occur before or after the target direction is determined; this embodiment is not particularly limited in this respect. In an alternative embodiment, if the game event occurs before the target direction is determined, then after the collision the target direction has not yet been determined, so the virtual sphere moves along with the first virtual object rather than being ejected immediately, and is ejected only once the target direction is subsequently determined. This embodiment is described further below and is not elaborated here.
In yet other embodiments, when the game event in which the first virtual object collides with the virtual sphere occurs after the target direction is determined, the virtual sphere may be controlled to be ejected immediately along the determined target direction.
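The two orderings described above can be sketched as a small state machine (a minimal illustration with hypothetical names, not the application's implementation): if the target direction is already set when the collision occurs, the sphere launches at once; otherwise it follows the first virtual object until a direction is determined.

```python
class SphereController:
    """Sketch of the two event orderings for collision vs. direction."""

    def __init__(self):
        self.target_direction = None
        self.state = "idle"  # "idle" | "dribbling" | "launched"

    def set_target_direction(self, direction):
        self.target_direction = direction
        if self.state == "dribbling":
            # Direction determined after the hit: eject now.
            self.state = "launched"

    def on_collision(self):
        if self.target_direction is not None:
            # Direction determined before the hit: eject immediately.
            self.state = "launched"
        else:
            # No direction yet: sphere follows the first virtual object.
            self.state = "dribbling"
```

Either call order ends in the same launched state, which matches the statement that the embodiment does not limit which event occurs first.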
As shown in FIG. 3, FIG. 3 provides a schematic example of a graphical user interface. In the graphical user interface 300 shown in FIG. 3, a game scene 100 is displayed, and a first virtual object 10, a virtual sphere 11, a virtual goal 12, and a target control 13 are displayed in the game scene 100. Assume that the player, by applying a triggering operation to the target control 13 in the manner described above, determines that the target direction of the virtual sphere is the direction indicated by the dotted arrow 14. When the player controls the first virtual object 10 to hit the virtual sphere 11, the graphical user interface 300 is switched to the graphical user interface 301, in which the first virtual object 10 is shown hitting the virtual sphere 11. In response to the game event in which the first virtual object 10 hits the virtual sphere 11, the graphical user interface 301 is switched to the graphical user interface 302, in which the virtual sphere moves in the target direction indicated by the dotted arrow 14.
It will be appreciated that the control operation area for the target control described above may be the gray display area corresponding to the target control 13 in the graphical user interface 300, or may be any blank area preset in the graphical user interface 300; this embodiment is not particularly limited.
Therefore, according to the game interaction method provided by the application, the target control is displayed in the graphical user interface, and the movement direction of the virtual sphere is controlled by operating the target control. After the first virtual object controlled by the player collides with the virtual sphere, the virtual sphere is controlled to move along the target direction determined by the player. In this way, the player can accurately control the movement direction of the virtual sphere, the difficulty of controlling the angle at which the first virtual object collides with the virtual sphere is reduced, and the player's game experience is improved.
Next, some optional embodiments and some specific embodiments provided by the present application will be described.
In an alternative embodiment, a direction identifier characterizing the target direction is displayed in the graphical user interface in response to the triggering operation on the target control.
The direction identifier may be any type of identifier; for example, it may include, but is not limited to, a straight arrow, a curved arrow, a rectangular box with a gradual color change, etc., which is not limited here.
In this embodiment, the display position of the direction identifier may include, but is not limited to: displaying it with the operation contact point of the player's triggering operation as a starting point, displaying it separately in a blank area of the graphical user interface, or displaying it with the current position of the virtual sphere as a starting point. This is not particularly limited.
In a preferred embodiment, the direction identifier is displayed with the current position of the virtual sphere as its starting point. This provides the player with a clear visual focus, so that the player can intuitively and quickly locate the target direction in a fast-paced game; and if the target direction is not the direction the player expects, the player can conveniently adjust it based on the target position point to which the virtual sphere is to be moved. For example, when a player hits the virtual ball to shoot, the direction identifier lets the player determine whether the target direction in which the virtual ball is to be moved is aimed at the virtual goal.
In this way, the accuracy with which the player controls the movement direction of the virtual sphere can be further improved, the operation difficulty is reduced, the player's game immersion is enhanced, and the game requirements of different players are met.
As shown in fig. 4, in the graphical user interface 303, when the player triggers the target control 13, the direction identifier 15 is displayed in the graphical user interface 303.
In some embodiments, to enhance the player's visual experience and reduce the occlusion of the game scene by UI controls, the target control may not be resident in the graphical user interface. Specifically, in this embodiment, the target control may be displayed in response to a preset game event.
In an alternative embodiment, the preset game event includes any one of the following examples:
example one: the first virtual object controls the virtual sphere such that the virtual sphere moves following the first virtual object.
In this example, the game event of the first virtual object controlling the virtual sphere may be triggered by the first virtual object hitting the virtual sphere; that is, after the first virtual object hits the virtual sphere, the game event of the first virtual object controlling the virtual sphere is triggered. Colloquially understood, the first virtual object is in a dribbling state.
When the first virtual object controls the virtual sphere, the virtual sphere moves along with the first virtual object. Compared with a technical scheme in which the football is launched immediately after the racing car encounters it, the scheme provided by this example increases the time during which the first virtual object controls the virtual ball. During this control time, the player can further determine the target direction; this provides thinking time for determining the target direction, reduces the requirements on the player's reaction capacity and operation level, lowers the player's learning cost, and improves the player's game experience.
In this example, the target control is displayed on the graphical user interface in response to triggering the game event in which the first virtual object controls the virtual sphere. In this way, the target direction can be determined after the first virtual object has gained control, and because the first virtual object and the virtual sphere remain in contact (a collision state) after the target direction is determined, the virtual sphere can be controlled to be ejected immediately. No additional operation needs to be performed by the player.
In this example, the target control is displayed only when needed. The change of the control element provides a visual focus for the player, so that when the player controls the first virtual object to collide with the virtual sphere, the player is reminded to determine the target direction by operating the target control, reducing the player's learning cost and memory burden.
Example two: the virtual sphere is not controlled by the first virtual object, and the distance between the virtual sphere and the first virtual object is smaller than or equal to a preset distance.
The virtual sphere not being controlled by the first virtual object can be understood to mean that the first virtual object and the virtual sphere have not collided. If, in this case, the distance between the first virtual object and the virtual sphere is less than or equal to the preset distance, the target control is displayed.
In this embodiment, when the distance between the first virtual object and the virtual sphere is less than or equal to the preset distance, the target control is displayed to remind the player that a pre-aiming function can be realized through the target control. That is, when no collision has occurred between the first virtual object and the virtual sphere, the target direction can be determined in advance through the target control; once the target direction is determined, if a collision between the first virtual object and the virtual sphere is detected, the virtual sphere can be ejected in the target direction.
In game scenes where the first virtual object moves quickly, this technical scheme can provide the player with a longer aiming time, that is, more time to determine the target direction, giving the player time for tactical adjustment and further improving the control accuracy of the virtual sphere's movement direction.
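The pre-aiming display condition described above can be sketched as follows (an illustrative fragment with hypothetical names; the 5-meter preset distance is taken from the example later in the description):

```python
import math

PREAIM_DISTANCE = 5.0  # assumed preset distance, in scene units

def should_show_target_control(object_pos, sphere_pos, sphere_controlled):
    """Show the target control when the sphere is controlled (dribbling),
    or when it is uncontrolled but the first virtual object is in range."""
    if sphere_controlled:
        return True  # example one: dribbling state
    # Example two: uncontrolled sphere within the preset distance.
    return math.dist(object_pos, sphere_pos) <= PREAIM_DISTANCE
```

This combines the two preset game events of examples one and two into a single display predicate, which is one natural way to implement "displayed in response to a preset game event".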
In some embodiments, the "displaying the target control through the graphical user interface" in step S210 may specifically be: displaying the target control in a first display style through the graphical user interface, where the first display style is used to indicate that the target control has not been activated, so as to prompt the player that the target control is not currently available. In this way, the player can be prevented from touching the target control by mistake, and the player's operation difficulty is reduced.
The first display style may be a dark-state display style. As shown in fig. 5, the target control 13 is displayed in an unfilled state in the graphical user interface 304; in this embodiment, the dark-state display style is represented by the unfilled state.
In some embodiments, where the target control is resident in the graphical user interface in the first display style, the target control is displayed in an activated state upon triggering one of the exemplary game events described above, to prompt the player that the target control is currently available. Specifically, when the player triggers different game events, the target control can be displayed in different display styles to prompt the player of the type of game event currently triggered and the game logic to which the target control currently corresponds.
In some specific embodiments, when the game event in which the first virtual object controls the virtual sphere is triggered, the first display style is adjusted to a second display style, and the target control is displayed in the second display style.
It will be appreciated that the second display style is used to alert the player that the target control has been activated, i.e., that the target control is currently available for use.
In this embodiment, when the player controls the first virtual object to collide with the virtual sphere, the dribbling state may be entered. In the dribbling state, the target control is displayed in the second display style to prompt the player that the dribbling state has been entered and that the target direction may be adjusted.
In this embodiment, the second display style may be a bright-state display style.
As shown in fig. 5, in the graphical user interface 305, when the first virtual object 10 collides with the virtual sphere 11, the bright-state display style of the target control 13 is represented by a filled state.
In other embodiments, a pre-aiming function may be set for the target control; that is, when certain conditions are met and the first virtual object is not controlling the virtual sphere, the target direction may be predetermined through the target control. Reference may be made to the foregoing.
In this embodiment, the target control may be displayed in a third display style in response to the pre-aiming function being triggered, to prompt the player that the pre-aiming function has been triggered through the target control. Specifically, in response to a game event in which the distance between the virtual sphere and the first virtual object is less than or equal to the preset distance and the virtual sphere is not controlled by the first virtual object, the first display style is adjusted to a third display style, and the target control is displayed in the third display style; wherein the third display style is used to indicate that the target control has been activated.
That is, when the distance between the first virtual object and the virtual sphere is less than or equal to the preset distance but the first virtual object has not struck the virtual sphere, the unactivated state is adjusted to the activated state, prompting the player that the target control can be triggered. For example, if the preset distance is 5 meters, the pre-aiming function is triggered when the player controls the first virtual object to move within 5 meters of the virtual sphere.
In some embodiments, after the pre-aiming function has been triggered and the target control is displayed in the activated state, if the game event in which the first virtual object controls the virtual sphere is triggered, the third display style corresponding to the pre-aiming function is adjusted to the second display style, and the target control is displayed in the second display style. Specifically, after the game event in which the distance between the virtual sphere and the first virtual object is less than or equal to the preset distance has been triggered, when the game event in which the first virtual object controls the virtual sphere is triggered, the third display style is adjusted to the second display style and the target control is displayed in the second display style.
In this way, the target control can be displayed in different display styles corresponding to different game functions. This gives the player visual prompts, helps the player quickly distinguish the game logic that triggering the target control currently realizes, and provides instant feedback, so that the player can intuitively perceive that the operation has been recognized and executed by the electronic device, for example, that the player-controlled first virtual object is close to the virtual sphere, or that the first virtual object has collided with the virtual sphere. In addition, the player's visual fatigue can be reduced, and the player's visual experience is improved.
Alternatively, the third display style may be the same as the second display style described above; that is, the bright-state display style in a filled state in the graphical user interface 305 shown in fig. 5 is used as the third display style. Or the third display style may differ from the second display style; for example, the two may be displayed in different colors, e.g., the second display style is red and the third display style is green. In this way, the differently colored target controls can prompt the player as to which game function triggering the target control currently realizes, reducing the player's visual fatigue and enriching the player's visual experience.
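The three display styles and their transitions can be sketched as a small lookup (an illustration only; the style names and the red/green choice follow the example colors above, and all identifiers are hypothetical):

```python
# First style: dark (inactive); third style: pre-aiming (e.g. green);
# second style: dribbling (e.g. red / bright).
STYLE_FOR_STATE = {
    "inactive": "dark",
    "pre_aiming": "green",
    "dribbling": "red",
}

def control_style(distance, preset_distance, is_dribbling):
    """Pick the target control's display style from the game state.

    Dribbling takes priority over pre-aiming, matching the rule that
    the third style is adjusted to the second once control begins."""
    if is_dribbling:
        return STYLE_FOR_STATE["dribbling"]
    if distance <= preset_distance:
        return STYLE_FOR_STATE["pre_aiming"]
    return STYLE_FOR_STATE["inactive"]
```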
Based on the above embodiment, after the target direction is determined, if a collision between the first virtual object and the virtual sphere is detected, the virtual sphere is immediately ejected in the target direction. The application further provides a scheme in which the player can determine, through an operation, whether the virtual sphere moves in the target direction, so that the player can further control the movement of the virtual sphere and the player's control accuracy over the virtual sphere is improved.
In a specific embodiment, in response to a game event in which the first virtual object collides with the virtual sphere and the triggering operation has ended, the virtual sphere is controlled to move in the target direction.
In this embodiment, whether the virtual sphere is ejected after being impacted is controlled by a triggering operation for the target control.
The end of the triggering operation may be understood as the end of the operation on the control operation area corresponding to the target control; for example, the player lifts the finger and no longer clicks, presses, or slides in the control operation area. By ending the triggering operation, the player confirms the current target direction as the finally determined target direction and decides to move the virtual sphere in that direction.
In this embodiment, a control condition is added for whether the virtual sphere moves in the target direction, so that the player can autonomously decide whether to eject the virtual sphere, which can improve the player's participation in the game. It should be noted that, while the triggering operation has not ended (for example, when the player's finger has not been lifted after applying the triggering operation to the target control), the target direction may still be adjusted by adjusting the triggering operation after the first virtual object is controlled to strike the virtual sphere, so that the virtual sphere moves in the adjusted target direction when the adjusted triggering operation ends.
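The launch condition described above (a collision has occurred AND the triggering operation has ended, with the direction adjustable until release) can be sketched as follows; the class and method names are hypothetical:

```python
class LaunchGate:
    """The sphere launches only when the first virtual object has hit it
    and the triggering operation (e.g. the player's finger) is released."""

    def __init__(self):
        self.direction = None
        self.hit = False
        self.launched = False

    def on_drag(self, direction):
        # The target direction stays adjustable while the finger is down.
        if not self.launched:
            self.direction = direction

    def on_hit(self):
        self.hit = True

    def on_release(self):
        # Ending the trigger confirms the direction and ejects the sphere.
        if self.hit and self.direction is not None:
            self.launched = True
        return self.launched
```

Releasing before any collision leaves the sphere unlaunched, which reflects the player's ability to decide whether the sphere is ejected at all.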
In an optional embodiment, the control operation area corresponding to the target control may include a first sub-area and a second sub-area; the first sub-area is located in the central area of the control operation area, and the second sub-area surrounds the first sub-area. Colloquially, the first sub-area may be the display area of the virtual rocker mentioned in the related art, and the second sub-area is the area within which the virtual rocker can move.
As shown in fig. 6, the target control 13 comprises a rocker 16 and a second sub-area 17 within which the rocker 16 is movable.
When the target control comprises a rocker and a movable area for the rocker, the player does not need to touch the rocker precisely while playing: as long as the player touches the target control, the rocker automatically follows the player's finger position, achieving an effect similar to adsorbing to the player's finger. In some games, the player does not even need to touch the target control accurately; as long as the player touches a preset specific area in the graphical user interface, the target control and its rocker are displayed following the position of the player's touch finger. Therefore, the triggering operation on the target control includes precisely touching the currently displayed rocker, touching the second sub-area of the target control outside the rocker, and touching a preset specific area in the graphical user interface that can trigger the target control.
It should be noted that, in some embodiments, when the target control is not triggered, the second sub-area may not be displayed in the graphical user interface; when a triggering operation is applied to the target control, the second sub-area is displayed.
As can be seen from the foregoing description, the triggering operation on the target control may include a plurality of operation manners, which is not particularly limited in this embodiment. In some specific embodiments, the triggering operation may include a sliding operation starting from the control operation area corresponding to the target control. In response to the sliding operation, the direction of the ray from the center of the control operation area to the current contact point of the sliding operation may be determined as the target direction in which the virtual sphere is to be moved. That is, the direction of the ray from the rocker's initial position to the current contact point of the sliding operation is the target direction. As the contact point of the sliding operation moves, the target direction changes accordingly; when the player lifts the finger off the touch screen to end the sliding operation, the target direction is finally determined (i.e., the ray direction corresponding to the contact point position at the end of the sliding operation), and the virtual sphere moves in that finally determined target direction.
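A common way to realize such a rocker, sketched here with hypothetical names and under the assumption that the rocker's display position is clamped to the second sub-area while the direction follows the raw contact point:

```python
import math

def rocker_state(center, contact, max_radius):
    """Given the rocker center and the current contact point, return the
    rocker's clamped display position and the current target direction
    (a unit vector) for the sliding operation."""
    dx, dy = contact[0] - center[0], contact[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return center, None  # contact on the center: no direction yet
    ux, uy = dx / dist, dy / dist
    # The rocker cannot leave the second sub-area, but the direction
    # still tracks the finger even when the finger slides beyond it.
    r = min(dist, max_radius)
    return (center[0] + ux * r, center[1] + uy * r), (ux, uy)
```

Calling this on every touch-move event and taking the last returned direction at touch-up reproduces the behavior in which the target direction is finalized when the sliding operation ends.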
The method provided by the application is described in detail below in connection with fig. 7.
As shown in fig. 7, in the graphical user interface 306, the target control 13 is displayed in the dark-state display style, indicating that the target control 13 cannot currently be triggered. When the player controls the first virtual object 10 to move to a position 5 meters away from the virtual sphere 11, the graphical user interface 306 is switched to the graphical user interface 307, in which the distance between the first virtual object 10 and the virtual sphere is 5 meters. At this point, the target control switches from the dark-state display style shown in the graphical user interface 306 to the bright-state display style in the graphical user interface 307, prompting the player that the target control has been activated and that the pre-aiming state has been entered.
In some embodiments, in response to entering the pre-aiming state (i.e., when a game event occurs in which the distance between the virtual sphere and the first virtual object is less than or equal to the preset distance and the virtual sphere is not controlled by the first virtual object), a pre-aiming identifier representing the pre-aiming state is displayed at a position corresponding to the virtual sphere, so as to prompt the player that the virtual sphere can be pre-aimed.
In the graphical user interface 307 shown in fig. 7, a pre-aiming identifier 18 is displayed over the virtual sphere; in this example, the pre-aiming identifier 18 is an arrow identifier characterizing that a pre-aiming operation may be performed on the virtual sphere. Based on the graphical user interface 307, when the player applies a sliding operation to the target control, that is, slides from the rocker position in the direction indicated by the broken line 19, the graphical user interface 307 is switched to the graphical user interface 308, and the direction identifier 15 is displayed in the graphical user interface 308, where the direction indicated by the direction identifier 15 coincides with the target direction, corresponding to the current contact position of the sliding operation, in which the virtual sphere is to be moved.
After the target direction is determined, if the player controls the first virtual object 10 to collide with the virtual sphere 11, then as shown in fig. 7, the graphical user interface 308 is switched to the graphical user interface 309, in which the first virtual object 10 can be seen contacting the virtual sphere 11; at this point, the virtual sphere can be launched.
In some embodiments, after the target direction is determined and the first virtual object collides with the virtual sphere, the pre-aiming identifier may be switched to a collision identifier to prompt the player that the sphere is currently aimed and can be launched; that is, in response to the game event in which the first virtual object collides with the virtual sphere, the collision identifier is displayed at the position corresponding to the virtual sphere. As shown in fig. 7, collision identifiers 20 are displayed around the virtual sphere 11 in the graphical user interface 309, prompting the player that the virtual sphere 11 has been aimed and is in a state to be launched. In addition, in this case, the target control 13 may be displayed in a state different from its display state in the graphical user interface 308; for example, in the graphical user interface 309 the border of the target control 13 in the activated state is bolded, so as to distinguish the pre-aimed state and the aimed state of the virtual sphere 11 by the display state of the target control 13.
In this example, after the first virtual object 10 collides with the virtual sphere 11, the player may lift a finger to end the sliding operation, that is, to end the triggering operation on the target control. As shown in fig. 7, the graphical user interface 309 is switched to the graphical user interface 310, the virtual sphere 11 moves in the direction indicated by the direction identifier 15, and when the distance between the virtual sphere 11 and the first virtual object 10 is greater than 5 meters, the target control 13 is displayed in the dark-state display style, prompting the player that the target control 13 is currently in an inactive state. Of course, the target control 13 may instead be displayed in the dark-state display style immediately after the virtual sphere 11 separates from the first virtual object 10.
In some embodiments, a second virtual object is further displayed in the above game scene, and the game interaction method provided by the present application may further include the following step S240.
Step S240: in the case where the first virtual object controls the virtual sphere, in response to a game event in which the second virtual object collides with the first virtual object or the virtual sphere, canceling the first virtual object's control of the virtual sphere.
That is, when the first virtual object is in the dribbling state, if the second virtual object hits the first virtual object or the virtual sphere, the first virtual object's dribbling is interrupted and it exits the dribbling state. This enhances the game's realism and immersion and improves the player's game experience.
In some embodiments, when the virtual game is a multiplayer online competitive game, the second virtual object may be a virtual object that is in the same camp as the first virtual object, or may be a virtual object that is in a different camp from the first virtual object.
In some embodiments, when the second virtual object is a virtual object in a different camp from the first virtual object, a third virtual object in the same camp as the first virtual object is also displayed in the game scene. When a game event occurs in which the third virtual object collides with the first virtual object while the first virtual object is in the dribbling state, the dribbling state of the first virtual object may be set to be unaffected (or it may be set to be affected, causing the first virtual object to exit the dribbling state).
The application scenarios of the game interaction method provided by the application can include, but are not limited to: the player controls the first virtual object to determine a shooting direction when shooting a goal, and the player controls the first virtual object to determine a passing direction when passing a ball to the third virtual object.
It should be noted that the dashed lines in fig. 3 to 7 are lines that are fictitious for easy understanding, and are not present on the graphical user interface provided by the terminal device.
It will be appreciated that the information on dimensions, appearances, layouts, display styles, etc. of the elements in the interface schematic diagrams shown in figs. 3 to 7 is exemplary and does not limit the actual implementation.
In summary, in the method provided by this embodiment, the movement direction of the virtual sphere is controlled through the target control, so that after the first virtual object controlled by the player collides with the virtual sphere, the virtual sphere moves along the target direction determined by the player. The player can thus control the movement direction of the virtual sphere accurately, the difficulty of controlling the angle at which the first virtual object collides with the virtual sphere is reduced, and the player's game experience is improved.
Further, when the distance between the first virtual object and the virtual sphere is less than or equal to the preset distance, the player can enter the pre-aiming state, which gives the player more time to determine the target direction. In a game in which the first virtual object moves quickly, such as a fast-paced football game, this reduces the difficulty of determining the target direction and the demand on the player's reaction speed.
In addition, in the technical scheme provided by the application, the virtual sphere is not launched immediately after the first virtual object collides with it; instead, the player decides, through the operation of the target control, whether the virtual sphere should be launched, which increases the player's involvement in the game.
Corresponding to the game interaction method provided by the embodiments of the present application, an embodiment of the present application further provides a game interaction device 400. As shown in fig. 8, a graphical user interface is provided through a terminal device, at least part of a game scene is displayed in the graphical user interface, and the game scene includes a first virtual object and a virtual sphere controlled by the terminal device; the device 400 includes: a display unit 401, a determination unit 402, and a control unit 403;
The display unit 401 is configured to display a target control through the graphical user interface;
The determining unit 402 is configured to determine a target direction in which the virtual sphere is to be moved, in response to a trigger operation for the target control;
The control unit 403 is configured to control the virtual sphere to move along the target direction in response to a game event that the first virtual object collides with the virtual sphere.
Optionally, the display unit 401 is specifically configured to display, through the graphical user interface, the target control in response to a preset game event.
Optionally, the preset game event includes any one of the following:
the first virtual object controls the virtual sphere to enable the virtual sphere to move along with the first virtual object;
The virtual sphere is not controlled by the first virtual object, and the distance between the virtual sphere and the first virtual object is smaller than or equal to a preset distance.
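The two preset game events above reduce to a simple display predicate. This is a hedged sketch; the function name, parameter names, and default threshold are illustrative assumptions:

```python
def should_display_target_control(controls_sphere, distance, preset_distance=5.0):
    """The target control is displayed when either preset game event holds:
    (a) the first virtual object controls the sphere (it moves with the object), or
    (b) the free sphere is within the preset distance of the first virtual object."""
    return controls_sphere or distance <= preset_distance
```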
Optionally, the display unit 401 is specifically configured to display, through the graphical user interface, the target control in a first display style; the first display style is used for prompting that the target control is not activated.
Optionally, the display unit 401 is further configured to adjust the first display style to a second display style when a game event in which the first virtual object controls the virtual sphere is triggered, and display the target control in the second display style; wherein the second display style is used to prompt that the target control has been activated.
Optionally, the foregoing display unit 401 is further configured to adjust, in response to a game event in which a distance between the virtual sphere and the first virtual object is less than or equal to a preset distance and the virtual sphere is not controlled by the first virtual object, the first display style to a third display style, and display the target control in the third display style; wherein the third display style is used to prompt that the target control has been activated.
Optionally, the above display unit 401 is further configured to, after a game event in which the distance between the virtual sphere and the first virtual object is less than or equal to the preset distance is triggered, adjust the third display style to the second display style when a game event in which the first virtual object controls the virtual sphere is triggered, and display the target control in the second display style.
Optionally, the display unit 401 is further configured to display, in the graphical user interface, a direction identifier that characterizes the target direction in response to a triggering operation for the target control.
Optionally, the control unit 403 is specifically configured to control the virtual sphere to move along the target direction in response to a game event that the first virtual object collides with the virtual sphere, and the triggering operation is ended.
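That is, the sphere is launched only when both conditions are met. Sketched below with illustrative names (an assumption, not the claimed implementation):

```python
def should_launch(has_collided, trigger_ended):
    """The sphere moves along the target direction only once the first
    virtual object has collided with it AND the player has ended the
    triggering operation (e.g. lifted the finger)."""
    return has_collided and trigger_ended
```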
Optionally, the triggering operation includes a sliding operation taking a control operation area corresponding to the target control as a starting point; the determining unit 402 is specifically configured to determine, in response to the sliding operation, a direction of a ray from a center of the control operation area to a current contact point position of the sliding operation as a target direction in which the virtual sphere is to be moved.
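The ray-direction computation performed by the determining unit can be sketched as a unit vector from the control operation area's center to the current contact point. This is illustrative only; the 2D screen-coordinate representation and the function name are assumptions:

```python
import math

def slide_target_direction(center, contact):
    """Return the unit vector from the control operation area's center to
    the sliding operation's current contact point, or None while the
    contact is still at the center (no direction determined yet)."""
    dx, dy = contact[0] - center[0], contact[1] - center[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return None
    return (dx / length, dy / length)
```

For example, a slide from the center to a point 3 units right and 4 units up yields the direction (0.6, 0.8), which the control unit would then apply to the sphere's movement.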
Optionally, the game scene further comprises a second virtual object, and the device 400 further comprises a cancellation unit 404;
A cancellation unit 404, configured to cancel control of the virtual sphere by the first virtual object in response to a game event that the second virtual object collides with the first virtual object or the virtual sphere, in a case that the first virtual object controls the virtual sphere.
Optionally, the display unit 401 is further configured to display, in response to a game event that the first virtual object collides with the virtual sphere, a collision identifier at a corresponding position of the virtual sphere.
Optionally, the display unit 401 is further configured to display a pre-aiming identifier at a corresponding position of the virtual sphere in response to a game event that a distance between the virtual sphere and the first virtual object is less than or equal to a preset distance and the virtual sphere is not controlled by the first virtual object.
Corresponding to the game interaction method provided by the embodiments of the present application, an embodiment of the present application further provides an electronic device for implementing the game interaction method. As shown in fig. 9, the electronic device includes: a processor 501; and a memory 502 configured to store a program of the game interaction method. After the device is powered on and runs the program through the processor 501, a graphical user interface is provided through a terminal device, at least part of a game scene is displayed in the graphical user interface, and the game scene includes a first virtual object and a virtual sphere controlled by the terminal device; then, the following steps are performed:
Displaying a target control through the graphical user interface;
in response to a trigger operation for the target control, determining a target direction in which the virtual sphere is to be moved;
and controlling the virtual sphere to move along the target direction in response to a game event that the first virtual object collides with the virtual sphere.
Corresponding to the game interaction method provided by the embodiments of the present application, an embodiment of the present application further provides a computer-readable storage medium storing a program of the game interaction method. When the program is run by a processor, a graphical user interface is provided through a terminal device, at least part of a game scene is displayed in the graphical user interface, and the game scene includes a first virtual object and a virtual sphere controlled by the terminal device; then, the following steps are performed:
Displaying a target control through the graphical user interface;
in response to a trigger operation for the target control, determining a target direction in which the virtual sphere is to be moved;
and controlling the virtual sphere to move along the target direction in response to a game event that the first virtual object collides with the virtual sphere.
It should be noted that, for the detailed description of the apparatus, the electronic device, and the computer readable storage medium provided in the embodiments of the present application, reference may be made to the related description of the embodiments of the game interaction method provided in the embodiments of the present application, which is not repeated here.
While the application has been described in terms of preferred embodiments, it is not intended to be limiting, but rather, it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the application as defined by the appended claims.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory, random access memory (RAM), and/or nonvolatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Claims (16)
1. A game interaction method is characterized in that a graphical user interface is provided through terminal equipment, at least part of game scenes are displayed in the graphical user interface, and the game scenes comprise first virtual objects and virtual spheres controlled by the terminal equipment; the method comprises the following steps:
Displaying a target control through the graphical user interface;
in response to a trigger operation for the target control, determining a target direction in which the virtual sphere is to be moved;
and controlling the virtual sphere to move along the target direction in response to a game event that the first virtual object collides with the virtual sphere.
2. The method of claim 1, wherein the displaying, via the graphical user interface, a target control comprises:
and responding to a preset game event, and displaying the target control through the graphical user interface.
3. The method of claim 2, wherein the preset game event comprises any one of:
the first virtual object controls the virtual sphere to enable the virtual sphere to move along with the first virtual object;
The virtual sphere is not controlled by the first virtual object, and the distance between the virtual sphere and the first virtual object is smaller than or equal to a preset distance.
4. The method of claim 1, wherein the displaying the target control through the graphical user interface comprises:
Displaying the target control in a first display style through the graphical user interface; the first display style is used for prompting that the target control is not activated.
5. The method according to claim 4, wherein the method further comprises:
when a game event in which the first virtual object controls the virtual sphere is triggered, adjusting the first display style to a second display style, and displaying the target control in the second display style; wherein the second display style is used to prompt that the target control has been activated.
6. The method according to claim 4, wherein the method further comprises:
in response to a game event in which the distance between the virtual sphere and the first virtual object is less than or equal to a preset distance and the virtual sphere is not controlled by the first virtual object, adjusting the first display style to a third display style, and displaying the target control in the third display style; wherein the third display style is used to prompt that the target control has been activated.
7. The method of claim 6, wherein the method further comprises:
after a game event in which the distance between the virtual sphere and the first virtual object is less than or equal to the preset distance is triggered, when a game event in which the first virtual object controls the virtual sphere is triggered, adjusting the third display style to the second display style, and displaying the target control in the second display style.
8. The method according to claim 1, wherein the method further comprises:
in response to the trigger operation for the target control, displaying, in the graphical user interface, a direction identifier characterizing the target direction.
9. The method of claim 1, wherein controlling the movement of the virtual sphere in the target direction in response to a game event in which the first virtual object impacts the virtual sphere comprises:
And responding to a game event that the first virtual object collides with the virtual sphere, and controlling the virtual sphere to move along the target direction after the triggering operation is finished.
10. The method according to any one of claims 1 to 9, wherein the triggering operation includes a sliding operation starting from a control operation area corresponding to the target control;
The determining, in response to the trigger operation for the target control, a target direction in which the virtual sphere is to be moved comprises:
in response to the sliding operation, determining the direction of the ray from the center of the control operation area to the current contact point position of the sliding operation as the target direction in which the virtual sphere is to be moved.
11. The method of claim 1, further comprising a second virtual object in the game scene; the method further comprises the steps of:
And under the condition that the first virtual object controls the virtual sphere, responding to a game event that the second virtual object collides with the first virtual object or the virtual sphere, and canceling the control of the first virtual object on the virtual sphere.
12. The method according to claim 1, wherein the method further comprises:
in response to a game event in which the first virtual object collides with the virtual sphere, displaying a collision identifier at the corresponding position of the virtual sphere.
13. The method according to claim 1, wherein the method further comprises:
displaying a pre-aiming identifier at the corresponding position of the virtual sphere in response to a game event in which the distance between the virtual sphere and the first virtual object is less than or equal to a preset distance and the virtual sphere is not controlled by the first virtual object.
14. A game interaction device is characterized in that a graphical user interface is provided through a terminal device, at least part of game scenes are displayed in the graphical user interface, and the game scenes comprise a first virtual object and a virtual sphere which are controlled by the terminal device; the device comprises: a display unit, a determination unit, and a control unit;
the display unit is used for displaying a target control through the graphical user interface;
the determining unit is used for responding to the triggering operation aiming at the target control and determining the target direction of the virtual sphere to be moved;
The control unit is used for responding to a game event that the first virtual object collides with the virtual sphere and controlling the virtual sphere to move along the target direction.
15. An electronic device, comprising:
A processor; and
A memory for storing a data processing program, the electronic device being powered on and executing the program by the processor, to perform the method of any one of claims 1 to 13.
16. A computer readable storage medium, characterized in that a data processing program is stored, which program is run by a processor, performing the method according to any of claims 1-13.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202410058996.5A | 2024-01-15 | 2024-01-15 | Game interaction method and device, electronic equipment and readable storage medium |
Publications (1)
| Publication Number | Publication Date |
| --- | --- |
| CN118079393A | 2024-05-28 |
Family ID: 91151628
Legal Events
| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |