CN116899237A - Game interaction method, game interaction device, computer readable storage medium and electronic equipment

Info

Publication number
CN116899237A
CN116899237A (Application No. CN202310920788.7A)
Authority
CN
China
Prior art keywords
game
virtual character
observing
user interface
graphical user
Prior art date
Legal status
Pending
Application number
CN202310920788.7A
Other languages
Chinese (zh)
Inventor
何俊乐
龙易杰
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202310920788.7A priority Critical patent/CN116899237A/en
Publication of CN116899237A publication Critical patent/CN116899237A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 - Providing additional services to players
    • A63F 13/87 - Communicating with other players during game play, e.g. by e-mail or chat
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 - Input arrangements for video game devices
    • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/53 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 - Controlling game characters or game objects based on the game progress
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a game interaction method, a game interaction device, a storage medium and electronic equipment, and relates to the field of computer technology. The game interaction method comprises the following steps: providing, at the graphical user interface, a character identification control for identifying the second virtual character; in response to a first operation on the character identification control for the second virtual character, controlling the game screen displayed in the graphical user interface to switch from a game screen obtained by observing the game scene from the first game view angle to a game screen obtained by observing the game scene from the second game view angle, the second game view angle being the game view angle corresponding to the second virtual character. By allowing the game scene to be observed from the view angles of other virtual characters, the method and the device facilitate more accurate communication of position information between game players.

Description

Game interaction method, game interaction device, computer readable storage medium and electronic equipment
Technical Field
The disclosure relates to the field of computer technology, and in particular relates to a game interaction method, a game interaction device, a computer readable storage medium and electronic equipment.
Background
In some games, information needs to be transferred between different players to support better team cooperation. In the related art, players generally exchange information through voice or text interaction. Because position information in a game scene is difficult to express accurately by voice or text, these interaction modes cannot meet players' needs for communicating position information. To address this, the related art also lets players transmit position information by placing marks on the game map; however, because the game map is separate from the game scene, this way of transmitting position information is neither accurate nor intuitive for teammates.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a game interaction method, a game interaction device, a computer-readable storage medium, and an electronic apparatus, so as to overcome, at least to some extent, the problem of poor interaction accuracy and intuitiveness in the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a game interaction method. A graphical user interface is provided by a first terminal device, the graphical user interface displaying a game screen obtained by observing a game scene from a first game view angle corresponding to a first virtual character, the game scene including the first virtual character and a second virtual character, and the first virtual character being controlled by the first terminal device. The method includes: providing, at the graphical user interface, a character identification control for identifying the second virtual character; in response to a first operation on the character identification control for the second virtual character, controlling the game screen displayed in the graphical user interface to switch from the game screen obtained by observing the game scene from the first game view angle to a game screen obtained by observing the game scene from a second game view angle, the second game view angle being the game view angle corresponding to the second virtual character.
According to a second aspect of the present disclosure, there is provided a game interaction apparatus. A graphical user interface is provided by a first terminal device, the graphical user interface displaying a game screen obtained by observing a game scene from a first game view angle corresponding to a first virtual character, the game scene including the first virtual character and a second virtual character, and the first virtual character being controlled by the first terminal device. The apparatus includes: an identification control providing module, configured to provide, at the graphical user interface, a character identification control for identifying the second virtual character; and a game screen switching module, configured to, in response to a first operation on the character identification control for the second virtual character, control the game screen displayed in the graphical user interface to switch from the game screen obtained by observing the game scene from the first game view angle to a game screen obtained by observing the game scene from a second game view angle, the second game view angle being the game view angle corresponding to the second virtual character.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above game interaction method and possible implementations thereof.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the game interaction method and possible implementations thereof.
The technical scheme of the present disclosure has the following beneficial effects:
in the game interaction process, a character identification control for identifying the second virtual character is provided on the graphical user interface; in response to a first operation on the character identification control for the second virtual character, the game screen displayed in the graphical user interface is controlled to switch from the game screen obtained by observing the game scene from the first game view angle to a game screen obtained by observing the game scene from the second game view angle, the second game view angle being the game view angle corresponding to the second virtual character. By switching the displayed game screen from the one obtained by observing the game scene from the first game view angle to the one obtained by observing the game scene from the second game view angle, the game scene can be observed from the view angles of other virtual characters, which enriches the interactive experience of game players and facilitates more accurate communication of position information between them.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely some embodiments of the present disclosure and that other drawings may be derived from these drawings without undue effort.
FIG. 1 illustrates a flow chart of one of the game interaction methods of the exemplary embodiments of the present disclosure;
FIG. 2 illustrates an interface schematic diagram of a role identification control in one of the exemplary embodiments of the present disclosure;
FIG. 3 illustrates a schematic diagram of position marking at a second game perspective in accordance with one of the exemplary embodiments of the present disclosure;
FIG. 4 illustrates a flowchart of one of the exemplary embodiments of the present disclosure for position marking at a teammate perspective;
FIG. 5 shows a block diagram of one of the game interaction devices of the present exemplary embodiment;
fig. 6 shows an electronic device for implementing one of the above-described game interaction methods according to the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
Herein, "first," "second," and the like are labels for specific objects, and do not limit the number or order of objects.
In the related art, the voice or text interaction mode cannot meet players' needs for communicating position information, and marking on the game map is neither accurate nor intuitive for teammates because the game map is separate from the game scene.
In view of one or more of the problems described above, exemplary embodiments of the present disclosure provide a game interaction method, a game interaction device, a computer-readable storage medium, and an electronic apparatus.
Optionally, the game interaction method can be applied to team combat games played from a first-person perspective.
In one embodiment of the present disclosure, the game interaction method may be executed on a local terminal device or a server. When the game interaction method runs on a server, the game interaction method can be realized and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an alternative embodiment, various cloud applications may run under the cloud interaction system, such as cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the cloud game operation mode, the main body that runs the game program and the main body that presents the game screen are separated: the storage and execution of the game interaction method are completed on the cloud game server, while the client device is used to receive and send data and to present the game screen. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer or a palm computer; however, the device that performs the game interaction is the cloud game server in the cloud. When playing the game, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game screen, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game screen.
In an alternative embodiment, taking a game as an example, the local terminal device stores the game program and is used to present the game screen. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, it may be rendered on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game screen, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
The embodiment of the disclosure provides a game interaction method, wherein a graphical user interface is provided through first terminal equipment, the graphical user interface is displayed with a game picture obtained by observing a game scene through a first game view angle corresponding to a first virtual role, and the game scene comprises the first virtual role and a second virtual role.
The first virtual character and the second virtual character may be game objects of any form in the game scene, such as virtual persons, virtual animals or cartoon characters, and may belong to the same game camp. The first virtual character may be controlled by the first terminal device and the second virtual character may be controlled by a second terminal device.
The game scene may be a simulation scene of the real world, a semi-simulation and semi-fictional three-dimensional scene, or a pure fictional three-dimensional scene, which is not particularly limited in this disclosure.
The first terminal device may provide a graphical user interface. In an initial or default state, the graphical user interface may display a game screen obtained by observing the game scene from the first game view angle, that is, the game screen presented when the game scene is observed through the camera model corresponding to the first virtual character. The first game view angle is the game view angle corresponding to the first virtual character.
The second terminal device may also provide a graphical user interface. In an initial or default state, this graphical user interface may display a game screen obtained by observing the game scene from the second game view angle, that is, the game screen presented when the game scene is observed through the camera model corresponding to the second virtual character. The second game view angle is the game view angle corresponding to the second virtual character.
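Purely as an illustrative sketch (the class names, fields and values below are assumptions for illustration and are not part of the disclosure), the correspondence between a virtual character, its camera model and its game view angle can be pictured as follows:

```python
from dataclasses import dataclass

@dataclass
class CameraModel:
    # Camera bound to one virtual character; it defines that character's game view angle.
    position: tuple   # camera position in the game scene
    yaw: float        # horizontal view angle in degrees
    pitch: float      # vertical view angle in degrees

@dataclass
class VirtualCharacter:
    character_id: str    # e.g. "first", "A", "B", "C"
    camera: CameraModel  # the camera model corresponding to this character

def render_from(character: VirtualCharacter) -> str:
    # Stand-in for the engine's renderer: the game screen shown in the graphical
    # user interface is whatever this character's camera model observes.
    return f"game screen observed from the view angle of character {character.character_id}"

first_character = VirtualCharacter("first", CameraModel((0.0, 0.0, 1.7), yaw=0.0, pitch=0.0))
second_character = VirtualCharacter("A", CameraModel((12.0, 5.0, 1.7), yaw=90.0, pitch=-10.0))
print(render_from(first_character))   # default screen on the first terminal device
print(render_from(second_character))  # default screen on the second terminal device
```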
As shown in fig. 1, a flow chart of a game interaction method is provided, which specifically includes the following steps S110 to S120:
Step S110, providing a character identification control for identifying the second virtual character in the graphical user interface;
step S120, in response to a first operation on the character identification control for the second virtual character, controlling the game screen displayed in the graphical user interface to switch from the game screen obtained by observing the game scene from the first game view angle to the game screen obtained by observing the game scene from the second game view angle.
In the game interaction process, the game screen displayed in the graphical user interface is switched from the game screen obtained by observing the game scene from the first game view angle to the game screen obtained by observing the game scene from the second game view angle. The game scene can thus be observed from the view angles of other virtual characters, which enriches the interactive experience of game players and facilitates more accurate communication of position information between them.
Each step of fig. 1 is described in detail below.
In step S110, a character identification control is provided at the graphical user interface for identifying the second virtual character.
There may be multiple second virtual characters in the game scene, and in order to distinguish the character identification controls of different virtual characters, information such as numbers of the virtual characters in the game camp, virtual character names and the like may be displayed on the character identification controls.
For example, as shown in fig. 2, an interface schematic diagram of the character identification controls is provided. Different virtual character numbers represent different virtual characters, and character identification controls corresponding to three virtual characters, namely a second virtual character A, a second virtual character B and a second virtual character C, are displayed in the graphical user interface. In addition, a character identification control corresponding to the first virtual character may also be displayed in the graphical user interface, which is not specifically limited by the present disclosure. Further, to avoid obscuring the game screen, these character identification controls may be displayed in the upper left corner of the graphical user interface.
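A minimal sketch of how such a control bar might be assembled is given below; the field names and layout hint are assumptions for illustration, not requirements of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class CharacterIdControl:
    character_id: str   # which virtual character this control identifies
    number: int         # the character's number within the game camp
    name: str           # the character's display name
    alive: bool = True  # drives the gray-out treatment described below

def build_control_bar(teammates) -> list:
    # One character identification control per teammate; the bar can be laid out,
    # for example, in the upper left corner so it does not obscure the game screen.
    return [CharacterIdControl(t["id"], t["number"], t["name"]) for t in teammates]

controls = build_control_bar([
    {"id": "A", "number": 1, "name": "second virtual character A"},
    {"id": "B", "number": 2, "name": "second virtual character B"},
    {"id": "C", "number": 3, "name": "second virtual character C"},
])
print([c.number for c in controls])  # -> [1, 2, 3]
```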
In step S120, in response to a first operation on the character identification control for the second virtual character, the game screen displayed in the graphical user interface is controlled to switch from the game screen obtained by observing the game scene from the first game view angle to the game screen obtained by observing the game scene from the second game view angle.
Wherein the first operation of the character identification control for the second virtual character refers to a view angle switching operation for the second virtual character.
In an alternative embodiment, step S120 may be implemented as follows: in response to a touch operation on the character identification control for the second virtual character, the game screen displayed in the graphical user interface is controlled to switch from the game screen obtained by observing the game scene from the first game view angle to the game screen obtained by observing the game scene from the second game view angle; while the touch operation remains in effect, the game screen obtained by observing the game scene from the second game view angle continues to be displayed in the graphical user interface.
Using the character identification control of the virtual character in the graphical user interface to switch the game screen saves interface space and provides an operation basis for the further position marking operation described below.
Optionally, if a certain second virtual character is in a non-survival state, the character identification control of that second virtual character may be grayed out to prompt the player that the character is not alive, and no response is made to touch operations applied to the grayed-out character identification control, so as to avoid meaningless response processing and reduce processing overhead.
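A rough sketch of this touch handling, including the view angle viewing prompt described below, is shown here; the dictionary keys and the prompt wording are assumptions.

```python
def on_control_touch_down(gui: dict, control: dict) -> None:
    # Grayed-out control (the character is not alive): ignore the touch so that
    # no meaningless response processing is performed.
    if not control.get("alive", True):
        return
    # Otherwise switch the displayed game screen to the touched teammate's view
    # angle for as long as the touch operation remains in effect, and show a
    # view angle viewing prompt so the player knows the switch succeeded.
    gui["active_view"] = control["character_id"]
    gui["prompt"] = (f"the game view angle screen of the second virtual character "
                     f"{control['character_id']} is currently being viewed")

gui = {"active_view": "first", "prompt": ""}
on_control_touch_down(gui, {"character_id": "A", "alive": True})
print(gui["active_view"], "|", gui["prompt"])
on_control_touch_down(gui, {"character_id": "B", "alive": False})  # grayed out: no effect
```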
Optionally, after the game screen obtained by observing the game scene from the second game view angle is displayed, view angle viewing prompt information may be generated in the graphical user interface to indicate to the game player whether the game screen switch was executed successfully.
As shown in fig. 2, by touching the character identification control 201 of the second virtual character A, the game screen obtained by observing the game scene from the game view angle of the second virtual character A may be displayed in the graphical user interface, and view angle viewing prompt information 202 reading "the game view angle screen of the second virtual character A is currently being viewed" is generated.
In an alternative embodiment, in a case where the graphical user interface displays the game screen obtained by observing the game scene from the second game view angle, in response to the touch operation ending, the game screen displayed in the graphical user interface may be controlled to switch back to the game screen obtained by observing the game scene from the first game view angle.
Restoring the game screen to its default state when the touch operation ends allows the player to continue playing without any additional operation; the operation is simple, convenient and easy to implement.
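Continuing the same kind of sketch (assumed names), the restore-on-release behaviour could look like this:

```python
def on_control_touch_up(gui: dict) -> None:
    # When the touch operation ends, restore the default state: the game screen
    # obtained by observing the game scene from the first game view angle.
    gui["active_view"] = "first"
    gui["prompt"] = ""

gui = {"active_view": "A", "prompt": "viewing the second virtual character A"}
on_control_touch_up(gui)
print(gui["active_view"])  # -> "first"
```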
In an alternative embodiment, when the game screen obtained by observing the game scene from the second game view angle is displayed in the graphical user interface, in response to a sliding operation from the character identification control of that second virtual character to the character identification control of another second virtual character, the game screen displayed in the graphical user interface may be controlled to switch to the game screen obtained by observing the game scene from the game view angle corresponding to the other second virtual character.
Switching the game screen through sliding operations between character identification controls lets the game player flexibly view the game screens from the game view angles of different teammate virtual characters as needed.
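A small sketch of this slide-to-switch behaviour, under the same assumed names as the earlier sketches:

```python
def on_slide_between_controls(gui: dict, target_control: dict) -> None:
    # Sliding from the currently held character identification control onto the
    # control of another second virtual character switches the displayed screen
    # to that other character's game view angle.
    if target_control.get("alive", True):
        gui["active_view"] = target_control["character_id"]

gui = {"active_view": "A"}
on_slide_between_controls(gui, {"character_id": "B", "alive": True})
print(gui["active_view"])  # -> "B"
```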
In an alternative embodiment, before the step of responding to the first operation on the character identification control for the second virtual character, the following step may further be performed: in response to a view angle adjustment operation acting on the graphical user interface, controlling the graphical user interface to switch to and display game screens obtained by observing the game scene from different first game view angles determined by the view angle adjustment operation. While the first operation remains in effect, the following step may also be performed: in response to a view angle adjustment operation acting on the graphical user interface, controlling the graphical user interface to switch to and display game screens obtained by observing the game scene from different second game view angles determined by the view angle adjustment operation.
The view angle adjustment operation is an operation that adjusts the viewing direction of the corresponding game view angle. For example, it may be a sliding operation in the up, down, left or right direction on the game screen.
It should be noted that the viewing angle adjustment operation may control only the game screen displayed in the graphical user interface of the first terminal device, and does not affect the game screen displayed in the second terminal device.
Through view angle adjustment, the player can look around the game scene where the teammate's virtual character is located in multiple directions, which facilitates more accurate communication of direction between game players. In addition, while the first operation remains in effect, the game screen obtained by observing the game scene from the second game view angle is displayed in the graphical user interface; performing the view angle adjustment operation in this state lets the game player place position marks as needed, providing greater flexibility.
In the practical application process, in order to conform to the operation habit of the player, the player may perform the touch operation on the character identification control of the second virtual character by one hand, and perform the viewing angle adjustment operation by the other hand, and adjust the game screen by the cooperation of the left hand and the right hand.
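A minimal sketch of such a view angle adjustment, assuming a simple drag-distance-to-angle mapping (the sensitivity value and field names are assumptions):

```python
def apply_view_adjustment(local_camera: dict, drag_dx: float, drag_dy: float,
                          sensitivity: float = 0.1) -> None:
    # A sliding operation on the game screen adjusts the view angle of whichever
    # game view angle is currently displayed on THIS terminal only; the game
    # screen on the second terminal device is not affected.
    local_camera["yaw"] = (local_camera["yaw"] + drag_dx * sensitivity) % 360.0
    local_camera["pitch"] = max(-89.0, min(89.0, local_camera["pitch"] - drag_dy * sensitivity))

camera = {"yaw": 0.0, "pitch": 0.0}      # local copy of the displayed view angle
apply_view_adjustment(camera, drag_dx=120.0, drag_dy=-40.0)
print(camera)  # -> {'yaw': 12.0, 'pitch': 4.0}
```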
In an alternative embodiment, when the game screen obtained by observing the game scene from the second game view angle is displayed in the graphical user interface, first position mark information may be generated in response to a second operation on the character identification control for the second virtual character; the first position mark information is then transmitted to the second terminal device controlling the second virtual character, so that the second terminal device generates a first position mark based on the first position mark information.
The second operation on the character identification control for the second virtual character refers to a position marking operation for the second virtual character. The second operation may be, for example, a sliding operation from the character identification control into the game screen. The first position mark information refers to the position mark information generated by the first terminal device, and may include, but is not limited to: the position of the first position mark in the game scene (e.g. azimuth coordinates), the marker corresponding to the first position mark (e.g. the first virtual character), and the marked party corresponding to the first position mark (e.g. the second virtual character A).
After the first terminal device generates the first position mark information, it may transmit the information to the second terminal device controlling the second virtual character, so that the second terminal device generates the first position mark in its game screen based on the first position mark information. By placing position marks from the view angle of a teammate's virtual character, the teammate can clearly and accurately identify the marked point from his or her own view angle, which improves the accuracy and intuitiveness of marking during interaction, allows targeted display of the position mark, and avoids information interference to other, unrelated virtual characters.
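A sketch of the first position mark information and its transmission, with the listed fields as a record; the field names, serialization and the network stand-in are assumptions, since the disclosure does not specify a transport.

```python
import json

def make_first_position_mark_info(scene_position, marker_id, marked_id) -> dict:
    # Fields follow the information listed above: the mark's position in the game
    # scene, the marker, and the marked party.
    return {
        "scene_position": scene_position,  # e.g. azimuth coordinates in the game scene
        "marker": marker_id,               # e.g. the first virtual character
        "marked_party": marked_id,         # e.g. the second virtual character A
    }

def send_to_second_terminal(mark_info: dict) -> bytes:
    # Stand-in for the real network layer: serialize the first position mark
    # information and transmit it to the second terminal device, which then
    # generates the first position mark locally.
    return json.dumps(mark_info).encode("utf-8")

payload = send_to_second_terminal(
    make_first_position_mark_info((103.5, 42.0, 0.0), marker_id="first", marked_id="A"))
print(payload)
```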
In an alternative embodiment, generating the first position mark information in response to the second operation on the character identification control for the second virtual character may be implemented as follows: in response to a sliding operation from the character identification control of the second virtual character into the game screen obtained by observing the game scene from the second game view angle, the first position mark information is generated according to the end position of the sliding operation in that game screen.
As shown in fig. 3, which provides a schematic diagram of position marking from the second game view angle, when the game screen obtained by observing the game scene from the game view angle of the second virtual character A is displayed in the graphical user interface of the first terminal device, the touch point 301 of the touch operation acting on the character identification control of the second virtual character A may be slid into the game screen, and the first position mark information may be generated according to the end position 302 of the sliding operation in the game screen.
Because the marking operation is carried out from the view angle of the teammate's virtual character, the teammate can clearly and accurately identify the marked point from his or her own view angle, which enhances the effectiveness of information transmission.
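The disclosure only states that the end position of the sliding operation in the game screen is used; one possible way to map that screen position into the game scene, given here purely as an assumed illustration rather than the claimed mechanism, is a ray-to-ground intersection from the displayed camera:

```python
def ray_ground_intersection(origin, direction, ground_z=0.0):
    # One simple way, assumed here for illustration, to obtain the mapping position
    # of the sliding operation's end position in the game scene: cast a ray from the
    # displayed camera through the screen point (deriving the ray direction from
    # pixel coordinates requires the camera projection, omitted here) and intersect
    # it with a ground plane at height ground_z.
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:
        return None          # ray parallel to the ground: no valid mapping position
    t = (ground_z - oz) / dz
    if t <= 0:
        return None          # ground plane is behind the camera
    return (ox + t * dx, oy + t * dy, ground_z)

mapped = ray_ground_intersection(origin=(12.0, 5.0, 1.7), direction=(0.6, 0.2, -0.3))
print(mapped)  # mapping position used to generate the first position mark information
```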
In an alternative embodiment, generating the first position mark information according to the end position of the sliding operation in the game screen obtained by observing the game scene from the second game view angle may further be implemented as follows: determining a first distance based on the mapping position of the end position in the game scene and the position of the first virtual character in the game scene, and generating the first position mark information if the first distance is smaller than a first preset distance threshold; or determining a second distance based on the mapping position of the end position in the game scene and the position of the second virtual character in the game scene, and generating the first position mark information if the second distance is smaller than a second preset distance threshold.
The mapping position refers to the position in the game scene to which the end position of the sliding operation is mapped. The first distance refers to the distance between the mapping position and the first virtual character; the first preset distance threshold refers to a markable range preset for the marker. The second distance refers to the distance between the mapping position and the second virtual character; the second preset distance threshold refers to a mark display range preset for the marked party.
By comparing the first distance with the first preset distance threshold, the first position mark information is generated when the first distance is smaller than the first preset distance threshold; when the first distance is greater than or equal to the first preset distance threshold, the current mark can be treated as an invalid mark and invalid-mark prompt information is generated, so that the game player can only place marks within a certain range.
Likewise, by comparing the second distance with the second preset distance threshold, the first position mark information is generated when the second distance is smaller than the second preset distance threshold; when the second distance is greater than or equal to the second preset distance threshold, the current mark can be treated as an invalid mark and invalid-mark prompt information is generated, so that the position mark is displayed near the marked party and the difficulty of recognizing a position mark that is too far from the marked party is avoided.
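A sketch of the two alternative validity checks described above; the threshold values of 50.0 and 30.0 and the function names are illustrative assumptions.

```python
import math

# Alternative 1: the mapping position must fall inside a preset markable range
# around the marker (the first virtual character).
def mark_valid_for_marker(mapped_pos, first_char_pos, first_threshold=50.0) -> bool:
    return math.dist(mapped_pos, first_char_pos) < first_threshold

# Alternative 2: the mapping position must fall inside a preset display range
# around the marked party (the second virtual character), so the mark is not
# placed too far from that character to recognize.
def mark_valid_for_marked_party(mapped_pos, second_char_pos, second_threshold=30.0) -> bool:
    return math.dist(mapped_pos, second_char_pos) < second_threshold

mapped = (15.4, 6.1, 0.0)
if mark_valid_for_marked_party(mapped, (12.0, 5.0, 0.0)):
    print("generate the first position mark information")
else:
    print("invalid mark: generate invalid-mark prompt information")
```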
As shown in fig. 4, a flowchart of position marking from a teammate's view angle is provided. The graphical user interface provides a character identification control for identifying the second virtual character and, by default, displays the game screen obtained by observing the game scene from the first game view angle corresponding to the first virtual character. The process specifically includes the following steps:
Step S401, in response to a touch operation on the character identification control for the second virtual character, controlling the game screen displayed in the graphical user interface to switch from the game screen obtained by observing the game scene from the first game view angle to the game screen obtained by observing the game scene from the second game view angle;
step S402, in response to a view angle adjustment operation acting on the graphical user interface while the touch operation remains in effect, controlling the graphical user interface to switch to and display game screens obtained by observing the game scene from different second game view angles determined by the view angle adjustment operation;
step S403, in response to a sliding operation from the character identification control of the second virtual character into the game screen obtained by observing the game scene from the second game view angle, generating first position mark information according to the end position of the sliding operation in that game screen;
step S404, transmitting the first position mark information to the second terminal device controlling the second virtual character, so that the second terminal device generates the first position mark based on the first position mark information.
It can be understood that, in actual application, the graphical user interface of the second terminal device displays by default the game screen obtained by observing the game scene from the second game view angle corresponding to the second virtual character. In response to a first operation on the character identification control for the first virtual character, the second terminal device may control its displayed game screen to switch from the game screen obtained by observing the game scene from the second game view angle to the game screen obtained by observing the game scene from the first game view angle. When the game screen obtained by observing the game scene from the first game view angle is displayed, the second terminal device may generate second position mark information in response to a second operation on the character identification control for the first virtual character, and transmit the second position mark information to the first terminal device controlling the first virtual character.
In an alternative embodiment, the first terminal device may further perform the following steps: receiving second position mark information, where the second position mark information is generated by the second terminal device controlling the second virtual character in response to a second operation on the character identification control for the first virtual character; and generating, according to the second position mark information, a second position mark in the game screen obtained by observing the game scene from the first game view angle.
The second position mark information may include, but is not limited to: the position of the second position mark in the game scene (e.g. azimuth coordinates), the marker corresponding to the second position mark (e.g. the second virtual character B), and the marked party of the second position mark (e.g. the first virtual character).
The position mark information is synchronized to the corresponding terminal equipment, so that a game player can accurately grasp the corresponding mark position in time.
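On the receiving side, a sketch of how the first terminal device might store an incoming second position mark is given below; the field names and payload format are assumptions consistent with the earlier sketch.

```python
import json

def on_receive_second_mark_info(payload: bytes, gui: dict) -> None:
    # The first terminal device receives the second position mark information and
    # generates a second position mark in the game screen obtained by observing
    # the game scene from the first game view angle.
    info = json.loads(payload.decode("utf-8"))
    gui.setdefault("position_marks", []).append({
        "scene_position": tuple(info["scene_position"]),  # where to draw the mark
        "marker": info["marker"],                          # e.g. the second virtual character B
    })

gui = {}
incoming = json.dumps({"scene_position": [40.0, -7.5, 0.0], "marker": "B"}).encode("utf-8")
on_receive_second_mark_info(incoming, gui)
print(gui["position_marks"])
```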
In an alternative embodiment, after generating the second location mark, the first terminal device may further perform the following steps: the second position marker is cleared in response to controlling the first avatar to reach the second position marker.
Illustratively, when the first virtual character is controlled to reach the second position mark, the second position mark has fulfilled its marking function; clearing it at this time avoids interference from stale information.
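A sketch of this clearing step; the arrival tolerance (reach_radius) is an assumed value, not specified by the disclosure.

```python
import math

def clear_reached_marks(first_char_pos, gui: dict, reach_radius: float = 2.0) -> None:
    # Once the first virtual character is controlled to reach a second position
    # mark, the mark has served its purpose and is cleared to avoid interference
    # from stale information.
    gui["position_marks"] = [
        mark for mark in gui.get("position_marks", [])
        if math.dist(first_char_pos, mark["scene_position"]) > reach_radius
    ]

gui = {"position_marks": [{"scene_position": (40.0, -7.5, 0.0), "marker": "B"}]}
clear_reached_marks((39.5, -7.0, 0.0), gui)
print(gui["position_marks"])  # -> [] once the mark has been reached
```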
In an alternative embodiment, after generating the second location mark, the first terminal device may further perform the following steps: and determining the role information corresponding to the second virtual role, and displaying the role information corresponding to the second virtual role at the second position mark.
The character information may include, but is not limited to, the virtual character's number, name, attribute color and the like. For example, the character number corresponding to the second virtual character may be displayed at the second position mark to help the game player determine which character the position mark comes from.
Optionally, after the first terminal device receives the second position mark information, mark prompt information may also be generated in the graphical user interface based on the second position mark information, so that the game player knows which character placed the second position mark.
Optionally, after the first terminal device generates the second position mark, a mark guiding instruction and/or position distance information may be generated in the graphical user interface of the first terminal device according to a position corresponding to the second position mark in the game scene and a position corresponding to the first virtual character in the game scene, so that the game player can quickly identify and reach the mark point.
Optionally, during the display of the second position mark, if new second position mark information sent by the second terminal device controlling the second virtual character is received again, the original second position mark displayed in the graphical user interface may be cleared, and according to the new second position mark information, a new second position mark is generated in a game picture obtained by observing the game scene at the first game viewing angle, so as to update the position mark in real time.
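A sketch of the guidance indication and real-time update behaviour described in the two paragraphs above; the bearing-and-distance form of the guidance and the update-by-marker rule are assumptions for illustration.

```python
import math

def mark_guidance(first_char_pos, mark_pos) -> dict:
    # An assumed form of the mark guidance indication: bearing and distance from
    # the first virtual character to the second position mark, so the player can
    # quickly locate and reach the marked point.
    dx, dy = mark_pos[0] - first_char_pos[0], mark_pos[1] - first_char_pos[1]
    return {"bearing_deg": round(math.degrees(math.atan2(dy, dx)) % 360.0, 1),
            "distance": round(math.hypot(dx, dy), 1)}

def update_second_mark(gui: dict, new_mark: dict) -> None:
    # If new second position mark information arrives while a mark from the same
    # character is still displayed, the old mark is cleared and replaced, so the
    # position mark is updated in real time.
    gui["position_marks"] = [m for m in gui.get("position_marks", [])
                             if m["marker"] != new_mark["marker"]]
    gui["position_marks"].append(new_mark)

print(mark_guidance((0.0, 0.0, 0.0), (40.0, -7.5, 0.0)))  # -> {'bearing_deg': 349.4, 'distance': 40.7}
```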
Fig. 5 illustrates a game interaction device 500 in an exemplary embodiment of the present disclosure, where a graphical user interface is provided through a first terminal device, and the graphical user interface displays a game screen obtained by observing a game scene through a first game view angle corresponding to a first virtual character, where the game scene includes the first virtual character and a second virtual character, and the first virtual character is controlled through the first terminal device. As shown in fig. 5, the game interaction device 500 may include:
an identification control providing module 510 for providing a character identification control for identifying the second virtual character at the graphical user interface;
a game screen switching module 520 for controlling switching of a game screen obtained by observing a game scene through a first game viewing angle, which is displayed in the graphical user interface, to a game screen obtained by observing a game scene through a second game viewing angle, in response to a first operation of the character identification control for the second virtual character; the second game view angle is a game view angle corresponding to the second virtual character.
In an alternative embodiment, based on the foregoing, prior to the step of responding to the first operation of the character identification control for the second virtual character, the game interaction device 500 may further include: the first visual angle adjusting module is used for responding to the visual angle adjusting operation acted on the graphical user interface and controlling the graphical user interface to switch and display game pictures obtained by observing the game scene at different first game visual angles determined based on the visual angle adjusting operation; during the validation of the first operation, the game interaction device 500 may further include: and the second visual angle adjusting module is used for responding to the visual angle adjusting operation acted on the graphical user interface and controlling the graphical user interface to switch and display the game pictures obtained by observing the game scene at different second game visual angles determined based on the visual angle adjusting operation.
In an alternative embodiment, based on the foregoing, the game interaction device 500 may further include: the mark information generation module is used for responding to the second operation of the character identification control aiming at the second virtual character when the game picture obtained by observing the game scene through the second game visual angle is displayed in the graphic user interface, and generating first position mark information; and the mark information sending module is used for sending the first position mark information to the second terminal equipment controlling the second virtual character so that the second terminal equipment generates the first position mark based on the first position mark information.
In an alternative embodiment, based on the foregoing scheme, the tag information generating module may be configured to: in response to a sliding operation of a game screen obtained from the character identification control of the second virtual character to the second game view angle viewing the game scene, first position mark information is generated according to an end position of the sliding operation in the game screen obtained from the second game view angle viewing the game scene.
In an alternative embodiment, based on the foregoing, the mark information generating module generates first position mark information according to an end position in a game screen obtained by observing a game scene at a second game view angle according to a sliding operation, including: determining a first distance based on a mapping position of the ending position in the game scene and a position of the first virtual character in the game scene, and generating first position mark information if the first distance is smaller than a first preset distance threshold; or determining a second distance based on the mapping position of the ending position in the game scene and the position of the second virtual character in the game scene, and generating the first position mark information if the second distance is smaller than a second preset distance threshold value.
In an alternative embodiment, based on the foregoing, the game interaction device 500 may further include: the mark information receiving module is used for receiving second position mark information, wherein the second position mark information is generated by a second terminal device for controlling a second virtual character through responding to a second operation of a character identification control for the first virtual character; and the position mark generation module is used for generating a second position mark in a game picture obtained by observing the game scene at the first game view angle according to the second position mark information.
In an alternative embodiment, after generating the second position mark based on the foregoing, the game interaction device 500 may further include: and the mark clearing module is used for clearing the second position mark in response to controlling the first virtual character to reach the second position mark.
In an alternative embodiment, after generating the second position mark based on the foregoing, the game interaction device 500 may further include: and the role information display module is used for determining the role information corresponding to the second virtual role and displaying the role information corresponding to the second virtual role at the second position mark.
In an alternative embodiment, based on the foregoing, the game screen switching module 520 may be configured to: in response to a touch operation of the character identification control for the second virtual character, controlling to switch a game picture obtained by observing the game scene through the first game view angle displayed in the graphical user interface to a game picture obtained by observing the game scene through the second game view angle; and continuously displaying a game picture obtained by observing the game scene through the second game visual angle in the graphical user interface during the effective period of the touch operation.
In an alternative embodiment, based on the foregoing, the game interaction device 500 may further include: and the game picture restoring module is used for controlling to switch the game picture obtained by observing the game scene through the second game visual angle, which is displayed in the graphical user interface, to display the game picture obtained by observing the game scene through the first game visual angle in response to the touch operation.
In an alternative embodiment, based on the foregoing, the game interaction device 500 may further include: and the game picture switching sub-module is used for responding to the sliding operation from the character identification control of the second virtual character to the character identification control of the other second virtual character when the game picture obtained by observing the game scene through the second game visual angle is displayed in the graphic user interface, and controlling to switch the game picture obtained by observing the game scene through the second game visual angle displayed in the graphic user interface to the game picture obtained by observing the game scene through the game visual angle corresponding to the other second virtual character.
The specific details of the above modules in the game interaction device 500 are already described in the method section, and the details that are not disclosed may refer to the embodiment of the method section, so that they will not be described in detail.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the game interaction method described above in the present specification. In some possible implementations, aspects of the present disclosure may also be implemented in the form of a program product comprising program code for causing an electronic device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on an electronic device.
The program product may employ a portable compact disc read-only memory (CD-ROM) and comprise program code and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF (Radio Frequency) and the like, or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
The exemplary embodiment of the disclosure also provides an electronic device capable of implementing the game interaction method. An electronic device 600 according to such an exemplary embodiment of the present disclosure is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 6, the electronic device 600 may be embodied in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different system components (including the memory unit 620 and the processing unit 610), and a display unit 640.
The storage unit 620 stores program codes that can be executed by the processing unit 610, so that the processing unit 610 performs the steps according to various exemplary embodiments of the present disclosure described in the above "exemplary method" section of the present specification.
In particular, a program product stored on a computer readable storage medium may cause an electronic device to perform the steps of:
providing a character identification control at the graphical user interface for identifying the second virtual character;
in response to a first operation of the character identification control for the second virtual character, controlling to switch a game screen displayed in the graphical user interface, which is obtained by observing the game scene at the first game view angle, to display a game screen obtained by observing the game scene at the second game view angle; the second game view angle is a game view angle corresponding to the second virtual character.
In an alternative embodiment, based on the foregoing, the following steps may be further performed prior to the step of responding to the first operation of the character identification control for the second virtual character: controlling to switch and display game pictures obtained by observing game scenes at different first game view angles determined based on the view angle adjustment operation on the graphical user interface in response to the view angle adjustment operation acting on the graphical user interface; during the validation of the first operation, the following steps may also be performed: and controlling to switch and display a game picture obtained by observing the game scene at the different second game view angles determined based on the view angle adjustment operation on the graphical user interface in response to the view angle adjustment operation acting on the graphical user interface.
In an alternative embodiment, based on the foregoing, the following steps may also be performed: when a game picture obtained by observing a game scene through a second game view angle is displayed in the graphical user interface, responding to a second operation of a character identification control aiming at a second virtual character, and generating first position mark information; the first position mark information is transmitted to a second terminal device controlling the second virtual character, so that the second terminal device generates a first position mark based on the first position mark information.
In an alternative embodiment, based on the foregoing, generating the first position mark information in response to the second operation of the character identification control for the second virtual character may be accomplished as follows: in response to a sliding operation from the character identification control of the second virtual character to the game screen obtained by observing the game scene at the second game view angle, the first position mark information is generated according to the end position of the sliding operation in that game screen.
In an alternative embodiment, based on the foregoing, generating the first position mark information according to the end position of the sliding operation in the game screen obtained by observing the game scene at the second game view angle may be achieved as follows: determining a first distance based on the mapping position of the end position in the game scene and the position of the first virtual character in the game scene, and generating the first position mark information if the first distance is smaller than a first preset distance threshold; or determining a second distance based on the mapping position of the end position in the game scene and the position of the second virtual character in the game scene, and generating the first position mark information if the second distance is smaller than a second preset distance threshold.
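The distance checks described above could be sketched as follows. The mapping from the end position on the game screen to a position in the game scene is assumed to be provided by the engine (screen_to_scene), and the threshold values are placeholders; none of these names or values come from the disclosure.

```python
import math

def try_generate_mark_info(end_screen_pos, screen_to_scene,
                           first_char_pos, second_char_pos,
                           first_threshold=3.0, second_threshold=3.0):
    """Generate first position mark information only if the slide ends near either character."""
    mapped_pos = screen_to_scene(end_screen_pos)   # map the end position into the game scene

    if math.dist(mapped_pos, first_char_pos) < first_threshold:
        return {"scene_position": list(mapped_pos), "anchored_near": "first_virtual_character"}
    if math.dist(mapped_pos, second_char_pos) < second_threshold:
        return {"scene_position": list(mapped_pos), "anchored_near": "second_virtual_character"}
    return None   # slide ended too far from both characters; no mark information is generated
```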
In an alternative embodiment, based on the foregoing, the following steps may also be performed: receiving second position mark information, wherein the second position mark information is generated by the second terminal device controlling the second virtual character in response to a second operation of a character identification control for the first virtual character; and generating a second position mark in the game screen obtained by observing the game scene at the first game view angle according to the second position mark information.
In an alternative embodiment, after the second position mark is generated based on the foregoing scheme, the following step may be further performed: clearing the second position mark in response to controlling the first virtual character to reach the second position mark.
In an alternative embodiment, after the second position mark is generated based on the foregoing scheme, the following step may be further performed: determining character information corresponding to the second virtual character, and displaying the character information corresponding to the second virtual character at the second position mark.
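On the receiving side, the behaviour described in the last three paragraphs could look roughly like the sketch below: place a second position mark from the received information, label it with the sending character's information, and clear it once the first virtual character reaches it. The clear radius, field names, and attributes are assumptions for illustration only.

```python
import math

class PositionMarkReceiver:
    """Sketch of handling second position mark information on the first terminal device."""

    def __init__(self, first_character, clear_radius=1.5):
        self.first_character = first_character
        self.clear_radius = clear_radius      # assumed "reached the mark" radius
        self.active_marks = []

    def on_mark_info_received(self, mark_info, sender_character):
        """Generate a second position mark in the first game view angle's screen."""
        self.active_marks.append({
            "position": mark_info["scene_position"],
            "label": sender_character.display_name,   # character information shown at the mark
        })

    def update(self):
        """Clear any mark the first virtual character has reached."""
        here = self.first_character.scene_position
        self.active_marks = [
            m for m in self.active_marks
            if math.dist(here, m["position"]) > self.clear_radius
        ]
```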
In an alternative embodiment, based on the foregoing, the step of controlling, in response to the first operation of the character identification control for the second virtual character, switching the game screen displayed in the graphical user interface from a game screen obtained by observing the game scene at the first game view angle to a game screen obtained by observing the game scene at the second game view angle may be implemented as follows: in response to a touch operation of the character identification control for the second virtual character, controlling to switch the game screen displayed in the graphical user interface from a game screen obtained by observing the game scene at the first game view angle to a game screen obtained by observing the game scene at the second game view angle; and continuously displaying the game screen obtained by observing the game scene at the second game view angle in the graphical user interface while the touch operation remains in effect.
In an alternative embodiment, based on the foregoing, the following step may also be performed: in response to the touch operation ending, controlling to switch the game screen displayed in the graphical user interface from the game screen obtained by observing the game scene at the second game view angle back to the game screen obtained by observing the game scene at the first game view angle.
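A short illustrative sketch of the hold-to-view behaviour in the last two paragraphs, reusing the hypothetical SpectateController from earlier; the event type strings are assumptions and the real touch handling would depend on the platform.

```python
def on_identification_control_touch(controller, event, second_character):
    """Keep the second game view angle only while the touch operation lasts."""
    if event.type == "touch_down":
        controller.switch_view_to(second_character)
    elif event.type == "touch_up":
        # The touch operation has ended: revert to the first character's own view angle.
        controller.switch_view_to(controller.first_character)
    # While the touch is held, no further event fires and the second view simply stays active.
```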
In an alternative embodiment, based on the foregoing, the following step may also be performed: when the game screen obtained by observing the game scene at the second game view angle is displayed in the graphical user interface, in response to a sliding operation from the character identification control of that second virtual character to the character identification control of another second virtual character, controlling to switch the game screen displayed in the graphical user interface to a game screen obtained by observing the game scene at the game view angle corresponding to the other second virtual character.
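Finally, the view hop between two teammates' identification controls could be sketched as below, again with hypothetical names and no claim that this is the actual implementation.

```python
def on_slide_between_identification_controls(controller, from_character, to_character):
    """Slide from one second virtual character's control onto another's to hop views."""
    if controller.active_view_character is from_character:
        controller.switch_view_to(to_character)
```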
In the above game interaction process, the game screen displayed in the graphical user interface is switched from a game screen obtained by observing the game scene at the first game view angle to a game screen obtained by observing the game scene at the second game view angle, so that the game scene can be observed from the view angles of other virtual characters. This enriches the interaction experience of game players and facilitates more accurate communication of positions among game players.
The storage unit 620 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 621 and/or cache memory 622, and may further include Read Only Memory (ROM) 623.
The storage unit 620 may also include a program/utility 624 having a set (at least one) of program modules 625, such program modules 625 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 630 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, Bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 600, and/or any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 650. Also, electronic device 600 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 660. As shown in fig. 6, network adapter 660 communicates with other modules of electronic device 600 over bus 630. It should be appreciated that although not shown in fig. 6, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID (Redundant Array of Independent Disks) systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to perform the method according to the exemplary embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. A game interaction method, characterized in that a graphical user interface is provided through a first terminal device, the graphical user interface displays a game picture obtained by observing a game scene through a first game view angle corresponding to a first virtual character, the game scene comprises the first virtual character and a second virtual character, the first virtual character is controlled by the first terminal device, the method comprises:
Providing a character identification control at the graphical user interface for identifying the second virtual character;
responsive to a first operation of a character identification control for the second virtual character, controlling to switch the game screen displayed in the graphical user interface from a game screen obtained by observing the game scene through the first game view angle to a game screen obtained by observing the game scene through a second game view angle;
the second game view angle is a game view angle corresponding to the second virtual character.
2. The method of claim 1, wherein prior to the step of responding to the first operation of the character identification control for the second virtual character, the method further comprises:
in response to a view angle adjustment operation acting on the graphical user interface, controlling the graphical user interface to switch and display a game picture obtained by observing the game scene at different first game view angles determined based on the view angle adjustment operation;
while the first operation is in effect, the method further comprises:
and in response to the view angle adjustment operation acting on the graphical user interface, controlling the graphical user interface to switch and display a game picture obtained by observing the game scene at different second game view angles determined based on the view angle adjustment operation.
3. The method according to claim 1, wherein the method further comprises:
generating first position mark information in response to a second operation of a character identification control for the second virtual character when a game picture obtained by observing the game scene through a second game view angle is displayed in the graphical user interface;
and sending the first position mark information to second terminal equipment controlling the second virtual character so that the second terminal equipment generates a first position mark based on the first position mark information.
4. The method according to claim 3, wherein the generating first position mark information in response to a second operation of a character identification control for the second virtual character comprises:
in response to a sliding operation from the character identification control of the second virtual character to the game picture obtained by observing the game scene at the second game view angle, generating first position mark information according to an end position of the sliding operation in the game picture obtained by observing the game scene at the second game view angle.
5. The method according to claim 4, wherein the generating the first position mark information according to the end position of the sliding operation in the game screen obtained by observing the game scene at the second game angle includes:
determining a first distance based on a mapping position of the end position in the game scene and a position of the first virtual character in the game scene, and generating the first position mark information if the first distance is smaller than a first preset distance threshold; or
determining a second distance based on the mapping position of the end position in the game scene and a position of the second virtual character in the game scene, and generating the first position mark information if the second distance is smaller than a second preset distance threshold.
6. The method according to claim 1, wherein the method further comprises:
receiving second position mark information, wherein the second position mark information is generated by a second terminal device controlling the second virtual character in response to a second operation of a character identification control for the first virtual character;
and generating a second position mark in a game picture obtained by observing the game scene at the first game view angle according to the second position mark information.
7. The method of claim 6, wherein after generating the second position mark, the method further comprises:
clearing the second position mark in response to controlling the first virtual character to reach the second position mark.
8. The method of claim 6, wherein after generating the second position mark, the method further comprises:
determining character information corresponding to the second virtual character, and displaying the character information corresponding to the second virtual character at the second position mark.
9. The method of claim 1, wherein the controlling, in response to the first operation of the character identification control for the second virtual character, switching of the game screen displayed in the graphical user interface from a game screen obtained by observing the game scene through the first game view angle to a game screen obtained by observing the game scene through a second game view angle comprises:
in response to a touch operation of a character identification control for the second virtual character, controlling to switch a game picture obtained by observing the game scene through the first game view angle, which is displayed in the graphical user interface, to a game picture obtained by observing the game scene through a second game view angle;
and continuously displaying a game picture obtained by observing the game scene through the second game view angle in the graphical user interface while the touch operation remains in effect.
10. The method according to claim 9, wherein the method further comprises:
and controlling to switch a game screen obtained by observing the game scene through the second game view angle, which is displayed in the graphical user interface, to display a game screen obtained by observing the game scene through the first game view angle in response to the touch operation ending.
11. The method according to claim 1, wherein the method further comprises:
and when a game picture obtained by observing the game scene through a second game view angle is displayed in the graphical user interface, in response to a sliding operation from the character identification control of the second virtual character to the character identification control of another second virtual character, controlling to switch the game picture displayed in the graphical user interface to a game picture obtained by observing the game scene through the game view angle corresponding to the other second virtual character.
12. A game interaction device, characterized in that a graphical user interface is provided through a first terminal device, the graphical user interface displays a game picture obtained by observing a game scene through a first game view angle corresponding to a first virtual character, the game scene includes the first virtual character and a second virtual character, the first virtual character is controlled by the first terminal device, the device includes:
An identification control providing module, configured to provide, in the graphical user interface, a character identification control for identifying the second virtual character;
a game screen switching module, configured to control, in response to a first operation of a character identification control for the second virtual character, switching the game screen displayed in the graphical user interface from a game screen obtained by observing the game scene through the first game view angle to a game screen obtained by observing the game scene through a second game view angle;
the second game view angle is a game view angle corresponding to the second virtual character.
13. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any one of claims 1 to 11.
14. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which when executed by the one or more processors cause the one or more processors to implement the method of any of claims 1 to 11.
CN202310920788.7A 2023-07-25 2023-07-25 Game interaction method, game interaction device, computer readable storage medium and electronic equipment Pending CN116899237A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310920788.7A CN116899237A (en) 2023-07-25 2023-07-25 Game interaction method, game interaction device, computer readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310920788.7A CN116899237A (en) 2023-07-25 2023-07-25 Game interaction method, game interaction device, computer readable storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116899237A true CN116899237A (en) 2023-10-20

Family

ID=88352908

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310920788.7A Pending CN116899237A (en) 2023-07-25 2023-07-25 Game interaction method, game interaction device, computer readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116899237A (en)

Similar Documents

Publication Publication Date Title
CN109091861B (en) Interactive control method in game, electronic device and storage medium
CN108287657B (en) Skill applying method and device, storage medium and electronic equipment
US10984595B2 (en) Method and apparatus for providing guidance in a virtual environment
KR20170048137A (en) Method for transmitting media contents, apparatus for transmitting media contents, method for receiving media contents, apparatus for receiving media contents
CN113965807A (en) Message pushing method, device, terminal, server and storage medium
WO2022156504A1 (en) Mark processing method and apparatus, and computer device, storage medium and program product
US20200341541A1 (en) Simulated reality cross platform system
CN111481923B (en) Rocker display method and device, computer storage medium and electronic equipment
CN112619133A (en) Game picture display method, device, equipment and storage medium
CN116899237A (en) Game interaction method, game interaction device, computer readable storage medium and electronic equipment
CN114935973A (en) Interactive processing method, device, equipment and storage medium
CN116615271A (en) System and method for accurate positioning using touch screen gestures
KR102676846B1 (en) Operation method for dome display in a metaverse environment
CN117899467A (en) Game interaction method, game interaction device, computer readable storage medium and electronic equipment
CN117379786A (en) Game interaction method, game interaction device, computer readable storage medium and electronic equipment
CN113663330B (en) Game role control method, game role control device, storage medium and electronic equipment
US20230377248A1 (en) Display control method and apparatus, terminal, and storage medium
US20240046588A1 (en) Virtual reality-based control method, apparatus, terminal, and storage medium
US20240175717A1 (en) Information processing method, information processing apparatus, and program
KR20230103135A (en) Operation method for dome display in a metaverse environment
CN111973984A (en) Coordinate control method and device for virtual scene, electronic equipment and storage medium
CN117572994A (en) Virtual object display processing method, device, equipment and medium
US20240028130A1 (en) Object movement control method, apparatus, and device
CN115970283A (en) Game control method, device, storage medium and electronic equipment
CN116999798A (en) Interaction method, device, product, medium and equipment for game

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination