CN113559520A - Interactive control method and device in game, electronic equipment and readable storage medium - Google Patents

Interactive control method and device in game, electronic equipment and readable storage medium Download PDF

Info

Publication number
CN113559520A
CN113559520A (application number CN202110850549.XA)
Authority
CN
China
Prior art keywords
game
virtual character
interactive
sliding operation
chat window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110850549.XA
Other languages
Chinese (zh)
Inventor
陶欣怡
胡志鹏
程龙
刘勇成
袁思思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110850549.XA priority Critical patent/CN113559520A/en
Publication of CN113559520A publication Critical patent/CN113559520A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85 Providing additional services to players
    • A63F13/87 Communicating with other players during game play, e.g. by e-mail or chat
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/795 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interactive control method and device in a game, an electronic device, and a readable storage medium. A graphical user interface displaying a game interface is provided through a first terminal, and the game interface displays a chat window and part or all of a game scene; a second virtual character controlled by a second terminal exists in the game scene. In response to a trigger operation for the second virtual character, the second virtual character is controlled to enter an interactive response state, and a first interactive visual special effect is displayed. When the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, a second interactive visual special effect is displayed when the first sliding operation slides into an area associated with the chat window. If the first sliding operation ends in the area associated with the chat window, the chat window is set as a window for message intercommunication with the second terminal. In this way, a private chat can be quickly initiated with the second virtual character from the game interface, reducing the steps of the private-chat initiating operation.

Description

Interactive control method and device in game, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of game interaction technologies, and in particular, to an interaction control method and apparatus in a game, an electronic device, and a readable storage medium.
Background
With the development of science and technology, more and more types of games have appeared in people's lives. Game types include many kinds, such as card games, RTS (Real-Time Strategy) games, MOBA (Multiplayer Online Battle Arena) games, MMORPG (Massively Multiplayer Online Role-Playing Game) games, and so on.
At present, in most MMO games, although a chat window for the player to chat is always displayed in the main interface, the chat window of the team channel or world channel to which the player belongs is usually displayed by default. If the player wants to initiate a private chat with a nearby virtual character, the chat window needs to be opened through a complicated operation; if the player is in the middle of a game at this time, he or she cannot talk with other virtual characters in a timely manner, which affects operation efficiency during the game.
Disclosure of Invention
In view of this, an object of the present application is to provide an interactive control method and device in a game, an electronic device, and a readable storage medium, which can quickly initiate a private chat with a second virtual character in a game scene displayed on a graphical user interface, reduce the operation steps of initiating a private chat during a game, and simplify game operation.
An embodiment of the present application provides an interactive control method in a game. A graphical user interface is provided through a first terminal; the graphical user interface comprises a game interface, and the content displayed by the game interface comprises a chat window and part or all of a game scene. A first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The method comprises the following steps:
in response to a trigger operation for the second virtual character, controlling the second virtual character to enter an interactive response state, and displaying a first interactive visual special effect;
when the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, displaying a second interactive visual special effect when the first sliding operation slides into the area associated with the chat window; and
in response to the first sliding operation ending in the area associated with the chat window, setting the chat window as a window for message intercommunication with the second terminal.
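The three steps above can be sketched as a small controller. This is a minimal, hypothetical illustration only: all class and method names are invented for this sketch and do not come from the patent, and the "visual special effects" are reduced to state changes.

```python
# Hypothetical sketch of the claimed flow: a trigger operation puts the second
# character into an interactive response state; if the subsequent slide ends
# inside the region associated with the chat window, the chat window is rebound
# to private messaging with that character's terminal; otherwise the state is
# simply exited. All names here are illustrative, not from the patent.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


@dataclass
class ChatWindow:
    region: Rect
    channel: str = "world"  # default: team/world channel


class InteractionController:
    def __init__(self, chat: ChatWindow) -> None:
        self.chat = chat
        self.responsive_target: Optional[str] = None  # interactive response state

    def on_trigger(self, character_id: str) -> None:
        # Trigger operation (e.g. long press): enter the interactive response
        # state; a real client would also show the first visual special effect.
        self.responsive_target = character_id

    def on_slide_end(self, x: float, y: float) -> bool:
        # First sliding operation ends: hit-test against the chat-window region.
        ok = self.responsive_target is not None and self.chat.region.contains(x, y)
        if ok:
            self.chat.channel = f"private:{self.responsive_target}"
        # Either way, exit the interactive response state (hide the effect).
        self.responsive_target = None
        return ok
```

A slide released anywhere else leaves the chat window on its default channel, matching the embodiment in which the interactive response state is exited when the slide ends outside the associated area.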
In one possible embodiment, the trigger operation is a long-press operation for the second virtual character, and the first sliding operation is a sliding operation continuous with the long-press operation.
In a possible embodiment, the triggering operation is a click operation for the second virtual character.
In one possible embodiment, the first sliding operation is a sliding operation starting from a display position of the second virtual character.
In one possible embodiment, the method further comprises:
executing a second game action in response to a second sliding operation starting from outside the display position of the second virtual character, the second game action including any one of the following actions:
controlling the second virtual character to move in the game scene; or
adjusting the display viewing angle of the game scene.
In one possible embodiment, the method further comprises:
when the second virtual character is not in the interactive response state, responding to a third sliding operation starting from the display position of the second virtual character, and executing a third game behavior, wherein the third game behavior comprises any one of the following behaviors:
controlling the second virtual character to move in the game scene; or
adjusting the display viewing angle of the game scene.
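The embodiments above distinguish gestures by where the slide starts and whether the target is in the interactive response state. A hypothetical routing rule (names are illustrative, not from the patent) might look like:

```python
# Only a slide that starts on the second character's display position while
# that character is in the interactive response state drives the private-chat
# drag (first sliding operation); any other slide falls through to ordinary
# game behaviour, i.e. moving a character or adjusting the viewing angle
# (second and third sliding operations). Illustrative sketch only.
def route_slide(starts_on_character: bool, in_responsive_state: bool) -> str:
    if starts_on_character and in_responsive_state:
        return "private_chat_drag"      # first sliding operation
    if starts_on_character:
        return "move_or_adjust_view"    # third sliding operation
    return "move_or_adjust_view"        # second sliding operation
```

This keeps normal movement and camera control fully available whenever the private-chat gesture is not armed.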
In one possible embodiment, the displaying of the first interactive visual special effect comprises: displaying the first interactive visual special effect in an area associated with the position where the trigger operation acts;
the first interactive visual special effect comprises at least one of: a graphic identifier and a text identifier.
In one possible implementation, the displaying the second interactive visual effect includes:
displaying the chat window as the second interactive visual special effect.
In a possible implementation manner, after the setting the chat window as the window for message interworking with the second terminal, the method further includes:
and controlling the second virtual character to exit the interactive response state and hiding the first interactive visual special effect.
In one possible embodiment, the method further comprises:
and responding to a first sliding operation aiming at the second virtual character, generating an interactive icon, and controlling the interactive icon to move along with the first sliding operation.
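The interactive icon in this embodiment simply tracks the drag. A minimal, hypothetical sketch (class and method names are invented for illustration):

```python
# The icon is spawned when the first sliding operation starts and repositioned
# on every move event so that it visually follows the finger or pointer.
class DragIcon:
    def __init__(self, x: float, y: float) -> None:
        self.pos = (x, y)   # spawn at the slide's starting point

    def follow(self, x: float, y: float) -> None:
        self.pos = (x, y)   # track each touch-move event
```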
In one possible embodiment, the method further comprises:
and controlling the second virtual character to exit the interactive response state and hide the first interactive visual special effect in response to the first sliding operation ending in the area outside the area associated with the chat window.
An embodiment of the present application further provides an interactive control device in a game. A graphical user interface is provided through a first terminal; the graphical user interface comprises a game interface, and the content displayed by the game interface comprises a chat window and part or all of a game scene. A first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The device comprises:
the state response module is used for responding to the triggering operation aiming at the second virtual role, controlling the second virtual role to enter an interactive response state and displaying a first interactive visual special effect;
the effect display module is used for responding to a first sliding operation aiming at the second virtual character when the second virtual character is in the interactive response state, and displaying a second interactive visual special effect when the first sliding operation slides to the area associated with the chat window;
and the window setting module is used for setting the chat window as a window for message intercommunication with the second terminal in response to the first sliding operation ending in the area associated with the chat window.
In one possible embodiment, the trigger operation is a long-press operation for the second virtual character, and the first sliding operation is a sliding operation continuous with the long-press operation.
In a possible embodiment, the triggering operation is a click operation for the second virtual character.
In one possible embodiment, the first sliding operation is a sliding operation starting from a display position of the second virtual character.
In a possible implementation, the interactive control device further includes a first adjusting module, and the first adjusting module is configured to:
executing a second game action in response to a second sliding operation starting from outside the display position of the second virtual character, the second game action including any one of the following actions:
controlling the second virtual character to move in the game scene; or
adjusting the display viewing angle of the game scene.
In a possible implementation, the interactive control device further includes a second adjusting module, and the second adjusting module is configured to:
when the second virtual character is not in the interactive response state, responding to a third sliding operation starting from the display position of the second virtual character, and executing a third game behavior, wherein the third game behavior comprises any one of the following behaviors:
controlling the second virtual character to move in the game scene; or
adjusting the display viewing angle of the game scene.
In one possible implementation, when the status response module is configured to display the first interactive visual effect, the status response module is configured to:
displaying the first interactive visual special effect in an area associated with the position where the trigger operation acts;
the first interactive visual special effect comprises at least one of: a graphic identifier and a text identifier.
In one possible implementation, the window setting module, when configured to display the second interactive visual effect, is configured to:
displaying the chat window as the second interactive visual special effect.
In a possible implementation manner, the interactive control device further includes a first state exit module, and the first state exit module is configured to:
and controlling the second virtual character to exit the interactive response state and hiding the first interactive visual special effect.
In a possible implementation manner, the interaction control apparatus further includes an icon generation module, and the icon generation module is configured to:
and responding to a first sliding operation aiming at the second virtual character, generating an interactive icon, and controlling the interactive icon to move along with the first sliding operation.
In a possible implementation manner, the interactive control device further includes a second state exit module, and the second state exit module is configured to:
and controlling the second virtual character to exit the interactive response state and hide the first interactive visual special effect in response to the first sliding operation ending in the area outside the area associated with the chat window.
An embodiment of the present application further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions being executable by the processor to perform the steps of the method of interactive control in a game as described above.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the interactive control method in a game described above.
According to the interactive control method and device in a game, the electronic device, and the readable storage medium provided by the present application, a graphical user interface is provided through a first terminal; the graphical user interface comprises a game interface, and the content displayed by the game interface comprises a chat window and part or all of a game scene; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The method comprises: in response to a trigger operation for the second virtual character, controlling the second virtual character to enter an interactive response state, and displaying a first interactive visual special effect; when the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, displaying a second interactive visual special effect when the first sliding operation slides into the area associated with the chat window; and in response to the first sliding operation ending in the area associated with the chat window, setting the chat window as a window for message intercommunication with the second terminal. In this way, through the trigger operation and the sliding operation for the second virtual character in the game scene displayed on the graphical user interface, a private chat can be quickly initiated with the second virtual character, the operation steps of initiating a private chat can be reduced, and game operation is simplified.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
FIG. 1a is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 1b is a second schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 1c is a third schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 1d is a fourth schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 2 is a flowchart of an interaction control method in a game according to an embodiment of the present disclosure;
FIG. 3 is a first schematic display diagram of a chat window provided in an embodiment of the present application;
FIG. 4 is a second schematic display diagram of a chat window provided in an embodiment of the present application;
FIG. 5 is a first schematic structural diagram of an interactive control device in a game according to an embodiment of the present application;
FIG. 6 is a second schematic structural diagram of an interactive control device in a game according to an embodiment of the present application;
FIG. 7 is a third schematic structural diagram of an interactive control device in a game according to an embodiment of the present application;
FIG. 8 is a fourth schematic structural diagram of an interactive control device in a game according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. Every other embodiment that can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present application falls within the protection scope of the present application.
Virtual object:
Refers to a dynamic object that can be controlled in a virtual scene. Optionally, the dynamic object may be a virtual character, a virtual animal, an animation character, or the like. The virtual object may be a character controlled by a player through an input device, an Artificial Intelligence (AI) configured through training in a virtual-environment match, or a Non-Player Character (NPC) configured in a virtual-scene match. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a match is preset or dynamically determined according to the number of clients participating in the match, which is not limited in the embodiments of the present application. In one possible implementation, the user can control the virtual object to move in the virtual scene, e.g., to run, jump, or crawl, and can also control the virtual object to fight other virtual objects using the skills, virtual props, and the like provided by the application.
And (3) game scene:
the interface is provided or displayed through a graphical user interface, and the interface comprises a UI interface and a game picture for a player to interact. In alternative embodiments, game controls (e.g., skill controls, movement controls, functionality controls, etc.), indicators (e.g., directional indicators, character indicators, etc.), information presentation areas (e.g., number of clicks, game play time, etc.), or game setting controls (e.g., system settings, stores, coins, etc.) may be included in the UI interface. In an optional embodiment, the game screen is a display screen corresponding to a virtual scene displayed by the terminal device, and the game screen may include virtual objects such as a game character, an NPC character, and an AI character that execute a game logic in the virtual scene. Optionally, the game scene is a simulated environment of the real world, or a semi-simulated semi-fictional game environment, or a purely fictional game environment. The game scene is any one of a two-dimensional virtual scene and a three-dimensional virtual scene, and can be sky, land, sea and the like, wherein the land comprises environmental elements such as deserts, cities and the like. The game scene is a scene of a complete game logic of a virtual object such as user control.
The game scene in the present application refers to a scene involved in a game process, for example, in a process of executing a game task; rather than the scenario involved before the game starts, e.g., on the player's home page.
The method for controlling interaction in a game in one embodiment of the present disclosure may be executed on a terminal device or a server. The terminal device may be a local terminal device. When the in-game interaction control method runs on the server, the in-game interaction control method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an optional embodiment, various cloud applications may run under the cloud interaction system, for example, cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the running main body of the game program and the game-picture presenting main body are separated: the storage and running of the information processing method are completed on the cloud game server, and the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the terminal device performing the information processing is the cloud game server in the cloud. When playing a game, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game picture.
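The cloud-game round trip described above can be sketched as a tiny loop. This is a hypothetical illustration only: the byte-reversal "codec" is a stand-in for real video encoding/compression, and all names are invented.

```python
# Client sends an operation instruction -> cloud server runs the game and
# returns an "encoded" picture -> client decodes and outputs the picture.
def cloud_server_step(state: int, op: str) -> tuple:
    if op == "tick":
        state += 1                               # run the game per the instruction
    encoded = f"frame{state}".encode()[::-1]     # encode/compress the game picture
    return state, encoded


def client_render(encoded: bytes) -> str:
    return encoded[::-1].decode()                # decode and output the picture
```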
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores a game program and is used for presenting a game screen. The local terminal device is used for interacting with the player through a graphical user interface, namely, a game program is downloaded and installed and operated through an electronic device conventionally. The manner in which the local terminal device provides the graphical user interface to the player may include a variety of ways, for example, it may be rendered for display on a display screen of the terminal or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game screen and a processor for running the game, generating the graphical user interface, and controlling display of the graphical user interface on the display screen.
It has been found through research that in most MMO games, although a chat window for the player to chat is always displayed in the main interface of the game scene during play, the chat window is by default set for the player to publish information to the team channel or the world channel, and the player can check information published to those channels by other players in the chat window. If the player inputs information in the chat window at this time, the information is published to the team channel to which the player belongs or to the world channel of the game. As shown in FIG. 1a, which is a first schematic diagram of a graphical user interface provided in an embodiment of the present application, a chat window 12, a first virtual character 13, and a second virtual character 14 are displayed in the graphical user interface 11; if the player inputs information in the chat window 12, the information is published to the player's team channel or the game's world channel.
Because the player is currently in the course of the game, if the player wants to initiate a private chat with a virtual character nearby in the current game scene, the corresponding chat window must be opened through a cumbersome sequence of operations, as shown in fig. 1a to 1 d. First, as shown in fig. 1a, the player needs to click, with a finger, mouse or key, the second virtual character 14 displayed in the graphical user interface 11, so that the business card information 15 of the second virtual character 14 is displayed in the graphical user interface 11 (as shown in fig. 1b, the second graphical user interface schematic diagram provided by the embodiment of the present application). Then, as shown in fig. 1b, the player clicks the business card information 15 displayed in the graphical user interface 11, so that the interaction list 16 of the second virtual character 14 is displayed in the graphical user interface 11 (as shown in fig. 1c, the third graphical user interface schematic diagram provided by the embodiment of the present application). Finally, the player clicks the "send message" interaction control 17 in the interaction list 16 to open a private chat window 18 directed to the second terminal that controls the second virtual character 14 (as shown in fig. 1d, the fourth graphical user interface schematic diagram provided by the embodiment of the present application). Because the player is in the course of the game at this time, requiring multiple click operations before the player can converse with another virtual character reduces the player's communication efficiency during the game.
Therefore, the interaction control method in a game provided by the present application can reduce the operation steps required, during the game, to initiate a private chat with the second terminal used by the player to whom a second virtual character displayed in the game scene belongs, thereby reducing the operation difficulty of the game.
Referring to fig. 2, fig. 2 is a flowchart illustrating an interaction control method in a game according to an embodiment of the present disclosure. A graphical user interface is provided through a first terminal; the graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene. A first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene; the first virtual character can interact with the second virtual character in the game scene according to control commands issued by the first user. As shown in fig. 2, the interaction control method in a game provided in an embodiment of the present application includes:
S201, in response to a trigger operation for the second virtual character, controlling the second virtual character to enter an interactive response state, and displaying a first interactive visual special effect.
S202, when the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, displaying a second interactive visual special effect when the first sliding operation slides to an area associated with the chat window.
S203, in response to the first sliding operation ending in the area associated with the chat window, setting the chat window as a window for message interworking with the second terminal.
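Steps S201 to S203 above can be sketched as a small interaction controller. This is a minimal Python sketch under the assumption that the chat-window area is an axis-aligned rectangle; all class, method and effect names are illustrative and not from the patent.

```python
IDLE, RESPONDING = "idle", "interactive_response"

class InteractionController:
    def __init__(self, chat_area):
        self.chat_area = chat_area   # (x, y, width, height) of the associated area
        self.state = IDLE
        self.effects = []            # visual special effects currently displayed
        self.chat_target = None      # terminal the chat window interworks with

    def _in_chat_area(self, pos):
        x, y, w, h = self.chat_area
        px, py = pos
        return x <= px <= x + w and y <= py <= y + h

    def on_trigger(self):
        # S201: trigger operation -> enter response state, show first effect.
        self.state = RESPONDING
        self.effects.append("first_effect")

    def on_slide_move(self, pos):
        # S202: while responding, show second effect when the slide
        # reaches the area associated with the chat window.
        if self.state == RESPONDING and self._in_chat_area(pos):
            if "second_effect" not in self.effects:
                self.effects.append("second_effect")

    def on_slide_end(self, pos):
        # S203: slide ends inside the associated area -> retarget the window.
        if self.state == RESPONDING and self._in_chat_area(pos):
            self.chat_target = "second_terminal"

ctrl = InteractionController(chat_area=(0, 400, 300, 200))
ctrl.on_trigger()
ctrl.on_slide_move((100, 450))
ctrl.on_slide_end((100, 450))
print(ctrl.chat_target)  # -> second_terminal
```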
The terminal mentioned in the present application mainly refers to an intelligent device that displays a game picture and is capable of controlling and operating a virtual character; the terminal may include any one of the following devices: a smart phone, a tablet computer, a notebook computer, a desktop computer, and the like.
The graphical user interface refers to the interface of the game picture displayed on the display screen of the terminal. The game scene refers to the virtual game space bearing the virtual characters during the game (for example, while a virtual character executes a game task); in the game scene, a virtual character can move, release skills and perform other actions under the control of operation instructions issued by the user to the intelligent terminal. The game scene may include any one or more of the following elements: game background elements, game virtual character elements, game prop elements, and the like. The game picture refers to the partial picture of the virtual world observed from a specified viewing angle (for example, the viewing angle of the eyes of the first virtual character controlled by the first user), and this partial picture is presented on the graphical user interface.
The graphical user interface displays the game picture visible from the current viewing angle. The game scene is the scene used in the normal course of the game, and a large number of virtual characters controlled by different users (a first virtual character controlled by a first user and a second virtual character controlled by a second user) may exist in the game scene. Common game scenes include a battle scene, an interactive scene, and the like; a battle scene refers to a scene in which at least two virtual characters fight against each other, and an interactive scene refers to a scene in which at least two virtual characters converse, exchange articles, and the like. Of course, in some cases, both the first virtual character and the second virtual character may be controlled by the same user.
In the scheme provided by the present application, a player may apply the trigger operation or the sliding operation by pressing a touch key corresponding to that operation, or may issue a trigger operation or sliding operation instruction by pressing a preset key combination. Specifically, the player may touch the keys with a finger, a mouse, and the like; the preset key combination on the keyboard may also be set manually according to the player's needs, for example, using the ctrl key, the alt key, the a key, or other keys on the keyboard.
The chat window allows one virtual character to communicate with another virtual character through language and/or expressions during the game; that is, through the chat window, a virtual character can send text, voice or expressions to another virtual character to express its own viewpoint, attitude, and the like. An information input box is arranged in the chat window; the information input box is the position where the virtual character inputs text, voice or expressions.
In step S201, in response to a trigger operation applied, with a finger, mouse or key, to the second virtual character in the graphical user interface, the second virtual character selected by the trigger operation is controlled to enter the interactive response state.
Here, in order to let the user clearly know that the second virtual character displayed in the game scene has entered the interactive response state, a first interactive visual special effect is displayed in the game scene shown in the graphical user interface. The first interactive visual special effect may be displayed in the area associated with the position where the trigger operation acts, for example, around the touch position of the trigger operation, around the second virtual character targeted by the trigger operation, or superimposed on the second virtual character targeted by the trigger operation. The first interactive visual special effect may include at least one of: a graphic identifier and a text identifier, and may be used to prompt the player that the second virtual character is in the interactive response state at the current moment. It should be noted that, in addition to prompting the player that the second virtual character is in the interactive response state, the text identifier may also be used to prompt the player how, while that state holds, to initiate a private chat with the second terminal of the player to whom the second virtual character belongs. Specifically, the text identifier may be displayed floating within a preset range around the second virtual character, or displayed superimposed at any position in the graphical user interface; for example, the text identifier "drag to the chat window to start a private chat" may be displayed at any position in the graphical user interface.
As for the graphic identifier, it may be displayed floating within a preset range around the second virtual character in the graphical user interface, for example, a hand-shaped icon floating within a preset range around the second virtual character, or a graphic identifier floating at the touch position of the trigger operation; and/or a graphic identifier may be displayed superimposed on the second virtual character in the graphical user interface, for example, a light ring displayed below the second virtual character (an exclamation mark or the like may also be displayed around the second virtual character), or a light ring superimposed around the outer contour of the displayed second virtual character; a highly transparent copy of the second virtual character may also be displayed superimposed on the already displayed second virtual character, and so on.
In step S202, on the premise that the second virtual character has entered the interactive response state, a corresponding first game behavior is executed by applying a first sliding operation to the second virtual character in the game scene. Specifically, the sliding position of the first sliding operation is determined; when the sliding position indicates that the first sliding operation has slid to the area associated with the chat window, a second interactive visual special effect is displayed in the graphical user interface to prompt the player that, if the first sliding operation ends at this point, message interworking with the second terminal can be performed.
Here, the displayed second interactive visual special effect may include at least one of: a graphic identifier and a text identifier. The text identifier may be used to prompt the user that ending the first sliding operation at the current position will initiate a private chat with the second terminal of the player to whom the second virtual character belongs. Specifically, the text identifier may be displayed floating within a preset range around the second virtual character, or displayed superimposed at any position in the graphical user interface; for example, the text identifier "end the drag at the current position to initiate a private chat" may be displayed at any position in the graphical user interface.
As for the graphic identifier, it may be displayed floating within a preset range around the chat window in the graphical user interface, for example, a dialog bubble icon floating within a preset range around the chat window; and/or a graphic identifier may be displayed superimposed on the chat window displayed in the graphical user interface, for example, a light ring superimposed around the displayed chat window; alternatively, the chat window may be displayed enlarged, and so on.
Here, the second interactive visual special effect may also consist of displaying the chat window itself. For example, if no chat window for message interworking is displayed in the original graphical user interface, then when the first sliding operation slides to the area associated with the chat window, the chat window may be displayed in the graphical user interface as the second interactive visual special effect.
The chat window is provided for the player in the game scene displayed on the graphical user interface, and is used for sending messages to the virtual world constructed by the game or to the team to which the player belongs.
Here, whether the first sliding operation ends in the area associated with the chat window may be determined based on the touch position at the end of the first sliding operation: if the touch position at the end of the first sliding operation is within the area associated with the chat window, it may be determined that the first sliding operation ended in that area.
When the chat window was previously used by the player for a private chat with the terminal controlling another virtual character, if the first sliding operation ends in the area associated with the chat window, that chat window may be set as the window for message interworking with the second terminal of the player to whom the second virtual character belongs.
Here, in one embodiment, the trigger operation may be a long-press operation on the second virtual character, and the first sliding operation a sliding operation continuous with the long-press operation; that is, through one continuous operation, the player can select the second virtual character with whom a private chat is desired and achieve message interworking with the second terminal of the player to whom the second virtual character belongs.
In another embodiment, the first sliding operation is a sliding operation starting from the display position of the second virtual character. For example, after the second virtual character enters the interactive response state, the user may perform a second action, namely the first sliding operation, starting from the display position of the second virtual character, so that the chat window may begin to be directed to the second virtual character.
In one embodiment, the trigger operation is a click operation on the second virtual character; that is, the player first selects, through the click operation, the second virtual character with whom a private chat is desired, and then achieves message interworking with the second terminal of the player to whom the second virtual character belongs through the first sliding operation.
It should be noted that the click operation differs from the long-press operation in that its touch duration is much shorter.
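The distinction above is purely a matter of touch duration, which could be sketched as follows. The 400 ms threshold is an illustrative assumption; the patent does not specify a value.

```python
# Assumed threshold separating a click from a long press (illustrative only).
LONG_PRESS_THRESHOLD_MS = 400

def classify_touch(duration_ms: float) -> str:
    """Classify a touch as a click or a long press by its duration."""
    return "long_press" if duration_ms >= LONG_PRESS_THRESHOLD_MS else "click"

print(classify_touch(120))  # -> click
print(classify_touch(650))  # -> long_press
```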
The starting position of the first sliding operation is the touch position at the beginning of the first sliding operation. When the user applies a click operation or a sliding operation by touching the graphical user interface with a finger or the like, the position where the user's finger presses on the graphical user interface is the starting touch position of the first sliding operation; when the user applies the first sliding operation by pressing a preset key combination on the keyboard, the position of a touch icon (for example, a mouse icon) in the graphical user interface when the user presses the combination is the starting touch position of the first sliding operation.
In one embodiment, in order to prompt the user with the real-time touch position of the first sliding operation, the method further includes: in response to a first sliding operation for the second virtual character, generating an interactive icon, and controlling the interactive icon to move along with the first sliding operation.
In this step, during the sliding of the first sliding operation, an interactive icon representing the real-time position of the first sliding operation is generated in the graphical user interface, and the generated interactive icon is controlled to move along the sliding track of the first sliding operation, thereby prompting the user with the current touch position of the first sliding operation.
Specifically, when the user applies the first sliding operation by touching the graphical user interface with a finger or the like, the interactive icon moves along with the position of the finger; when the user applies the sliding operation by pressing the preset key combination on the keyboard, the interactive icon moves along with the position of the touch icon displayed in the graphical user interface.
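The icon-following behavior described above amounts to snapping the icon to each sampled pointer position, whichever input device produced it. A minimal sketch, with illustrative names not taken from the patent:

```python
class InteractiveIcon:
    """Sketch of an interactive icon that tracks the real-time
    position of a sliding operation (finger or key-driven cursor)."""

    def __init__(self):
        self.pos = None    # current display position of the icon
        self.trail = []    # sampled positions along the sliding track

    def follow(self, pointer_pos):
        # Snap the icon to the current pointer position and record it.
        self.pos = pointer_pos
        self.trail.append(pointer_pos)

icon = InteractiveIcon()
for p in [(10, 10), (40, 60), (120, 300)]:  # sampled slide positions
    icon.follow(p)
print(icon.pos)  # -> (120, 300)
```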
Here, the chat window may be displayed in the graphical user interface in a pop-up manner or in a slide-out manner, which is not limited herein.
The chat window may include a text input area, an expression selection control and a voice input control. When the text input area is clicked, a virtual keyboard for the player to input text may pop up; similarly, when the expression selection control is touched, an expression list may pop up for the player to select from; and when the voice input control is touched, a voice input prompt identifier pops up to prompt the player to input voice information.
As an example, as shown in fig. 3, which is a schematic display diagram of a chat window provided in an embodiment of the present application, part of a game scene, a first virtual character 3b, a second virtual character 3c in the game scene and a chat window 3d are displayed in a graphical user interface 3 a. In response to a trigger operation for the second virtual character 3c in the game scene, the second virtual character 3c is controlled to enter the interactive response state; at the same time, a first interactive visual special effect is displayed in the graphical user interface 3a, for example, an exclamation mark 3e around the second virtual character 3c, to inform the player that the second virtual character 3c has entered the interactive response state.
When the second virtual character 3c is in the interactive response state, in response to a first sliding operation applied to the second virtual character 3c, an interactive icon 3f is displayed around the second virtual character 3c (in fig. 3, the interactive icon 3f is shown as a highly transparent image of the second virtual character superimposed on the displayed image of the second virtual character 3 c; in other embodiments, the interactive icon may be another image or identifier, which is not limited herein), and during the sliding of the first sliding operation, the interactive icon 3f is controlled to move along the sliding track of the operation.
In response to the end of the applied first sliding operation, if the first sliding operation ends in the chat window associated area 3g, the chat window 3d is set, in the graphical user interface 3a, as the window for message interworking with the second terminal of the player to whom the second virtual character 3c belongs. (In this embodiment, the chat window associated area 3g coincides with the chat window 3d displayed in the graphical user interface 3 a; in other embodiments, the range occupied by the associated area 3g may deviate somewhat from that of the chat window 3d, for example, the associated area 3g may be slightly larger than the chat window 3d, as may be set according to the actual situation without limitation.) The display form of the chat window may be changed appropriately during the setting process, for example, the original display form is changed and the name of the second virtual character 3c is displayed in the chat window (as shown at 3g in fig. 3).
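The note that the associated area may be slightly larger than the chat window itself suggests a padded hit area. A minimal sketch, where the padding value of 20 pixels is an illustrative assumption:

```python
def associated_area(window_rect, padding=20):
    """Expand the chat window rectangle (x, y, w, h) by an assumed
    padding on every side to obtain the associated hit area."""
    x, y, w, h = window_rect
    return (x - padding, y - padding, w + 2 * padding, h + 2 * padding)

def contains(rect, pos):
    """Axis-aligned point-in-rectangle test."""
    x, y, w, h = rect
    px, py = pos
    return x <= px <= x + w and y <= py <= y + h

win = (100, 400, 300, 150)
area = associated_area(win)
# A point just outside the window still counts inside the associated area.
print(contains(win, (95, 405)), contains(area, (95, 405)))  # -> False True
```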
It should be noted that the display form of the original chat window may also remain unchanged during the setting process, as the situation requires.
In one embodiment, after setting the chat window as a window for message interworking with the second terminal, the method further includes: controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual special effect.
After the message interworking interaction, the player needs to continue controlling the virtual character in the game, so the displayed chat window needs to be closed. Once the player closes the chat window, the player can be considered to have chosen to end the message interworking and re-enter the game state; at this moment, the second virtual character can be controlled to exit the interactive response state, and the first interactive visual special effect displayed to the player is hidden at the same time, so as not to block the player's view during the game. Of course, in the embodiment of the present invention, the first virtual character may also be moved while the chat window is displayed; when movement of the first virtual character is detected, the chat window is automatically closed or hidden, so that limited screen resources can be fully utilized, the second virtual character is controlled to exit the interactive response state, and the first interactive visual special effect is hidden.
In one embodiment, the method further includes: in response to the first sliding operation ending in an area outside the area associated with the chat window, controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual special effect.
Here, if the player abandons the private chat for whatever reason while initiating it, then, with the second virtual character in the interactive response state, the player may control the second virtual character to exit the interactive response state by dragging the first sliding operation to an end in an area outside the area associated with the chat window; after the second virtual character exits the interactive response state, the first interactive visual special effect displayed in the graphical user interface is hidden.
As shown in fig. 4, which is a second schematic display diagram of a chat window provided in the embodiment of the present application, part of a game scene, a first virtual character 4b, a second virtual character 4c in the game scene and a chat window 4d are displayed in a graphical user interface 4 a. At this time, the second virtual character 4c in the game scene has already entered the interactive response state; accordingly, a first interactive visual special effect, namely an exclamation mark 4e, is displayed around the second virtual character 4c to indicate that the second virtual character 4c is in the interactive response state. Meanwhile, in response to the first sliding operation, an interactive icon 4f representing the sliding position of the first sliding operation is also displayed around the second virtual character 4c in the graphical user interface 4a; the interactive icon 4f moves along with the first sliding operation. When the position of the interactive icon 4f indicates that the end position of the first sliding operation lies in an area 4h outside the area 4g associated with the chat window, it is determined that the player has abandoned the message interworking; at this point, the second virtual character 4c can be controlled to exit the interactive response state, and the first interactive visual special effect, namely the exclamation mark identifier 4e, is hidden.
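The two possible outcomes when the first sliding operation ends — retargeting the chat window if it ends inside the associated area, or exiting the response state and hiding the effect otherwise — can be sketched as one end-of-slide handler. All names here are illustrative assumptions.

```python
def handle_slide_end(end_pos, chat_area, session):
    """Sketch: resolve the end of the first sliding operation.
    chat_area is the (x, y, w, h) rectangle associated with the chat window;
    session is a dict holding the illustrative interaction state."""
    x, y, w, h = chat_area
    px, py = end_pos
    if x <= px <= x + w and y <= py <= y + h:
        # Ended inside the associated area: interwork with the second terminal.
        session["chat_target"] = "second_terminal"
    else:
        # Ended elsewhere: the player abandoned the private chat.
        session["responding"] = False     # exit the interactive response state
        session["effects"].clear()        # hide the first interactive effect
    return session

s = {"responding": True, "effects": ["first_effect"], "chat_target": None}
handle_slide_end((10, 10), (0, 400, 300, 200), s)
print(s["responding"], s["chat_target"])  # -> False None
```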
In one embodiment, the method further includes: in response to a second sliding operation starting from outside the display position of the second virtual character, executing a second game behavior, the second game behavior including any one of the following: controlling the second virtual character to move in the game scene; or adjusting the display viewing angle of the game scene.
Here, the first sliding operation is a sliding operation applied on the display position of the second virtual character. If, during the game, a second sliding operation is applied outside the display position of the second virtual character, a corresponding second game behavior is executed in response to that second sliding operation starting from outside the display position of the second virtual character: for example, controlling the selected second virtual character to move in the game scene, or adjusting the display viewing angle of the game scene in the current graphical user interface. Adjusting the display viewing angle may mean switching from a first-person viewing angle to a third-person viewing angle, or adjusting the display angle of the game scene: for example, if the display angle is originally behind the second virtual character (so that the back image of the second virtual character is displayed in the graphical user interface), the display angle may be adjusted through the second sliding operation to the left of the second virtual character (displaying its left-side image), or to the front of the second virtual character (displaying its front image).
In one embodiment, the method further includes: when the second virtual character is not in the interactive response state, in response to a third sliding operation starting from the display position of the second virtual character, executing a third game behavior, the third game behavior including any one of the following: controlling the second virtual character to move in the game scene; or adjusting the display viewing angle of the game scene.
Here, the first sliding operation is a sliding operation applied to the display position of the second virtual character while the second virtual character is in the interactive response state. If the second virtual character in the game scene is not in the interactive response state, then when a third sliding operation starting from the display position of the second virtual character is applied during the game, a corresponding third game behavior is executed in response to that operation: for example, controlling the selected second virtual character to move in the game scene, or adjusting the display viewing angle of the game scene in the current graphical user interface. Adjusting the display viewing angle may mean switching from a first-person viewing angle to a third-person viewing angle, or adjusting the display angle of the game scene: for example, if the display angle is originally behind the second virtual character (so that the back image of the second virtual character is displayed in the graphical user interface), the display angle may be adjusted to the left of the second virtual character (displaying its left-side image), or to the front of the second virtual character (displaying its front image).
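The three sliding operations above differ only in where the slide starts and whether the second virtual character is in the interactive response state. A minimal dispatch sketch; the choice between moving the character and adjusting the view is an illustrative assumption, since the patent lists both as alternatives.

```python
def dispatch_slide(start_on_character: bool, responding: bool,
                   prefer_move: bool = True) -> str:
    """Sketch: route a sliding operation to the behavior described above.
    A slide starting on the character while it is in the interactive
    response state is the first sliding operation (the private-chat drag);
    any other slide is the second or third sliding operation and yields
    a game behavior (move the character, or adjust the viewing angle)."""
    if start_on_character and responding:
        return "first_sliding_operation"   # drag toward the chat window
    return "move_character" if prefer_move else "adjust_view_angle"

print(dispatch_slide(True, True))          # -> first_sliding_operation
print(dispatch_slide(False, True))         # -> move_character
print(dispatch_slide(True, False, False))  # -> adjust_view_angle
```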
According to the interaction control method in a game provided by the embodiment of the present application, a graphical user interface is provided through a first terminal; the graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene, in which a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist. In response to a trigger operation for the second virtual character, the second virtual character is controlled to enter an interactive response state and a first interactive visual special effect is displayed; when the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, a second interactive visual special effect is displayed when the first sliding operation slides to the area associated with the chat window; and in response to the first sliding operation ending in the area associated with the chat window, the chat window is set as a window for message interworking with the second terminal. Therefore, through the trigger operation and the sliding operation for the second virtual character in the game scene displayed on the graphical user interface, a private chat can be quickly initiated with the second virtual character, the operation steps for initiating a private chat can be reduced, and the operation difficulty of the game is simplified.
Please refer to fig. 5 to 8: fig. 5 is a first structural schematic diagram of an interaction control device in a game provided by the embodiment of the present application, fig. 6 is a second structural schematic diagram of the interaction control device, fig. 7 is a third structural schematic diagram of the interaction control device, and fig. 8 is a fourth structural schematic diagram of the interaction control device in a game provided by the embodiment of the present application.
As shown in fig. 5, the interaction control device 500 may be applied to a first terminal, and a graphical user interface is provided through the first terminal; the graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene. In the game scene, there are a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal. The interaction control device 500 includes:
a state response module 510, configured to, in response to a trigger operation for the second virtual character, control the second virtual character to enter an interactive response state, and display a first interactive visual special effect;
an effect display module 520, configured to respond to a first sliding operation for the second virtual character when the second virtual character is in the interaction response state, and display a second interaction visual special effect when the first sliding operation slides to an area associated with the chat window;
a window setting module 530, configured to set the chat window as a window for performing message interworking with the second terminal in response to that the first sliding operation is ended in the area associated with the chat window.
Further, as shown in fig. 6, the interactive control device 500 further includes a first adjusting module 540, and the first adjusting module 540 is configured to:
executing a second game action in response to a second sliding operation starting from outside the display position of the second virtual character, the second game action including any one of the following actions:
controlling the second virtual character to move in the game scene; or,
adjusting the display viewing angle of the game scene.
Further, as shown in fig. 6, the interactive control device 500 further includes a second adjusting module 550, where the second adjusting module 550 is configured to:
when the second virtual character is not in the interactive response state, execute a third game behavior in response to a third sliding operation starting from the display position of the second virtual character, the third game behavior including any one of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display perspective of the game scene.
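The two adjusting modules together describe how an ordinary camera/movement slide is told apart from the private-chat slide. A sketch of that dispatch, under the assumption that it depends only on where the slide starts and whether the response state is active (the function name and return labels are invented for illustration):

```python
def classify_slide(starts_on_second_character: bool, in_response_state: bool) -> str:
    """Decide which behavior a sliding operation drives.

    Only a slide that both starts on the second virtual character and occurs
    while the character is in the interactive response state becomes the
    first sliding operation (the private-chat drag); every other slide falls
    back to normal gameplay control.
    """
    if starts_on_second_character and in_response_state:
        return "first_sliding_operation"  # drag toward the chat window
    if starts_on_second_character:
        return "third_game_behavior"      # move character / adjust perspective
    return "second_game_action"           # move character / adjust perspective
```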
Further, as shown in fig. 7, the interactive control device 500 further includes a first state exit module 560, where the first state exit module 560 is configured to:
control the second virtual character to exit the interactive response state and hide the first interactive visual special effect.
Further, as shown in fig. 7, the interactive control device 500 further includes a second state exit module 570, where the second state exit module 570 is configured to:
control the second virtual character to exit the interactive response state and hide the first interactive visual special effect in response to the first sliding operation ending outside the area associated with the chat window.
Further, as shown in fig. 8, the interactive control device 500 further includes an icon generating module 580, where the icon generating module 580 is configured to:
generate an interactive icon in response to the first sliding operation for the second virtual character, and control the interactive icon to move along with the first sliding operation.
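The icon-tracking behavior of module 580, combined with the hit test against the area associated with the chat window, can be sketched as below. Modeling that area as an axis-aligned rectangle is an assumption for illustration; this application does not fix the area's shape.

```python
from typing import NamedTuple, Tuple


class Rect(NamedTuple):
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # Inclusive point-in-rectangle test.
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)


def track_icon(chat_area: Rect, touch_point: Tuple[float, float]) -> dict:
    """Pin the interactive icon to the current touch point and report whether
    the first sliding operation has entered the chat-window-associated area."""
    px, py = touch_point
    return {"icon_position": (px, py),
            "in_chat_area": chat_area.contains(px, py)}
```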
Further, the trigger operation is a long press operation for the second virtual character; the first sliding operation is a sliding operation continuous with the long press operation.
Further, the triggering operation is a click operation for the second virtual character.
Further, the first sliding operation is a sliding operation starting from the display position of the second virtual character.
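Distinguishing the long-press trigger (with its continuous slide) from a plain click can be sketched with two conventional thresholds. The 500 ms long-press duration and 10 px touch slop below are common mobile-UI defaults, not values taken from this application:

```python
def classify_trigger(press_duration_ms: float, movement_px: float,
                     long_press_ms: float = 500.0, slop_px: float = 10.0) -> str:
    """Classify a touch on the second virtual character.

    A press held past the long-press threshold without drifting beyond the
    touch slop is the long-press trigger; a shorter still press is a click;
    anything that moves first is treated as the start of a slide.
    """
    if movement_px > slop_px:
        return "slide"
    if press_duration_ms >= long_press_ms:
        return "long_press"
    return "click"
```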
Further, when displaying the first interactive visual special effect, the state response module 510 is configured to:
display the first interactive visual special effect in an area associated with the position where the trigger operation acts;
the first interactive visual special effect includes at least one of: a graphic identifier and a text identifier.
Further, when displaying the second interactive visual special effect, the effect display module 520 is configured to:
display the chat window as the second interactive visual special effect.
According to the interactive control device in the game described above, a graphical user interface is provided through the first terminal; the graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene. A first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The device controls the second virtual character to enter an interactive response state and displays a first interactive visual special effect in response to a trigger operation for the second virtual character; when the second virtual character is in the interactive response state, it responds to a first sliding operation for the second virtual character and displays a second interactive visual special effect when the first sliding operation slides to the area associated with the chat window; and it sets the chat window as a window for message exchange with the second terminal in response to the first sliding operation ending in that area. In this way, a private chat with the second virtual character can be initiated quickly in the game interface through a trigger operation and a sliding operation on the second virtual character in the game scene displayed on the graphical user interface, which reduces the operation steps for initiating a private chat and simplifies operation of the game.
Referring to fig. 9, fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 9, the electronic device 900 includes a processor 910, a memory 920, and a bus 930.
The memory 920 stores machine-readable instructions executable by the processor 910. When the electronic device 900 runs, the processor 910 communicates with the memory 920 through the bus 930, and when the machine-readable instructions are executed by the processor 910, the steps of the interactive control method in a game in the method embodiment shown in fig. 2 may be performed.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the interactive control method in a game in the method embodiment shown in fig. 2 may be performed.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application, or the portion thereof that substantially contributes over the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are only specific embodiments of the present application, used to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions of some technical features, within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments and shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. An interactive control method in a game is characterized in that a first terminal provides a graphical user interface, the graphical user interface comprises a game interface, and the content displayed by the game interface comprises a chat window and part or all of game scenes; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene; the method comprises the following steps:
in response to a trigger operation for the second virtual character, controlling the second virtual character to enter an interactive response state, and displaying a first interactive visual special effect;
when the second virtual character is in the interactive response state, responding to a first sliding operation for the second virtual character, and displaying a second interactive visual special effect when the first sliding operation slides to the area associated with the chat window; and
setting the chat window as a window for message exchange with the second terminal in response to the first sliding operation ending in the area associated with the chat window.
2. The interaction control method according to claim 1, wherein the trigger operation is a long press operation on the second virtual character, and the first sliding operation is a sliding operation continuous with the long press operation.
3. The interaction control method according to claim 1, wherein the trigger operation is a click operation on the second virtual character.
4. The interaction control method according to claim 1, wherein the first sliding operation is a sliding operation starting from the display position of the second virtual character.
5. The interaction control method of claim 4, wherein the method further comprises:
executing a second game action in response to a second sliding operation starting outside the display position of the second virtual character, the second game action comprising any one of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display perspective of the game scene.
6. The interaction control method of claim 4, wherein the method further comprises:
when the second virtual character is not in the interactive response state, executing a third game behavior in response to a third sliding operation starting from the display position of the second virtual character, the third game behavior comprising any one of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display perspective of the game scene.
7. The interaction control method according to claim 1, wherein displaying the first interactive visual special effect comprises: displaying the first interactive visual special effect in an area associated with the position where the trigger operation acts;
the first interactive visual special effect comprises at least one of: a graphic identifier and a text identifier.
8. The interactive control method of claim 1, wherein displaying the second interactive visual effect comprises:
displaying the chat window as the second interactive visual special effect.
9. The interaction control method according to claim 1, wherein after setting the chat window as a window for message exchange with the second terminal, the method further comprises:
controlling the second virtual character to exit the interactive response state and hiding the first interactive visual special effect.
10. The interaction control method according to claim 1, wherein the method further comprises:
generating an interactive icon in response to a first sliding operation for the second virtual character, and controlling the interactive icon to move along with the first sliding operation.
11. The interaction control method according to claim 1, wherein the method further comprises:
controlling the second virtual character to exit the interactive response state and hiding the first interactive visual special effect in response to the first sliding operation ending outside the area associated with the chat window.
12. An interactive control device in a game is characterized in that a first terminal provides a graphical user interface, the graphical user interface comprises a game interface, and the content displayed by the game interface comprises a chat window and part or all of game scenes; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene; the device comprises:
a state response module, configured to control the second virtual character to enter an interactive response state and display a first interactive visual special effect in response to a trigger operation for the second virtual character;
an effect display module, configured to, when the second virtual character is in the interactive response state, respond to a first sliding operation for the second virtual character and display a second interactive visual special effect when the first sliding operation slides to the area associated with the chat window; and
a window setting module, configured to set the chat window as a window for message exchange with the second terminal in response to the first sliding operation ending in the area associated with the chat window.
13. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is run, the machine-readable instructions when executed by the processor performing the steps of the in-game interaction control method of any of claims 1 to 11.
14. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the interaction control method in a game according to any one of claims 1 to 11.
CN202110850549.XA 2021-07-27 2021-07-27 Interactive control method and device in game, electronic equipment and readable storage medium Pending CN113559520A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110850549.XA CN113559520A (en) 2021-07-27 2021-07-27 Interactive control method and device in game, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110850549.XA CN113559520A (en) 2021-07-27 2021-07-27 Interactive control method and device in game, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN113559520A true CN113559520A (en) 2021-10-29

Family

ID=78167930

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110850549.XA Pending CN113559520A (en) 2021-07-27 2021-07-27 Interactive control method and device in game, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113559520A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114217711A (en) * 2021-12-08 2022-03-22 北京字跳网络技术有限公司 Virtual role control method, terminal, electronic device and storage medium
WO2024041270A1 (en) * 2022-08-23 2024-02-29 腾讯科技(深圳)有限公司 Interaction method and apparatus in virtual scene, device, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140298210A1 (en) * 2013-04-02 2014-10-02 Samsung Electronics Co., Ltd. Apparatus and method for providing private chat in group chat
CN104461299A (en) * 2014-12-05 2015-03-25 蓝信工场(北京)科技有限公司 Method and equipment for joining chat
CN106575196A (en) * 2014-07-31 2017-04-19 三星电子株式会社 Electronic device and method for displaying user interface thereof
CN110233742A (en) * 2018-03-06 2019-09-13 阿里巴巴集团控股有限公司 A kind of group's method for building up, system, terminal and server
CN110691027A (en) * 2019-08-29 2020-01-14 维沃移动通信有限公司 Information processing method and device, electronic equipment and medium
CN111298436A (en) * 2020-02-26 2020-06-19 网易(杭州)网络有限公司 Message sending method and device in game
CN112346636A (en) * 2020-11-05 2021-02-09 网易(杭州)网络有限公司 In-game information processing method and device and terminal
KR20210019358A (en) * 2019-08-12 2021-02-22 주식회사 엔씨소프트 Appartus and method for providing user interface


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Baidu user 83D86FA: "How do you private chat with someone in QQ Sanguo (QQ Three Kingdoms)?", pages 1, Retrieved from the Internet <URL:https://zhidao.baidu.com/question/377355014.html?fr=iks&word=QQ%E4%B8%89%E5%9B%BD+%E7%A7%81%E8%81%8A&dyTabStr=MCwxLDMsMiw0LDYsNyw4LDUsOQ==> *


Similar Documents

Publication Publication Date Title
CN111729306A (en) Game character transmission method, device, electronic equipment and storage medium
CN111913624B (en) Interaction method and device for objects in virtual scene
CN111298449B (en) Control method and device in game, computer equipment and storage medium
CN111265872B (en) Virtual object control method, device, terminal and storage medium
CN112416196B (en) Virtual object control method, device, equipment and computer readable storage medium
WO2022142626A1 (en) Adaptive display method and apparatus for virtual scene, and electronic device, storage medium and computer program product
CN113559520A (en) Interactive control method and device in game, electronic equipment and readable storage medium
WO2022222592A9 (en) Method and apparatus for displaying information of virtual object, electronic device, and storage medium
CN113350779A (en) Game virtual character action control method and device, storage medium and electronic equipment
KR20220002419A (en) Method and apparatus, device, and storage medium for processing avatar usage data
CN111729291B (en) Interaction method, device, electronic equipment and storage medium
JP2009070076A (en) Program, information storage medium, and image generation device
CN113648650A (en) Interaction method and related device
CN113476825B (en) Role control method, role control device, equipment and medium in game
CN112891939B (en) Contact information display method and device, computer equipment and storage medium
CN113332716A (en) Virtual article processing method and device, computer equipment and storage medium
WO2024027165A1 (en) Information interaction method and apparatus, and electronic device and storage medium
CN111151004A (en) Game unit deployment method and device, electronic equipment and storage medium
CN115634450A (en) Control method, control device, equipment and medium for virtual role
CN113680062A (en) Information viewing method and device in game
JP2015054043A (en) Program, game device, and server system
JP2012120849A (en) Program, information storage medium, and image generator
WO2023231557A9 (en) Interaction method for virtual objects, apparatus for virtual objects, and device, storage medium and program product
WO2024060924A1 (en) Interaction processing method and apparatus for virtual scene, and electronic device and storage medium
WO2023221716A1 (en) Mark processing method and apparatus in virtual scenario, and device, medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination