CN117861205A - Interaction control method and device in game, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN117861205A
CN117861205A (application CN202410018641.3A / CN202410018641A)
Authority
CN
China
Prior art keywords
game
prop
message
target game
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410018641.3A
Other languages
Chinese (zh)
Inventor
王依冉
欧阳书舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202410018641.3A
Publication of CN117861205A
Pending legal-status Critical Current


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85: Providing additional services to players
    • A63F13/87: Communicating with other players during game play, e.g. by e-mail or chat
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308: Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interaction control method and device in a game, an electronic device, and a readable storage medium, where the method includes: providing, by a first terminal, a graphical user interface displaying at least a portion of a game scene, the game scene including a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal; displaying, through the graphical user interface, a prop presentation interface including at least one game prop; determining a target game prop among the at least one game prop in response to a selection operation; in response to a placement instruction for the target game prop, displaying the target game prop in the game scene; and displaying, through the graphical user interface, first message information generated by the first terminal for the target game prop and/or second message information generated by the second terminal for the target game prop. The method can achieve an asynchronous-message interaction effect in the game scene and improve the players' interactive experience in the game.

Description

Interaction control method and device in game, electronic equipment and readable storage medium
Technical Field
The present invention relates to the field of computers, and in particular, to a method and apparatus for interactive control in a game, an electronic device, and a computer readable storage medium.
Background
With the rapid development of online games, levels have become more difficult and game scenarios more complex, making games increasingly unpredictable. This places higher demands on players' strategic planning, operating skill, spatial awareness, and so on.
Currently, when a player encounters a difficult level in a game, he or she generally has to ask a player who has already cleared it, that is, add other players as friends and ask how to pass the level. This approach is affected by player personality: most players are unwilling to add strangers, which easily reduces the game's appeal. Alternatively, a player may search for game tutorials to learn how to clear the level, but this leads to a lack of interactivity between players in the game, involves a cumbersome process, easily interrupts game progress, and results in a poor game experience.
It should be noted that the information disclosed in the foregoing background section is only for enhancing understanding of the background of the present application and thus may include information that does not form the prior art that is already known to those of ordinary skill in the art.
Disclosure of Invention
In view of this, the present application provides an interaction control method and apparatus in a game, an electronic device, and a readable storage medium, so that players in a game can achieve an asynchronous-message interaction effect through game props placed in the game scene, which can improve the players' interactive experience and increase their immersion in the game.
In a first aspect, an embodiment of the present application provides an interaction control method in a game, where the method includes: displaying a prop presentation interface through the graphical user interface, where the prop presentation interface includes at least one game prop;
determining a target game prop among the at least one game prop in response to a selection operation;
displaying the target game prop in the game scene in response to a placement instruction for the target game prop;
and displaying, through the graphical user interface, first message information and/or second message information contained in the target game prop, where the first message information is information generated by the first terminal for the target game prop, and the second message information is information generated by the second terminal for the target game prop.
In a second aspect, an embodiment of the present application provides an interactive control device in a game, where a graphical user interface is provided by a first terminal, where the graphical user interface includes at least a part of a game scene, and the game scene includes a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal; the device comprises: a display unit and a determination unit;
the display unit is used for displaying a prop display interface through the graphical user interface, and the prop display interface comprises at least one game prop;
the determining unit is used for determining a target game prop in the at least one game prop in response to the selection operation;
the display unit is further used for responding to a placement instruction for the target game prop and displaying the target game prop in the game scene;
the display unit is further configured to display, through the graphical user interface, first message information and/or second message information included in the target game prop, where the first message information is information generated by the first terminal for the target game prop, and the second message information is information generated by the second terminal for the target game prop.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a processor; and
a memory for storing a data processing program, where, after the electronic device is powered on, the processor executes the program to perform the method of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a data processing program for execution by a processor to perform a method as in the first aspect.
According to the interaction control method in a game provided by the embodiments of the present application, a graphical user interface is provided through a first terminal, where the graphical user interface includes at least part of a game scene, and the game scene includes a first virtual character controlled through the first terminal and a second virtual character controlled through a second terminal. In the method, a prop presentation interface including at least one game prop can be displayed through the graphical user interface; a target game prop is determined among the at least one game prop in response to a selection operation; then, in response to a placement instruction for the target game prop, the target game prop is displayed in the game scene; after the target game prop is displayed, first message information and/or second message information contained in the target game prop can be displayed through the graphical user interface, where the first message information is information generated by the first terminal for the target game prop, and the second message information is information generated by the second terminal for the target game prop.
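The four-step flow described above can be sketched in code. The following Python sketch is purely illustrative: the class and method names (`InteractionController`, `select`, `place`, `visible_messages`) and the data shapes are hypothetical, not part of the patent's disclosed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class GameProp:
    name: str
    count: int                                    # how many the player owns
    messages: list = field(default_factory=list)  # messages left via this prop

class InteractionController:
    """Hypothetical sketch of the four claimed steps on the first terminal."""

    def __init__(self, props):
        self.props = props   # step 1: contents of the prop presentation interface
        self.target = None
        self.placed = []     # props currently placed in the game scene

    def select(self, name):
        # Step 2: determine the target game prop in response to a selection operation.
        self.target = next(p for p in self.props if p.name == name)
        return self.target

    def place(self):
        # Step 3: in response to a placement instruction, display the target
        # game prop in the game scene (modeled here as moving it to `placed`).
        if self.target is not None and self.target.count > 0:
            self.target.count -= 1
            self.placed.append(self.target)
        return self.placed

    def visible_messages(self):
        # Step 4: first and/or second message information shown for placed props.
        return [m for prop in self.placed for m in prop.messages]
```

For example, selecting and placing a guideboard prop and then reading a message left through it would exercise all four steps in order.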
Therefore, by placing the target game prop in the game scene, each player can leave a message through the target game prop, and the message information is displayed in the graphical user interface. This provides each player in the game with an asynchronous-message mode that is tightly integrated with the game scene, which can increase the player's immersion in the game. In addition, because all players can exchange messages about the same target game prop, the interaction among players gains a clearer topic, which improves the players' interactive experience, further improves their overall game experience, and strengthens the social attributes of the game.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments of the present application will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an exemplary game system according to an embodiment of the present application;
FIG. 2 is a flowchart of an example of an interaction control method in a game according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an example of a graphical user interface for displaying a target game prop and first message information and/or second message information according to an embodiment of the present application;
FIG. 4 is a schematic diagram of some graphical user interfaces of a message interface provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of an example of a graphical user interface for configuring first message information for a target game prop through an editing interface according to an embodiment of the present application;
FIG. 6 is a schematic view of another editing interface according to an embodiment of the present application;
FIG. 7 is a schematic display diagram of a combat logo display area according to an embodiment of the present application;
FIG. 8 is a schematic diagram showing a floating layer interface for a description of a battle activity according to an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating an example of a message interface for a target game prop of a battle-assisting type according to an embodiment of the present application;
FIG. 10 is a schematic display diagram of an example of a graphical user interface for replying to each message in a message interface according to an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating an example of reply information according to an embodiment of the present application;
FIG. 12 is a schematic display diagram of another graphical user interface for replying to each message in the message interface according to an embodiment of the present application;
FIG. 13 is a diagram illustrating an example of a graphical user interface for displaying role information of a virtual character corresponding to a publisher identifier according to an embodiment of the present application;
FIG. 14 is a diagram of an example graphical user interface for deleting a target game prop provided in an embodiment of the present application;
FIG. 15 is a diagram of an example graphical user interface for leaving a message for a target game prop according to an embodiment of the present application;
FIG. 16 is a schematic diagram of an example of a graphical user interface for folding and unfolding a message provided in an embodiment of the present application;
FIG. 17 is a schematic structural diagram of an interaction control device in a game according to an embodiment of the present application;
FIG. 18 is a block diagram of an electronic device for implementing an interaction control method in a game according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be embodied in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
It should be noted that the terms "first," "second," "third," and the like in the claims, specification, and drawings herein are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. The data so used may be interchanged where appropriate to facilitate the embodiments of the present application described herein, and may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and their variants are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it. "Comprising A, B, and/or C" means comprising any one, any two, or all three of A, B, and C.
It should be understood that in the embodiments of the present application, "B corresponding to A" or "A corresponding to B" means that B is associated with A, and B may be determined from A. However, determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
Based on the above-mentioned problems, embodiments of the present application provide a method, an apparatus, an electronic device, and a computer-readable storage medium for controlling interaction in a game.
The interaction control method in a game provided by the embodiments of the present application can be executed by an electronic device, which may be a terminal, a server, or other equipment. The terminal may be a smartphone, a tablet computer, a notebook computer, or similar terminal equipment. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, big data, and artificial intelligence platforms.
In an alternative embodiment, when the interaction control method in a game runs on a terminal device, the terminal device may include a display screen and a processor. The display screen is used for presenting a game picture and receiving instructions generated by the player acting on the game picture. The game picture may include part of a virtual game scene, which is the virtual world in which the virtual characters are active. The processor is used for storing the game application, running the game, generating the game picture, responding to instructions, and controlling the display of the game picture on the display screen. When the player operates on the game picture through the display screen, the terminal device responds to the received operation instruction to control its local content. The terminal device may provide the graphical user interface to the player in a variety of ways; for example, the graphical user interface may be rendered on the display screen of the terminal device, or presented by holographic projection.
In an alternative embodiment, when the interaction control method in a game runs on a server, the method may be implemented and executed based on a cloud game system. A cloud game system refers to a game mode based on cloud computing and comprises a server and client devices. The entity that runs the game application is separated from the entity that presents the game picture: the storage and execution of the interaction control method are completed on the server, while the game picture is presented at a client. The client is mainly used for receiving and sending game data and presenting the game picture; for example, the client may be a display device with a data transmission function near the player side, such as a mobile terminal, a television, a computer, a palmtop computer, a personal digital assistant, or a head-mounted display device, but the device that processes the game data is the cloud server. When playing the game, the player operates the client to send an instruction to the server; the server controls the game to run according to the instruction, encodes and compresses data such as game pictures, and returns the data to the client through the network; finally, the client decodes and outputs the game pictures.
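The cloud-game round trip described above (client instruction, server-side game logic, encoded and compressed picture data, client-side decoding) can be illustrated with a minimal sketch. All names here are hypothetical, and `zlib` compression merely stands in for a real video encoding pipeline:

```python
import zlib

def server_step(state, instruction):
    """One cloud-server tick: apply the client's instruction to the game state,
    then return the new state plus a compressed 'picture' for the network.
    zlib stands in for the real encoding/compression of the game picture."""
    state = dict(state)
    if instruction == "move_right":
        state["x"] = state.get("x", 0) + 1
    picture = f"frame:x={state['x']}".encode()
    return state, zlib.compress(picture)

def client_present(compressed_picture):
    """Client side: decode the received data and output the game picture."""
    return zlib.decompress(compressed_picture).decode()
```

A single round trip would call `server_step` with the player's instruction and pass the compressed result to `client_present`.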
It should be noted that in the embodiment of the present application, the execution body of the interaction control method in the game may be a terminal device or a server, where the terminal device may be a local terminal device or a client device in the aforementioned cloud game. The embodiment of the application does not limit the type of the execution body.
By way of example, in connection with the above description, FIG. 1 illustrates a game system 1000 for implementing interaction control in a game provided in an embodiment of the present application. The game system 1000 may include at least one terminal 1001, at least one server 1002, at least one database 1003, and a network. A terminal 1001 held by a user may be connected to different servers through the network. A terminal is any device having computing hardware capable of running the software application corresponding to the game.
In the above game system 1000, the terminal 1001 is used to install and run a game application. In some cases, the game application may not be pre-installed on the terminal 1001, and the player may instead enter the game directly through a client such as a browser. The player logs in to the game application using a registered game account and can control the virtual character corresponding to that account to participate in the game. When the player logs in, the terminal 1001 sends a login request to the server 1002; the server 1002 verifies the game account used by the player and determines the game mechanism corresponding to the account according to the login request; if verification passes, the server 1002 returns a login success notification to the terminal 1001. While the player plays through the game application, data interaction takes place between the terminal 1001 and the server 1002: the terminal 1001 sends various information to the server 1002, and the server 1002 determines the display data of the terminal 1001 according to the stored game mechanism and the received information, then sends the display data to the terminal 1001, which presents it to the player.
In a possible application scenario, different terminals 1001 may be served by different servers 1002; therefore, to distinguish the servers 1002 corresponding to different game terminals 1001, the terms "first" and "second" are used where needed in the embodiments of the present application. In practice, the servers 1002 corresponding to different game terminals 1001 may be the same server 1002. It can be understood that the terminals 1001 corresponding to virtual characters located in the same game scene are served by the same server 1002, in which case there is no need to distinguish "first" from "second".
In addition, when the game system 1000 includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks and different servers. The network may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, or a 2G, 3G, 4G, or 5G network. Different terminals may also connect to other terminals or to a server using their own Bluetooth or hotspot networks. In addition, the game system 1000 may include multiple databases coupled to different servers, and information related to the game may be continuously stored in the databases as different users play the multiplayer game online.
It should be noted that, in the embodiment of the present application, the same virtual game is executed on a plurality of terminal devices, so that data interaction between the plurality of terminal devices may be implemented through a server of the virtual game. Thus, the sending of data by the terminal device 1 to the terminal device 2 can be understood as: the terminal device 1 sends data to a server of the virtual game, which sends the data to the terminal device 2. The reception of data transmitted by the terminal device 2 by the terminal device 1 can be understood as: the terminal device 1 receives data transmitted from the server of the virtual game, the data being data transmitted from the terminal device 2 to the server. Alternatively, there may be no game server, and the terminal device 1 may directly transmit game data to the terminal device 2.
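The server-relayed data path described above ("terminal device 1 sends data to the server of the virtual game, which sends the data to terminal device 2") can be sketched as follows; `GameServer` and its inbox model are hypothetical stand-ins for the actual game server:

```python
class GameServer:
    """Hypothetical relay: terminals do not talk directly; data flows
    terminal 1 -> server -> terminal 2, as described in the text above."""

    def __init__(self):
        self.inboxes = {}

    def register(self, terminal_id):
        self.inboxes[terminal_id] = []

    def send(self, sender, receiver, data):
        # Terminal `sender` sends data to the server, addressed to `receiver`.
        self.inboxes[receiver].append((sender, data))

    def receive(self, terminal_id):
        # A terminal fetches (and clears) everything relayed to it so far.
        messages, self.inboxes[terminal_id] = self.inboxes[terminal_id], []
        return messages
```

The same shape also covers the asynchronous-message feature: a message left on a prop by one terminal reaches the other terminal only via the server.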
It should be noted that, the schematic game system shown in fig. 1 is only an example, and the game system 1000 described in the embodiment of the present application is for more clearly describing the technical solution of the embodiment of the present application, and does not constitute a limitation to the technical solution provided in the embodiment of the present application, and those skilled in the art can know that, with the evolution of the game system and the appearance of a new service scenario, the technical solution provided in the embodiment of the present application is equally applicable to similar technical problems.
It should be noted that the triggering operations that appear in the following detailed description of the interaction control method in a game may be regarded as operations performed by the player with a finger or through a medium such as a mouse, a keyboard, or a stylus. Which medium is used may be determined by the type of electronic device. For example, when the electronic device is a touch-screen device such as a mobile phone, a tablet, or a game machine, the player may operate on the touch screen with any suitable object or accessory such as a finger or a stylus. When the terminal device is a non-touch-screen device such as a desktop or notebook computer, the player may operate through an external device such as a mouse or a keyboard.
The technical scheme of the present application is described in detail below through specific embodiments. It should be noted that the following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
In the embodiment of the application, a graphical user interface is provided through a first terminal, and at least part of game scenes of the virtual game are displayed on the graphical user interface, wherein the game scenes comprise a first virtual role controlled through the first terminal and a second virtual role controlled through a second terminal.
The game scene may be understood as a simulated environment of the real world in the game; it may be a half-simulated, half-fictional virtual environment, or a purely fictional virtual environment. The game scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene. A virtual scene may generally include a plurality of scene elements, which are the individual elements required to construct the virtual scene. For example, the scene elements may include, but are not limited to, at least one of the following: virtual character elements, virtual article elements, virtual building elements, virtual terrain elements, virtual vegetation elements, and the like. The virtual terrain elements may include, but are not limited to, natural terrain such as land, sea, lakes, and rivers, which serve as the scene in which players control virtual characters to carry out the game logic.
It will be appreciated that a virtual character is a game character that a player manipulates in the game; that is, the player manipulates the virtual character to perform various game activities in the game scene, such as picking up props, fighting, exploring, or solving puzzles. A virtual character may represent a player character, and each virtual character may be implemented as a three-dimensional or a two-dimensional virtual model, which is not particularly limited in this embodiment. The virtual character includes, but is not limited to, at least one of a virtual human, a virtual animal, and a virtual machine.
In this embodiment, the first terminal provides a graphical user interface in which a first virtual character controlled by the first terminal device and a second virtual character controlled by the second terminal device are displayed. The number of first virtual characters and second virtual characters is not particularly limited in this embodiment; there may be one or more of each.
The virtual game in the embodiment of the application can comprise an open world adventure game, which allows players to freely explore and adventure in the game world of the virtual game, each player in the game corresponds to a respective game world, and players in the same game world can be regarded as teammates. That is, the first virtual character and the second virtual character may be virtual characters that are in the same camp.
Note that in the embodiments of the present application, the method is described by taking display control on the terminal side that controls the first virtual character as an example. That is, the above-mentioned game scene is displayed on a display screen provided by the terminal device that controls the first virtual character.
FIG. 2 is a flowchart of an example of an interaction control method in a game according to an embodiment of the present application. It should be noted that the steps shown may be performed in a logical order different from that shown in the flowchart. The method may include the following steps S210 to S240.
Step S210: and displaying a prop display interface through the graphical user interface, wherein the prop display interface comprises at least one game prop.
It will be appreciated that in a virtual game, a player may manipulate a virtual character to accomplish game tasks through the use of various game props. A game prop may be understood as an item a player acquires in the game for completing a game task. For example, a game prop may help the player learn about a game task (such as an in-game newspaper, blackboard, or sign board carrying a task introduction or game scene guidance), be a virtual item that keeps a virtual character alive in the game scene, be a virtual item for defeating other virtual characters, or be a virtual item used to upgrade other virtual props, and so on.
It will be appreciated that the various interface contents to be displayed are typically presented in the graphical user interface in layers. Layering refers to a layered display manner in the graphical user interface: different contents may be placed on different layers in order to better control the display and hiding of interface contents. For example, in a game, the game scene may be placed on a scene layer; that is, the scene layer is primarily used to render the game world and the objects in it, including characters, environments, dynamic elements, and the like. The game prop icons in the backpack are placed on a prop layer. Because these icons are interface elements through which the player interacts with the game, the display level of the prop layer on which they are placed is typically higher than the scene layer of the game scene. That is, the display level of each game prop icon is higher than that of the scene picture. It should be noted that in this embodiment, "scene picture" and "game scene" express the same meaning.
Likewise, the virtual controls displayed in the graphical user interface referred to in the embodiments of the present application are located in separate control layers, the display layers of which are higher than the scene layers of the game scene.
The prop presentation interface is a layer provided in the graphical user interface for displaying game props during play by a player. Each game prop can be displayed in the graphical user interface through a prop display interface for selection by a player.
As shown in fig. 3, a game scene 100 and a prop display interface 20 are included in the graphical user interface 300. A first virtual character 10 and a second virtual character 11 are included in the game scene 100. A number of game props may be included in the prop display interface 20, and the number of each game prop owned by the player may also be displayed at the display location of that prop; for example, the prop display interface includes 5 guideboard props, 3 liquid medicine props, and 1 weapon prop.
Step S220: in response to the selection operation, a target game item is determined among the at least one game item.
It will be appreciated that the selection operation is an operation by which the player selects a game prop to be used in the prop display interface. The operation may include a touch trigger operation such as a click operation or a drag operation on the game prop, or a non-touch trigger operation such as a shortcut key operation, a voice command operation, or an air gesture operation; this embodiment is not limited in this regard.
The game prop selected by the selection operation is taken as the target game prop. It is to be understood that the number of target game props is not particularly limited and may be one or more.
In the graphical user interface 300 shown in fig. 3, the player applies a click operation to a guideboard prop (the selected state is represented by gray filling), and the guideboard prop becomes the target game prop.
Step S230: in response to a placement instruction for the target game prop, displaying the target game prop in the game scene.
The above-described placement instruction for the target game prop may be understood as an instruction for placing and displaying the selected target game prop in the game scene.
The placement instruction may be generated by the above-described selection operation; for example, a placement instruction is generated when the player lifts a finger, and the target game prop is then placed and displayed in the game scene.
Alternatively, the placement instruction may be generated after a preset time period elapses following selection of the target game prop, or may be generated by touching a placement control after the target game prop is selected; this embodiment is not limited in this regard. In response to the placement instruction, the target game prop on the prop display interface is placed in the game scene; at this point, a target game prop element is newly added to the scene layer of the game scene and is rendered and displayed in the graphical user interface.
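Steps S210 to S230 up to this point amount to moving a prop element from the prop display interface into the scene layer. A minimal illustrative sketch follows; all names are invented, and the application specifies no implementation:

```python
class SceneLayer:
    """Holds the renderable elements of the game scene."""
    def __init__(self):
        self.elements = []

    def add(self, element):
        self.elements.append(element)

def handle_placement(scene_layer, inventory, target_prop, position):
    """On a placement instruction, move the selected target game prop
    from the prop display interface (inventory) into the scene layer."""
    if inventory.get(target_prop, 0) <= 0:
        raise ValueError(f"no {target_prop} props available")
    inventory[target_prop] -= 1
    scene_layer.add({"prop": target_prop, "position": position, "messages": []})

scene = SceneLayer()
inventory = {"guideboard": 5, "medicine": 3, "weapon": 1}
handle_placement(scene, inventory, "guideboard", position=(12.0, 3.5))
print(inventory["guideboard"])  # 4
```

The newly added scene-layer element carries an empty message list, anticipating the message information configured in step S240.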
Step S240: displaying first message information and/or second message information contained in a target game prop through a graphical user interface, wherein the first message information is information generated by the first terminal aiming at the target game prop, and the second message information is information generated by the second terminal aiming at the target game prop.
The first message information and the second message information may be messages in which players share insights, clearance strategies, or challenge one another regarding game play in the game scene. That is, each player can place information such as their own game experience in the game scene through a target game prop, realizing information sharing.
It should be noted that the first message information may include one or more messages, and the second message information may likewise include one or more messages; this embodiment is not particularly limited in this regard.
Next, an exemplary description will be given of display timings of the first message and the second message.
In some embodiments, after a player controlling the first virtual character selects and places a target game prop in the game scene, first message information is configured for the target game prop displayed in the game scene, and the first message information is then displayed in the game scene. Alternatively, the player may configure the first message information for the target game prop in advance, so that when the target game prop is placed in the game scene, the pre-configured first message information is displayed near the target game prop. In this way, players may leave messages through target game props to help other players and share strategies. This adds a personalized playing method and improves the player's sense of investment and belonging. Other players can quickly learn game mechanics and skills by reading the messages, reducing the difficulty for novice players.
After a second virtual character enters the game world (i.e., the game scene) of the first virtual character, the second virtual character may leave a message on a target game prop in that game world. Naturally, after the target game prop is placed in the game world of the first virtual character, the target game prop may also be synchronized to the corresponding position in the game worlds of other virtual characters, so that a second virtual character can also leave a message, within its own game world, for the target game prop displayed there.
As shown in fig. 3, after the player performs a click operation on the icon of the guideboard prop displayed in the graphical user interface 300, the graphical user interface 300 may be switched to be displayed as the graphical user interface 301, the target game prop 12 is displayed in the game scene 100 in the graphical user interface 301, and the first message "trap in front" is displayed above the target game prop 12.
In some embodiments, in addition to the first message information, second message information may be displayed, where the second message information is information generated by the second terminal for the target game prop. After the target game prop is displayed in the game scene where the second virtual character is located, the second virtual character leaves a message for the target game prop, and the second message information is displayed on the graphical user interface provided by the first terminal. In this way, players can leave messages and interact based on the target game prop, assisting or challenging one another, which improves the interest and competitiveness of the game.
Further, as shown in fig. 3, after the player applies the click operation to the icon of the guideboard prop displayed in the graphical user interface 300, the graphical user interface 300 may be switched to be displayed as the graphical user interface 302. While the first message information "trap in front" configured by the first virtual character is displayed above the target game prop 12 in the game scene 100, if a second virtual character has left messages for the target game prop, the second message information "Thanks for sharing!" and "One-click triple support" may also be displayed above the target game prop.
Alternatively, in some embodiments, when the player does not leave a message while placing the target game prop, i.e., the target game prop does not include first message information, a second virtual character that sees the target game prop in the game scene may leave a message on it to generate second message information. In combination with the above example, as shown in fig. 3, after the player applies the click operation to the icon of the guideboard prop displayed in the graphical user interface 300, the graphical user interface 300 may be switched to be displayed as the graphical user interface 303, in which only the second message "What do you want to say?" is displayed, which can likewise improve the interest of the game.
It will be appreciated that the display position of the first message information and/or the second message information in the game scene is not particularly limited herein; it may be a position near the target game prop or a fixed position in the graphical user interface. The display manner is likewise not particularly limited: the first message information and the second message information may be displayed in full, or only a preset number of characters may be displayed, with the complete first message information and second message information displayed in response to a viewing operation by the player.
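The optional preset-character-count display just described can be sketched as follows. This is illustrative only; the character limit and function name are assumptions, not part of the application:

```python
def display_text(message, preset_chars=10, expanded=False):
    """Show the full message after a viewing operation (expanded=True);
    otherwise show only a preset number of characters plus an ellipsis."""
    if expanded or len(message) <= preset_chars:
        return message
    return message[:preset_chars] + "..."

print(display_text("trap in front, detour to the left"))        # truncated preview
print(display_text("trap in front, detour to the left", expanded=True))  # full text
```

The viewing operation simply toggles `expanded`, switching between the truncated preview near the prop and the complete message.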
Therefore, by placing target game props in the game scene, each player can leave messages through the target game props, and the message information is displayed in the graphical user interface. This provides players with an asynchronous messaging mode tightly integrated with the game scene, increasing the player's immersion in the game. In addition, since players can interact through messages on the same target game prop, the interaction among players gains topicality, which improves the players' interactive experience in the game and thereby strengthens the social attributes of the game.
Optionally, the first message information or the second message information may include at least one of the following: text message information, picture message information, video message information, and voice message information. That is, the player can leave text, voice, picture, and/or video messages through the target game prop. The variety of message forms can meet the needs of different players and improves player satisfaction with the interaction modes of the virtual game.
In one embodiment, when the first virtual character is controlled to place the target game prop in the game scene, prop information corresponding to the target game prop can be synchronized to at least one second terminal, so that each second terminal displays the target game prop according to the prop information. The prop information includes prop attribute information of the target game prop and the placement position of the target game prop in the game scene. After the prop information is synchronized to a second terminal, when the player of that second terminal controls the second virtual character to enter the game scene, the target game prop is displayed at the same position in the game scene where the second virtual character is located. Synchronizing the game scenes of all players in this way enables interaction among the players and enhances the social attributes of an open-world game.
It should be noted that, in the embodiment of the present application, the above synchronization of the prop information of the target game prop to each second terminal is a continuous process; that is, whenever a terminal interacts with the target game prop in a way that changes its prop attribute information, the synchronization mechanism may be triggered, so that every virtual character located in the same game scene sees the same message information.
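The continuous synchronization mechanism described in this paragraph can be sketched as a simple push-to-all-terminals loop. The class and method names below are invented for illustration:

```python
class Terminal:
    """A player terminal that keeps a local view of synchronized props."""
    def __init__(self):
        self.view = {}

    def on_prop_update(self, prop_id, info):
        self.view[prop_id] = info

class PropSyncService:
    """Whenever prop information changes on one terminal, push the new
    prop attribute information and placement position to every
    subscribed terminal, so all players see the same message information."""
    def __init__(self):
        self.terminals = []

    def subscribe(self, terminal):
        self.terminals.append(terminal)

    def update(self, prop_id, info):
        for terminal in self.terminals:
            terminal.on_prop_update(prop_id, info)

sync = PropSyncService()
first, second = Terminal(), Terminal()
sync.subscribe(first)
sync.subscribe(second)
sync.update("guideboard-1", {"position": (12.0, 3.5), "messages": ["trap in front"]})
```

Any later interaction that changes the prop's attributes (a new message, an evaluation) would go through `update` again, which is what makes the synchronization continuous rather than one-shot.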
In an alternative embodiment, the above step S240 may be implemented by the following step S241.
Step S241: responding to a viewing instruction aiming at the target game prop, displaying a message information interface corresponding to the target game prop through the graphical user interface, wherein the message information interface comprises the first message information and/or the second message information.
After the electronic device receives a placement instruction for the target game prop, the target game prop is first placed and displayed in the game scene. The first virtual character can then be controlled to apply a viewing operation to the target game prop at different moments to view the first message information and/or the second message information contained in the target game prop.
The above-mentioned viewing instruction for the target game prop may be understood as an instruction to view the message information contained in the target game prop. The viewing instruction may be generated by a viewing operation on the target game prop, where the viewing operation may include a touch trigger operation such as a click operation, a drag operation, or a press operation, or a non-touch trigger operation such as a shortcut key operation, a voice instruction operation, or an air gesture operation. The present application is not limited in this regard.
The message interface is an interface for displaying the first message and/or the second message.
In an optional embodiment, the message information interface may further include at least one of the following information in addition to the first message information and/or the second message information:
an evaluation control corresponding to the first message information or the second message information; a deletion control corresponding to the first message information or the second message information; the total number of messages in the message information interface; the character identifier of the publisher of each message; the publication time of each message; the prop identifier of the target game prop; the remaining display duration of the target game prop in the game scene, where the remaining display duration is the difference between a preset display duration and the duration for which the prop has already been displayed; and an interaction control for interacting with the target game prop.
The first message information is message information generated by the first terminal for the target game prop, and the second message information is message information generated by the second terminal for the target game prop. The evaluation control may include a like control and/or a dislike control; the character identifier of the publisher of each message may include an avatar identifier and/or a name identifier. The interaction control for interacting with the target game prop may release a preset game skill for the target game prop so that the prop attributes of the target game prop change, for example, strengthening the target game prop to increase its display duration in the game scene, or adding a special effect to the target game prop to present it in various forms and increase game fun. The specific interaction controls may be set according to game play; this embodiment is not particularly limited.
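Among the items above, the remaining display duration is defined as the difference between a preset display duration and the duration already displayed. A one-line illustrative computation (the names and the floor at zero are assumptions):

```python
def remaining_display_duration(preset_s, placed_at_s, now_s):
    """Remaining duration = preset display duration minus the duration
    already displayed, floored at zero once the prop has expired."""
    return max(preset_s - (now_s - placed_at_s), 0)

# A prop allowed 3600 s of display, placed 1000 s ago:
print(remaining_display_duration(3600, placed_at_s=0, now_s=1000))  # 2600
```

The strengthening interaction described above would simply increase `preset_s`, extending the prop's remaining life in the scene.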
As shown in fig. 4, after the player selects a target game prop from the prop display interface 20 displayed in the graphical user interface 300 and places it in the game scene 100, the graphical user interface 300 is switched to be displayed as the graphical user interface 301; at this point, neither the first message information nor the second message information is displayed in the graphical user interface 301.
Then, after a viewing operation that generates a viewing instruction is applied to the target game prop 12 displayed in the graphical user interface 301 (in this embodiment, the state of the target game prop 12 after the viewing operation is represented by gray filling), the graphical user interface 301 may be switched to be displayed as the graphical user interface 304, the graphical user interface 305, or the graphical user interface 306.
When the first message information generated by the first terminal for the target game prop 12 is text message information, the graphical user interface 301 is switched to display the graphical user interface 304; when the first message information is picture message information or video message information, it is switched to display the graphical user interface 305; and when the first message information includes both text and picture message information, it is switched to display the graphical user interface 306.
The message information interface 21 is displayed in the graphical user interface 304, the graphical user interface 305, and the graphical user interface 306. In the message information interface 21 are displayed: the prop identifier 29 of the target game prop, the character identifier 23 of the first virtual character, the first message information "trap in front", a deletion control 24 for deleting the first message information, the remaining display duration 25 of the target game prop in the game scene, an evaluation control 26, second message information 27, and a message identifier 28. In the graphical user interface 304, a second virtual character with the character identifier "YYY1" issues the second message "Thanks for sharing! One-click triple support"; a second virtual character with the character identifier "YYY2" issues the second message "Well played!"; and a second virtual character with the character identifier "YYY3" issues the second message "Well played! Take me as an apprentice!".
The type of first message information displayed in the graphical user interface may depend on the player's specific configuration; this embodiment is not particularly limited. In the following description, the first message information is taken to be text message information as an example.
Optionally, the messages in the message information interface are arranged and displayed in a preset ordering manner. In this embodiment of the present application, the preset ordering manner includes any one of the following: sorting by publication time; sorting by number of likes; sorting by number of replies.
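The three preset ordering manners can be sketched as interchangeable sort keys. The field names and sample data below are invented for illustration:

```python
messages = [
    {"text": "Thanks for sharing!", "posted_at": 30, "likes": 2, "replies": 0},
    {"text": "trap in front",       "posted_at": 20, "likes": 9, "replies": 3},
    {"text": "Well played!",        "posted_at": 10, "likes": 5, "replies": 1},
]

SORT_KEYS = {
    "by_time":    lambda m: m["posted_at"],   # earliest first
    "by_likes":   lambda m: -m["likes"],      # most liked first
    "by_replies": lambda m: -m["replies"],    # most replied first
}

def order_messages(messages, mode):
    """Arrange messages for display according to a preset ordering manner."""
    return sorted(messages, key=SORT_KEYS[mode])

print(order_messages(messages, "by_likes")[0]["text"])  # trap in front
```

Switching the preset ordering manner is just a matter of selecting a different key function; the message data itself never changes.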
In an alternative embodiment, the above step S230 may be implemented specifically by the following steps S231 to S233.
Step S231: and after receiving a placement instruction for the target game prop, determining a region to be placed.
Step S232: determining the target type of the target game prop according to the area to be placed, wherein the target type includes a battle-assist type and a non-battle-assist type.
Step S233: displaying the target game prop of the target type in the game scene.
The area to be placed is the area to which the position where the target game prop is to be placed in the game scene belongs. In general, the area to be placed is the area where the first virtual character is currently located.
In the embodiment of the application, the game scene may be divided into battle-assist scene areas and non-battle-assist scene areas according to the game playing method. A battle-assist scene area may be understood as a game scene area in which a player may invite friends to fight in a team; a non-battle-assist scene area is a game scene area in which a player cannot invite friends to fight in a team.
When the target game prop is placed, the target type of the prop can be determined according to the type of the area to be placed: if the area to be placed is a battle-assist area, the target type of the target game prop is the battle-assist type; if the area to be placed is a non-battle-assist area, the target type of the target game prop is the non-battle-assist type.
After the target type of the target game prop is determined, the target game prop of that target type is placed and displayed in the game scene. In this way, target game props with different functions can be placed in different scene areas of the game, enriching the playing methods and improving game fun.
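Steps S231 to S233 reduce to a lookup from the area type to the prop's target type. A minimal illustrative sketch, with invented area names:

```python
BATTLE_ASSIST_AREAS = {"wild_boss_zone", "arena"}   # invented area names

def determine_target_type(area_to_place):
    """Steps S231-S232: map the area to be placed to the prop's target
    type; battle-assist areas yield battle-assist props."""
    if area_to_place in BATTLE_ASSIST_AREAS:
        return "battle_assist"
    return "non_battle_assist"

print(determine_target_type("arena"))        # battle_assist
print(determine_target_type("town_square"))  # non_battle_assist
```

Step S233 would then select a prop appearance keyed on the returned type, giving the visually distinguishable props mentioned below.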
It will be appreciated that different prop appearances may be employed for target game props of different target types, facilitating intuitive understanding of a target game prop's type when the player sees it.
In an alternative embodiment, after the above step S230, that is, after the target game prop is placed and displayed in the game scene, the following steps S250 to S260 may further be performed.
Step S250: and responding to a trigger instruction aiming at the target game prop, and displaying an editing interface corresponding to the target game prop in the graphical user interface.
Step S260: and responding to the message leaving operation in the editing interface, and configuring the first message leaving information for the target game prop.
It can be understood that the trigger instruction for the target game prop is generated by a trigger operation different from the viewing operation that generates the viewing instruction; this instruction is used to display the editing interface corresponding to the target game prop. The trigger operation may include, but is not limited to, a touch trigger operation such as a click operation, a drag operation, or a press operation, or a non-touch trigger operation such as a shortcut key operation, a voice instruction operation, or an air gesture operation. The application is not limited in this regard, and different trigger manners can be adopted according to the functions to be realized.
The editing interface is an interface for configuring first message information for the target game prop.
In an alternative embodiment, the editing interface may include at least one of: a text editing area, a picture editing area, a video editing area, a voice input area, and a battle-assist identifier display area. The player can configure text message information for the target game prop through the text editing area, picture message information through the picture editing area, video message information through the video editing area, and voice message information through the voice input area. The picture editing area and the video editing area may be provided as two separate areas or combined into one.
The battle-assist identifier display area is used for displaying a battle-assist identifier, which prompts the player with the type of the area in which the target game prop is being placed. Through the battle-assist identifier, the player knows the type of the scene area when leaving a message and can thus input message information matched with the area type. For example, in a battle-assist scene area, as described above, a player may leave a message in a target game prop inviting others to fight together, so that other players who see the message through the target game prop can join the battle.
As can be seen from the above description, in the embodiments of the present application, game scenes may be classified into battle-assist scenes and non-battle-assist scenes according to the game playing method. If the game scene is a battle-assist scene, the battle-assist identifier display area contains the battle-assist identifier; if the game scene is a non-battle-assist scene, the display area does not contain the battle-assist identifier. In an alternative embodiment, if the game scene is a non-battle-assist scene, a non-battle-assist identifier may be displayed in the battle-assist identifier display area instead.
The above-mentioned message operation may be understood as an input operation of inputting first message information for the target game prop in the editing interface.
As shown in fig. 5, in the graphical user interface 301, after the player places the target game prop 12 in the game scene 100, a trigger operation that generates a trigger instruction may be applied to the target game prop 12; in response to the trigger instruction, the graphical user interface 301 is switched to be displayed as the graphical user interface 307. In the graphical user interface 307, the game scene 100 and the editing interface 40 are displayed. A text editing area 41, a picture/video editing area 42, a battle-assist identifier display area 43, and a placement control 45 are displayed in the editing interface 40.
For example, after the player inputs the first message "trap in front" in the text editing area 41 and triggers the placement control 45, the graphical user interface 307 is switched to be displayed as the graphical user interface 301, and the first message "trap in front" is displayed in the graphical user interface 301.
In an alternative embodiment, further, the step S250 is implemented by the following steps S251 to S253.
Step S251: and responding to a trigger instruction aiming at the target game prop, and determining the current scene area of the target game prop.
Step S252: determining the type of editing interface to be displayed according to the scene area in which the target game prop is currently located, wherein the editing interface types include an editing interface with a battle-assist identifier and an editing interface without a battle-assist identifier.
Step S253: displaying an editing interface matched with the determined editing interface type through the graphical user interface.
When the player applies a trigger operation that generates a trigger instruction for displaying an editing interface to the target game prop, the scene area in which the target game prop is currently located is first determined, and it is judged whether that area is a battle-assist scene or a non-battle-assist scene. The type of editing interface to be displayed is determined according to the judgment result: when the scene area is a battle-assist scene, the editing interface to be displayed is one with a battle-assist identifier; when the scene area is a non-battle-assist scene, the editing interface to be displayed is one without a battle-assist identifier.
The editing interface 40 displayed in the graphical user interface 307 shown in fig. 5 includes the battle-assist identifier display area 43, in which a battle-assist identifier can be displayed. In contrast, fig. 6 provides a schematic illustration of an editing interface that does not include a battle-assist identifier; that is, only the text editing area and the picture/video editing area are displayed in the graphical user interface 307 shown in fig. 6.
Alternatively, in another embodiment, if the scene area is a battle-assist scene, the type of editing interface to be displayed is an editing interface with a battle-assist identifier; if the scene area in which the target game prop is located is a non-battle-assist scene, the type of editing interface to be displayed is an editing interface with a non-battle-assist identifier.
As shown in fig. 7 (a), when the game scene 100 is a battle-assist scene, a battle-assist identifier is displayed in the battle-assist identifier display area 43; for example, the identifier may read "battle assist available". When the game scene 100 is a non-battle-assist scene, as shown in fig. 7 (b), a non-battle-assist identifier is displayed in the display area 43, for example, "not in a battle-assist area". Optionally, the non-battle-assist identifier may be grayed out to enrich the player's visual experience.
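Steps S251 to S253 can be sketched as a mapping from the scene-area type to the editing interface to display. The identifier strings below are assumptions based on the figures described above, not wording from the application:

```python
def editing_interface_type(in_battle_assist_area):
    """Choose the editing interface to display according to the scene
    area in which the target game prop is currently located."""
    if in_battle_assist_area:
        return {"interface": "with_battle_assist_identifier",
                "identifier": "battle assist available",
                "grayed_out": False}
    return {"interface": "with_non_battle_assist_identifier",
            "identifier": "not in a battle-assist area",
            "grayed_out": True}   # the non-battle-assist identifier may be grayed out

print(editing_interface_type(True)["identifier"])
print(editing_interface_type(False)["identifier"])
```

The `grayed_out` flag captures the optional visual treatment of the non-battle-assist identifier mentioned above.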
In an alternative embodiment, in the case that the battle-assist identifier display area is displayed, a battle-assist instruction control may further be displayed in that area; the control is used to display, in response to a trigger instruction, instructions for the battle-assist activity in the game scene. That is, in the embodiment of the application, the electronic device, in response to a trigger operation on the battle-assist instruction control, displays a floating-layer interface of the battle-assist instructions in the graphical user interface. As shown in fig. 8, a battle-assist instruction control 44 is displayed in the battle-assist identifier display area 43 in the graphical user interface 307; when the player applies a trigger operation to the control 44, a floating-layer interface 46 of the battle-assist instructions for the scene area is displayed in the graphical user interface 307.
Based on any of the foregoing embodiments, in an alternative implementation, if the target type of the target game prop is the battle-assist type, a battle-assist invitation identifier is further displayed in the message information interface, where the battle-assist invitation identifier is used to indicate that the first virtual character may be invited to fight together.
As shown in fig. 9, fig. 9 provides a schematic display diagram of the message information interface displayed in response to a viewing instruction for a target game prop whose target type is the battle-assist type. In the graphical user interface 308 shown in fig. 9, a game scene 100 is displayed; the game scene 100 is a battle-assist scene, and the target type of the target game prop 12 placed in it is the battle-assist type. In response to a viewing instruction for the target game prop 12, the battle-assist invitation identifier 30 is displayed in the message information interface 29 displayed in the graphical user interface 308. The battle-assist invitation identifier may be used to indicate that the first virtual character may be invited to fight together.
In an alternative embodiment, when the first virtual character places a battle-assist-type target game prop in the game scene, the battle-assist invitation identifier may be omitted from the message information interface displayed when the first virtual character views the target game prop, and displayed only in the graphical user interfaces provided by the second terminals corresponding to the second virtual characters, so that each second virtual character can invite the first virtual character to fight through the battle-assist invitation identifier.
Correspondingly, in some embodiments, in response to receiving a battle-assist invitation sent by the second terminal through the battle-assist invitation identifier, the battle-assist invitation is displayed in the graphical user interface; the battle-assist invitation is an instruction generated by triggering the battle-assist invitation identifier contained in the target game prop in the game scene where the second virtual character is located. In response to confirmation of the battle-assist invitation, the first virtual character is controlled to enter the game scene where the second virtual character is located.
In some embodiments, the above method may further include the following steps S270 to S280.
Step S270: responding to a reply trigger instruction aiming at any one of the message interfaces, and displaying an information reply interface, wherein the information reply interface is used for receiving reply information;
Step S280: and responding to a reply completion instruction, and displaying the reply information in the message information interface.
The reply trigger instruction is generated by a reply operation for any message; the reply operation enables replying to any message in the message information interface.
The reply completion instruction is an instruction generated after the player finishes inputting the reply information for the replied-to message information on the information reply interface. In response to the reply completion instruction, the reply information is displayed in the message information interface. It will be appreciated that there are various ways to display the reply information in the message information interface, and this embodiment is not particularly limited. For example, the reply information may be displayed following the replied-to message information, that is, below it; as another example, the reply information may be displayed as the latest message at the bottom of the displayed messages, with the replied-to message information displayed beneath the reply information.
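The two reply-display layouts just described can be sketched as list operations. This is an assumed illustration in Python, not taken from the patent; the function names are hypothetical.

```python
def insert_reply_below(messages, replied_idx, reply):
    # Layout (a): the reply is shown directly below the replied-to message.
    out = list(messages)
    out.insert(replied_idx + 1, reply)
    return out

def append_reply_with_quote(messages, replied_idx, reply):
    # Layout (b): the reply is appended as the latest message, with the
    # replied-to message quoted beneath it.
    return list(messages) + [reply, "> " + messages[replied_idx]]

msgs = ["msg1", "msg2", "msg3"]
print(insert_reply_below(msgs, 1, "reply"))
# ['msg1', 'msg2', 'reply', 'msg3']
print(append_reply_with_quote(msgs, 1, "reply"))
# ['msg1', 'msg2', 'msg3', 'reply', '> msg2']
```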
Next, a process of replying to each message in the message interface is described by way of a specific example.
In an optional embodiment, the reply trigger instruction for a piece of message information may be generated by a trigger operation on that message information; for example, a click operation on the message information generates the reply trigger instruction, the information reply interface is displayed in response to the instruction, and the player inputs the reply information in the information reply interface.
As shown in fig. 10, in the graphical user interface 304, the player applies a click operation to the second message information "high-hand-! The apprentice is-!". In response to the click operation, the graphical user interface 304 is switched to the graphical user interface 309, and the information reply interface 32 is displayed in the graphical user interface 309.
The information reply interface 32 includes an information input control 33, a cancel input control 34, and a release control 35. The information input control 33 is a control by which the player inputs reply information, the cancel input control 34 is used to delete information the player has input in the information input control 33, and the release control 35 is used to publish the information input in the information input control 33 to the message information interface.
Optionally, the character identifier of the second virtual character being replied to may also be displayed in the information input control, to help the player who inputs the reply information confirm that the reply is directed to the intended second virtual character. For example, the text "reply message YYY3" is displayed in the information input control 33 shown in the graphical user interface 309 of fig. 10.
As shown in fig. 10, in the graphical user interface 309, the player inputs the reply information "is the one that helps to brush the daily" in the information input control 33 and then triggers the release control 35. The graphical user interface 309 is then switched to the graphical user interface 310; in the message information interface 21 displayed in the graphical user interface 310, the reply information input by the player controlling the first virtual character is displayed as the latest message at the bottom of the displayed messages, with the replied-to second message information "high-hand-! The apprentice is-!" displayed below it.
As another example, as shown in fig. 11, the reply information "is the one that helps to brush the daily" may instead be displayed directly below the second message information "high-hand-! The apprentice is-!".
Of course, the display formats of the reply information and the replied-to second message information in the graphical user interfaces shown in fig. 10 and fig. 11 are merely examples and are not intended to be limiting.
In an optional embodiment, displaying the information reply interface in response to a reply trigger instruction for any piece of message information in the message information interface may be implemented in the following manner.
In response to a non-contact operation on the message information, an information reply control is displayed at a position corresponding to the message information. Then, in response to a trigger operation on the information reply control, the information reply interface is displayed.
It can be understood that the above-mentioned non-contact operation is an operation mode involving no physical contact with the message information. For example, in a PC game, the mouse is slid so that the cursor hovers above the message information (without any click or press operation); alternatively, in a mobile game, an operation without touching the screen is achieved by detecting a player gesture.
Optionally, the message information selected by the non-contact operation is highlighted according to the non-contact operation, so that the player can intuitively see which message information the subsequent operation targets.
It should be noted that highlighting in the embodiments of the present application refers to displaying an interface element in a manner different from its current display. Highlighting may include one or more of the following exemplary display modes: brightness highlighting, stroking, or strobing. Brightness highlighting refers to displaying the interface element at a brightness exceeding its original display brightness. Stroked display refers to thickening the border of the interface element or displaying the border in a preset color. Strobe display means flashing the interface element at a preset frequency within a preset time. In this embodiment, a gray-filled state is used as the highlighting mode for description.
The information reply control is a control for replying to the message information corresponding to the non-contact operation. The corresponding position of the message information is a preset position at which the information reply control for that message information is displayed, for example, the right side of the message information; this embodiment is not particularly limited.
The trigger operation on the information reply control may be understood as a contact operation on the information reply control, and may be a click operation, a drag operation, or the like; this is not particularly limited.
As shown in fig. 12, in the graphical user interface 304, the player controls the mouse to slide so that the cursor hovers above the message information "high-hand-! The apprentice is-!". That message information is then displayed in light gray to indicate that it is selected while the cursor hovers over it, and the information reply control 36 is displayed on the right side of the selected message information. When the cursor slides in the hovering state along the direction indicated by the dotted arrow to the position of the information reply control 36 and the mouse performs a click operation on the information reply control 36, the graphical user interface 304 is switched to the graphical user interface 311, and the information reply interface 32 is displayed in the graphical user interface 311. The information input control 33, the cancel input control 34, and the release control 35 are displayed in the information reply interface 32.
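The hover interaction above can be sketched as a small state machine: hovering highlights a message and reveals its reply control, and clicking the revealed control opens the reply interface. This is an assumed Python sketch; the class and method names are illustrative, not from the patent.

```python
class MessageRow:
    def __init__(self, text):
        self.text = text
        self.highlighted = False
        self.reply_control_visible = False

class MessageBoard:
    def __init__(self, rows):
        self.rows = rows
        self.reply_interface_open = False

    def on_hover(self, idx):
        # Non-contact operation: the cursor merely hovers, no click or press.
        # Only the hovered row is highlighted and shows its reply control.
        for i, row in enumerate(self.rows):
            row.highlighted = (i == idx)
            row.reply_control_visible = (i == idx)

    def on_click_reply_control(self, idx):
        # Contact operation on the revealed control opens the reply interface.
        if self.rows[idx].reply_control_visible:
            self.reply_interface_open = True

board = MessageBoard([MessageRow("m0"), MessageRow("m1")])
board.on_hover(1)
board.on_click_reply_control(1)
print(board.reply_interface_open)  # True
```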
In an alternative embodiment, as can be seen from the foregoing description, the message information interface includes a publisher identifier of each piece of message information, and the method further includes: in response to a trigger operation on a first publisher identifier among the publisher identifiers in the message information interface, displaying role information of the virtual character corresponding to the first publisher identifier.
It can be understood that the publisher identifier is a role identifier of the virtual role for publishing each message. The first publisher identifier is any one of the publisher identifiers.
The above character information may include, but is not limited to, information such as a character name, a character avatar, an account level, an experience value, a personality signature, etc. of the virtual character corresponding to the first publisher identification.
As shown in fig. 13, in the graphical user interface 304, when a trigger operation is applied to the publisher identifier "YYY1", the graphical user interface 304 is switched to the graphical user interface 312. The character information interface 37 of the virtual character corresponding to the publisher identifier "YYY1" is displayed in the graphical user interface 312; the virtual character with the character name "YYY1" is shown in the character information interface 37 with a character level of 1, an experience value of 256987, and a personal signature of "AAAAA".
Optionally, the role information interface 37 may further include an add-friend control 38. If the player wants to add the character as a friend, a friend-adding request is sent to the terminal device of the virtual character corresponding to the publisher identifier by triggering the add-friend control 38.
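The publisher-identifier interaction above can be sketched as a lookup plus a request queue. This is an assumed illustration; ROLE_DB, view_role, and add_friend are hypothetical names, and the sample data mirrors the fig. 13 example.

```python
ROLE_DB = {  # illustrative role information keyed by publisher identifier
    "YYY1": {"level": 1, "exp": 256987, "signature": "AAAAA"},
}

friend_requests = []

def view_role(identifier):
    # Trigger operation on a publisher identifier shows that character's
    # role information (name, level, experience value, signature, ...).
    return ROLE_DB[identifier]

def add_friend(identifier):
    # Triggering the add-friend control sends a friend-adding request to
    # the terminal device of the corresponding virtual character.
    friend_requests.append(identifier)

info = view_role("YYY1")
add_friend("YYY1")
print(info["exp"])  # 256987
```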
In some embodiments, after the target game prop is placed in the game scene, the target game prop may also be deleted. Specifically, this can be achieved by the following step S290.
Step S290: in response to a deletion instruction for the target game prop, deleting the target game prop from the game scene.
In a specific embodiment, the deletion instruction may be generated in any of the following manners.
in response to a deletion operation on the target game prop, generating a deletion instruction for the target game prop; or
in response to the display duration of the target game prop in the game scene reaching a preset duration, generating a deletion instruction for the target game prop.
The deletion operation on the target game prop may be a trigger operation on a provided deletion control. In response to the trigger operation on the deletion control, the deletion instruction for the target game prop is generated.
In some specific embodiments, as shown in fig. 14, a deletion control 24 for the target game prop is displayed in the message information interface 21 displayed in the graphical user interface 304. A deletion instruction for the target game prop is generated by a trigger operation on the deletion control 24; in response to the deletion instruction, the graphical user interface 304 is switched to the graphical user interface 313, in which the target game prop 12 has been deleted from the game scene 100.
In some embodiments, the message information interface displays the remaining display duration of the target game prop in the game scene; when the remaining display duration is less than or equal to 0 (that is, when the display duration of the target game prop has reached the preset duration), the target game prop is deleted from the game scene.
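The two deletion triggers above (an explicit delete operation, or the display duration reaching a preset limit) can be sketched as follows. This is a minimal assumed sketch; PlacedProp and the preset duration value are illustrative.

```python
PRESET_DURATION = 10.0  # seconds; illustrative preset display duration

class PlacedProp:
    def __init__(self, placed_at):
        self.placed_at = placed_at
        self.deleted = False

    def remaining(self, now):
        # Remaining display duration = preset duration - elapsed display time.
        return PRESET_DURATION - (now - self.placed_at)

    def tick(self, now):
        # Timer-based deletion: delete once the remaining duration <= 0.
        if self.remaining(now) <= 0:
            self.deleted = True

    def delete(self):
        # Explicit deletion via a trigger operation on the delete control.
        self.deleted = True

prop_a = PlacedProp(placed_at=0.0)
prop_a.tick(now=10.0)   # preset display duration reached
prop_b = PlacedProp(placed_at=0.0)
prop_b.delete()         # player triggers the delete control
print(prop_a.deleted, prop_b.deleted)  # True True
```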
In some embodiments, the message information interface further includes a message identifier, and the message identifier is provided with preset message logic. In response to a trigger operation on the message identifier, the message logic preset for the message identifier is executed to display an information input control in the message information interface, where the information input control is used to receive message information; and in response to a release instruction for the message information received in the information input control, the message information received in the information input control is displayed in the message information interface.
The release instruction for the message information received in the information input control may be generated by a trigger operation on a release control displayed together with the information input control, or may be generated when the duration of receiving the message information reaches a preset duration.
As shown in fig. 15, the message identifier 28 is displayed in the graphical user interface 304. In response to a trigger operation on the message identifier 28, the graphical user interface 304 is switched to the graphical user interface 314, in which the information input control 33, the cancel input control 34, and the release control 35 are displayed.
Optionally, the information input control may display prompt information for receiving message information for the target game prop; for example, the prompt "input prop message" is displayed in the information input control shown in the graphical user interface 314, distinguishing it from the information input control used for replying to message information described above. After the information input control is displayed, the player may input message information in it; after the message information is received, it is displayed in the message information interface in response to the trigger operation on the release control 35 described above.
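The leave-a-message flow above (trigger the message identifier, input text, publish) can be sketched as follows. This is an assumed illustration; MessagePanel and its method names are hypothetical.

```python
class MessagePanel:
    def __init__(self):
        self.messages = []      # messages displayed in the interface
        self.input_open = False
        self.draft = ""

    def tap_message_identifier(self):
        # Trigger operation on the message identifier opens the input control.
        self.input_open = True

    def type(self, text):
        self.draft = text

    def publish(self):
        # Release instruction: the draft is displayed in the message
        # information interface and the input control is closed.
        if self.input_open and self.draft:
            self.messages.append(self.draft)
            self.draft = ""
            self.input_open = False

panel = MessagePanel()
panel.tap_message_identifier()
panel.type("input prop message")
panel.publish()
print(panel.messages)  # ['input prop message']
```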
In some optional embodiments, when the message information interface includes picture message information, video message information, or a battle invitation identifier, the display space in the message information interface is limited, and the message information may be displayed in a folded manner. In other words, only message information meeting a preset condition is displayed in the message information interface, and at this time not all of the message information can be viewed through an up-and-down sliding operation. The preset condition may include at least one of: the number of replies reaching a preset number; the number of likes reaching a preset number; being arranged at a preset position.
As shown in fig. 9, when the battle invitation identifier 30 is displayed in the message information interface 29, the display space in the message information interface 29 is limited, so only message information satisfying the preset condition is displayed. For example, of the 10 pieces of message information for the target game prop 12 displayed in the graphical user interface 308, only the message information published by the second virtual character with the character identifier "YYY1" and that published by the second virtual character with the character identifier "YYY2" are displayed in the message information interface 29, saving display space in the message information interface 29.
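The folded display above can be sketched as a filter that keeps only messages meeting at least one preset condition (reply count, like count, or position). This is an assumed Python sketch; the function name and threshold values are illustrative.

```python
def folded_view(messages, min_replies=3, min_likes=5, keep_top=2):
    # Keep a message if it has enough replies, enough likes, or is
    # arranged at a preset (leading) position.
    shown = []
    for i, m in enumerate(messages):
        if (m["replies"] >= min_replies
                or m["likes"] >= min_likes
                or i < keep_top):
            shown.append(m["text"])
    return shown

msgs = [
    {"text": "a", "replies": 0, "likes": 0},
    {"text": "b", "replies": 0, "likes": 0},
    {"text": "c", "replies": 4, "likes": 0},
    {"text": "d", "replies": 0, "likes": 1},
]
print(folded_view(msgs))  # ['a', 'b', 'c']
```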
In an optional embodiment, when the message information is displayed in the folded manner, the message information interface may include a message viewing control; in response to a trigger operation on the message viewing control, a message display interface is displayed in the graphical user interface; the message display interface expands and displays each piece of message information.
As shown in fig. 16, in response to a trigger operation on the message viewing control 31 displayed in the graphical user interface 308, the graphical user interface 308 is switched to the graphical user interface 315, in which the message display interface 37 is displayed; each piece of message information generated for the target game prop 12 can be expanded and displayed in the message display interface 37. Limited by the size of the display screen of the electronic device, only a preset number of messages can be shown at once in the message display interface 37; the remaining messages can be viewed by sliding up and down through a trigger operation on the sliding control 38 displayed on the right side of the message display interface 37.
It should be noted that the dotted lines in the graphical user interface 304 in fig. 4, the graphical user interface 309 in fig. 10, and fig. 12 are drawn only for ease of understanding; these dotted lines do not exist on the graphical user interface displayed by the terminal device.
It should be noted that the above-mentioned information of the size, appearance, layout, display text, etc. of each element in the graphical user interface like those shown in fig. 3 to 16 is exemplary and not limiting to the actual display.
Thus far, the method provided in this embodiment has been described. By placing the target game prop in the game scene, each player can leave messages through the target game prop, and the message information is displayed in the graphical user interface. This provides the players with an asynchronous messaging mode tightly combined with the game scene, which can increase the players' immersion in the game. In addition, because the players can interact through messages on the same target game prop, the interaction among players gains a shared topic, which can improve the players' interactive experience in the game, further improve their game experience, and strengthen the social attributes of the game.
Corresponding to the interactive control method in the game provided by the embodiment of the application, the embodiment of the application also provides an interactive control device 400 in the game, wherein the device 400 provides a graphical user interface through a first terminal, the graphical user interface comprises at least part of game scenes, and the game scenes comprise a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal; as shown in fig. 17, the apparatus 400 includes: a display unit 401 and a determination unit 402;
The display unit 401 is configured to display a prop display interface through the graphical user interface, where the prop display interface includes at least one game prop;
the determining unit 402 is configured to determine a target game prop among the at least one game prop in response to a selection operation;
the display unit 401 is further configured to display the target game prop in the game scene in response to a placement instruction for the target game prop;
the display unit 401 is further configured to display, through the graphical user interface, first message information and/or second message information included in the target game prop, where the first message information is information generated by the first terminal for the target game prop, and the second message information is information generated by the second terminal for the target game prop.
Optionally, the display unit 401 is specifically configured to respond to a viewing instruction for the target game prop, and display, through the graphical user interface, a message interface corresponding to the target game prop, where the message interface includes the first message information and/or the second message information.
Optionally, the display unit 401 is further configured to display, in the graphical user interface, an editing interface corresponding to the target game prop in response to a trigger instruction for the target game prop;
the apparatus 400 further comprises a configuration unit 403;
and a configuration unit 403, configured to configure the first message information for the target game prop in response to a message operation in the editing interface.
Optionally, the display unit 401 is specifically further configured to determine a to-be-placed area in response to a placement instruction for the target game prop; determining a target type of the target game prop according to the to-be-placed area, wherein the target type comprises a battle type and a non-battle type; and displaying the target game props of the target type in the game scene.
Optionally, the display unit 401 is specifically further configured to determine, in response to a trigger instruction for the target game prop, a current scene area where the target game prop is located; determining the type of an editing interface to be displayed according to the current scene area of the target game prop, wherein the type of the editing interface comprises an editing interface with a combat identification and an editing interface without a combat identification; displaying an editing interface matched with the determined editing interface type through the graphical user interface.
Optionally, if the target type of the target game prop is a battle type, the message information interface further includes a battle invitation identifier, where the battle invitation identifier is used to indicate that the first virtual character can be invited to battle.
Optionally, the display unit 401 is further configured to respond to receiving a combat invitation sent by the second terminal, and display the combat invitation, where the combat invitation is an instruction generated by triggering the target game prop in a game scene where the second virtual character is located;
the apparatus 400 further comprises a control unit 404;
and a control unit 404, configured to control the first virtual character to enter a game scene where the second virtual character is located in response to the confirmation of the fight-assist invitation.
Optionally, the first message information or the second message information includes at least one of the following: text message information, picture message information, video message information, and voice message information.
Optionally, the editing interface includes at least one of: text editing area, picture editing area, video editing area, voice input area and combat identification display area.
Optionally, the apparatus 400 further includes a synchronization unit 405;
The synchronization unit 405 is configured to synchronize prop information corresponding to the target game prop to at least one second terminal, so that each second terminal displays the target game prop according to the prop information; the prop information comprises prop attribute information of the target game prop and a placement position of the target game prop in the game scene.
Optionally, the message information interface further includes at least one of the following information: the evaluation control corresponding to the first message information or the second message information; a deletion control corresponding to the first message information or the second message information; the number of all the message information in the message information interface; the release time of the message information; prop identification of the target game prop; the residual display duration of the target game prop in the game scene; and the residual display time length of the target game prop in the game scene is the difference value between the preset display time length and the displayed time length.
Optionally, the message information in the message information interface is arranged and displayed according to a preset ordering mode; the preset ordering mode includes any one of the following: ordering by publication time; ordering by number of likes; ordering by number of replies.
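The three preset orderings above can be sketched with a keyed sort. This is an assumed Python illustration; the mode names and field names are hypothetical.

```python
def ordered(messages, mode):
    # Map each preset ordering mode to a sort key; likes and replies are
    # negated so that higher counts sort first.
    keys = {
        "time": lambda m: m["published_at"],
        "likes": lambda m: -m["likes"],
        "replies": lambda m: -m["replies"],
    }
    return sorted(messages, key=keys[mode])

msgs = [
    {"text": "a", "published_at": 2, "likes": 1, "replies": 0},
    {"text": "b", "published_at": 1, "likes": 3, "replies": 2},
]
print([m["text"] for m in ordered(msgs, "time")])   # ['b', 'a']
print([m["text"] for m in ordered(msgs, "likes")])  # ['b', 'a']
```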
Optionally, the display unit 401 is further configured to respond to a reply trigger instruction for any message in the message interface, and display an information reply interface, where the information reply interface is configured to receive a reply message; and responding to a reply completion instruction, and displaying the reply information in the message information interface.
Optionally, the message interface includes a publisher identifier of each message;
the display unit 401 is further configured to display role information of a virtual role corresponding to a first publisher identifier in each publisher identifier in the message information interface in response to a trigger operation for the first publisher identifier.
Optionally, the apparatus further includes a deleting unit 406;
a deleting unit 406, configured to delete the target game prop in the game scene in response to a deletion instruction for the target game prop.
Optionally, the deletion instruction is generated by any one of the following modes: responding to the deleting operation of the target game prop, and generating a deleting instruction of the target game prop; and responding to the display time of the target game prop in the game scene to reach the preset time, and generating a deleting instruction aiming at the target game prop.
Optionally, the message information interface includes a message identifier;
the display unit 401 is further configured to display an information input control in the message interface in response to a triggering operation for the message identifier, where the information input control is configured to receive message information; and responding to an issuing instruction for the message information received by the information input control, and displaying the message information received by the information input control in the message information interface.
Corresponding to the method for controlling interaction in a game provided in the embodiments of the present application, the embodiments of the present application further provide an electronic device for implementing the method. As shown in fig. 18, the electronic device includes: a processor 501; and a memory 502 for storing a program of the interaction control method in a game. After the device is powered on and the program is run by the processor 501, the following steps are performed:
displaying a prop display interface through the graphical user interface, wherein the prop display interface comprises at least one game prop;
determining a target game prop among the at least one game prop in response to a selection operation;
displaying the target game prop in the game scene in response to a placement instruction for the target game prop;
and displaying first message information and/or second message information contained in the target game prop through the graphical user interface, wherein the first message information is information generated by the first terminal aiming at the target game prop, and the second message information is information generated by the second terminal aiming at the target game prop.
Corresponding to the method for controlling interaction in a game provided in the embodiment of the present application, the embodiment of the present application further provides a computer readable storage medium storing a program for controlling interaction in a game, the program being executed by a processor to perform the steps of:
displaying a prop display interface through the graphical user interface, wherein the prop display interface comprises at least one game prop;
determining a target game prop among the at least one game prop in response to a selection operation;
displaying the target game prop in the game scene in response to a placement instruction for the target game prop;
and displaying first message information and/or second message information contained in the target game prop through the graphical user interface, wherein the first message information is information generated by the first terminal aiming at the target game prop, and the second message information is information generated by the second terminal aiming at the target game prop.
It should be noted that, for the detailed description of the apparatus, the electronic device, and the computer readable storage medium provided in the embodiments of the present application, reference may be made to the related description of the embodiments of the method for controlling interaction in the game provided in the embodiments of the present application, which is not repeated here.
While the preferred embodiment has been described, it is not intended to limit the invention thereto, and any person skilled in the art may make variations and modifications without departing from the spirit and scope of the present invention, so that the scope of the present invention shall be defined by the claims of the present application.
In one typical configuration, the electronic device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

Claims (20)

1. An interactive control method in a game is characterized in that a graphical user interface is provided through a first terminal, wherein the graphical user interface comprises at least part of game scenes, and the game scenes comprise a first virtual character controlled through the first terminal and a second virtual character controlled through a second terminal; the method comprises the following steps:
Displaying a prop display interface through the graphical user interface, wherein the prop display interface comprises at least one game prop;
determining a target game prop among the at least one game prop in response to a selection operation;
displaying the target game prop in the game scene in response to a placement instruction for the target game prop;
and displaying first message information and/or second message information contained in the target game prop through the graphical user interface, wherein the first message information is information generated by the first terminal aiming at the target game prop, and the second message information is information generated by the second terminal aiming at the target game prop.
2. The method of claim 1, wherein displaying, via the graphical user interface, the first message information and/or the second message information contained in the target game prop comprises:
responding to a viewing instruction aiming at the target game prop, displaying a message information interface corresponding to the target game prop through the graphical user interface, wherein the message information interface comprises the first message information and/or the second message information.
3. The method of claim 2, wherein after the displaying the target game prop in the game scene, the method further comprises:
in response to a trigger instruction for the target game prop, displaying an editing interface corresponding to the target game prop in the graphical user interface; and
in response to a message-leaving operation in the editing interface, configuring the first message information for the target game prop.
4. The method of claim 1, wherein the displaying the target game prop in the game scene in response to a placement instruction for the target game prop comprises:
determining an area to be placed in response to the placement instruction for the target game prop;
determining a target type of the target game prop according to the area to be placed, wherein the target type comprises a battle type and a non-battle type; and
displaying the target game prop of the target type in the game scene.
5. The method of claim 3, wherein the displaying, in response to a trigger instruction for the target game prop, an editing interface corresponding to the target game prop in the graphical user interface comprises:
in response to the trigger instruction for the target game prop, determining a scene area where the target game prop is currently located;
determining a type of editing interface to be displayed according to the scene area where the target game prop is currently located, wherein the editing interface type comprises an editing interface with a battle identifier and an editing interface without a battle identifier; and
displaying, through the graphical user interface, an editing interface matching the determined editing interface type.
6. The method of claim 4, wherein, if the target type of the target game prop is the battle type, the message information interface further comprises a battle invitation identifier, and the battle invitation identifier is used for indicating that the first virtual character can be invited to battle.
7. The method of claim 6, wherein the method further comprises:
in response to receiving a battle invitation sent by the second terminal, displaying the battle invitation, wherein the battle invitation is an instruction generated by triggering the target game prop in a game scene where the second virtual character is located; and
in response to a confirmation of the battle invitation, controlling the first virtual character to enter the game scene where the second virtual character is located.
8. The method of claim 1, wherein the first message information or the second message information comprises at least one of the following: text message information, picture message information, video message information, and voice message information.
9. The method of claim 3, wherein the editing interface comprises at least one of the following: a text editing area, a picture editing area, a video editing area, a voice input area, and a battle identifier display area.
10. The method of claim 1, wherein the method further comprises:
synchronizing prop information corresponding to the target game prop to at least one second terminal, so that each second terminal displays the target game prop according to the prop information;
wherein the prop information comprises prop attribute information of the target game prop and a placement position of the target game prop in the game scene.
11. The method of claim 2, wherein the message information interface further comprises at least one of the following:
an evaluation control corresponding to the first message information or the second message information;
a deletion control corresponding to the first message information or the second message information;
the number of all pieces of message information in the message information interface;
a release time of each piece of message information;
a prop identifier of the target game prop; and
a remaining display duration of the target game prop in the game scene, wherein the remaining display duration of the target game prop in the game scene is the difference between a preset display duration and the duration already displayed.
12. The method of claim 2, wherein the message information in the message information interface is displayed in a preset ordering mode;
wherein the preset ordering mode comprises any one of the following:
sorting according to release time;
sorting according to the number of likes; and
sorting according to the number of replies.
13. The method of claim 2, wherein the method further comprises:
in response to a reply trigger instruction for any piece of message information in the message information interface, displaying an information reply interface, wherein the information reply interface is used for receiving reply information; and
in response to a reply completion instruction, displaying the reply information in the message information interface.
14. The method of claim 2, wherein the message information interface comprises a publisher identifier for each piece of message information, and the method further comprises:
in response to a trigger operation for a first publisher identifier among all publisher identifiers in the message information interface, displaying character information of the virtual character corresponding to the first publisher identifier.
15. The method of claim 1, wherein the method further comprises:
deleting the target game prop from the game scene in response to a deletion instruction for the target game prop.
16. The method of claim 15, wherein the deletion instruction is generated in any one of the following manners:
generating the deletion instruction for the target game prop in response to a deletion operation on the target game prop; and
generating the deletion instruction for the target game prop in response to the display duration of the target game prop in the game scene reaching a preset duration.
17. The method of claim 2, wherein the message information interface comprises a message-leaving identifier, and the method further comprises:
in response to a trigger operation for the message-leaving identifier, displaying an information input control in the message information interface, wherein the information input control is used for receiving message information; and
in response to a publishing instruction for the message information received by the information input control, displaying the message information received by the information input control in the message information interface.
18. An interaction control device in a game, characterized in that a graphical user interface is provided through a first terminal, wherein the graphical user interface comprises at least part of a game scene, and the game scene comprises a first virtual character controlled through the first terminal and a second virtual character controlled through a second terminal; the device comprises a display unit and a determination unit, wherein:
the display unit is configured to display a prop display interface through the graphical user interface, wherein the prop display interface comprises at least one game prop;
the determination unit is configured to determine a target game prop among the at least one game prop in response to a selection operation;
the display unit is further configured to display the target game prop in the game scene after receiving a placement instruction for the target game prop; and
the display unit is further configured to display, through the graphical user interface, first message information and/or second message information contained in the target game prop, wherein the first message information is information generated by the first terminal for the target game prop, and the second message information is information generated by the second terminal for the target game prop.
19. An electronic device, comprising:
a processor; and
a memory for storing a data processing program, wherein, after the electronic device is powered on, the program is run by the processor to perform the method of any one of claims 1 to 17.
20. A computer-readable storage medium, characterized in that a data processing program is stored thereon, wherein the program, when run by a processor, performs the method of any one of claims 1 to 17.
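As a non-normative illustration of the placement logic in claims 1 and 4 (all names here, including `BATTLE_AREAS`, `determine_target_type`, and `place_prop`, are hypothetical; the claims do not prescribe any particular implementation), determining a target type from the area to be placed and recording the placed prop with its message information might look like:

```python
# Hypothetical sketch: area-dependent prop typing as described in claim 4.
BATTLE_AREAS = {"arena", "wilderness"}   # assumed battle-capable scene areas

def determine_target_type(area: str) -> str:
    """Map an area to be placed onto a prop target type."""
    return "battle" if area in BATTLE_AREAS else "non_battle"

def place_prop(scene: dict, prop_id: str, area: str) -> dict:
    """Place the target game prop in the scene with its determined type."""
    placed = {
        "prop_id": prop_id,
        "area": area,
        "type": determine_target_type(area),
        "messages": [],              # holds first/second message information
    }
    scene.setdefault("props", []).append(placed)
    return placed
```

The prop record returned here is the kind of "prop information" that claim 10 would synchronize to the second terminals so that each of them can display the same placed prop.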
CN202410018641.3A 2024-01-05 2024-01-05 Interaction control method and device in game, electronic equipment and readable storage medium Pending CN117861205A (en)

Publications (1)

Publication Number: CN117861205A; Publication Date: 2024-04-12



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination