CN115591231A - Interactive control method and device in game and electronic equipment


Info

Publication number: CN115591231A
Application number: CN202211055391.8A
Authority: CN (China)
Prior art keywords: target, interaction, game, scene map, scene
Legal status: Pending (the legal status is an assumption, not a legal conclusion; no legal analysis has been performed)
Other languages: Chinese (zh)
Inventor: 樊晓晴
Current assignee: Netease Hangzhou Network Co Ltd (the listed assignee may be inaccurate)
Original assignee: Netease Hangzhou Network Co Ltd
Application filed by: Netease Hangzhou Network Co Ltd
Priority to: CN202211055391.8A
Publication of: CN115591231A

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, using indicators, for displaying an additional top view, e.g. radar screens or maps

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an in-game interaction control method and apparatus, and an electronic device. A partial scene map of a game scene is displayed in a graphical user interface provided by a terminal device, and object identifiers of virtual objects are displayed in the scene map. In response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition is determined from the virtual objects based on the trigger position of the trigger operation in the scene map, and an interaction control for each target object is displayed. In response to a trigger operation acting on the interaction control of a first object among the target objects, the interaction operation related to the first object is executed. With this method, a player can determine the target objects among the virtual objects near a position simply by triggering that position in the scene map, so a target object can be selected quickly; moreover, the interaction controls of the target objects are displayed in the graphical user interface and the player can interact with a target object through its interaction control, which avoids accidental operations on other virtual objects.

Description

Interactive control method and device in game and electronic equipment
Technical Field
The invention relates to the technical field of game interaction design, and in particular to an in-game interaction control method and apparatus, and an electronic device.
Background
In the related art, a player usually clicks the name of an NPC (non-player character) on a map to trigger automatic pathfinding. However, a map typically contains many NPCs whose names overlap, and although a player tends to visit only a small number of NPCs frequently, all NPCs are displayed on the map. This makes it difficult for the player to find the frequently visited NPCs, and because the NPC names overlap on the map, the player is prone to accidentally selecting the wrong NPC when choosing a target NPC.
Disclosure of Invention
The invention aims to provide an in-game interaction control method and apparatus, and an electronic device, so that a player can quickly find a frequently visited NPC and accidental selections are reduced.
In a first aspect, the present invention provides an in-game interaction control method. A graphical user interface is provided through a terminal device, and a scene picture of a game scene is displayed in the graphical user interface. The method comprises: in response to a specified trigger operation, displaying at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map, and each specified position matches the position of the corresponding virtual object in the game scene; in response to a trigger operation acting on the scene map, determining at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and displaying an interaction control for each target object, the interaction control being used to control execution of an interaction operation related to the target object; and in response to a trigger operation acting on the interaction control of a first object among the target objects, controlling execution of the interaction operation related to the first object.
In an optional embodiment, the step of determining, in response to the trigger operation acting on the scene map, at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map includes: in response to the trigger operation acting on the scene map, determining a target area in the scene map based on the trigger position of the trigger operation in the scene map; identifying, in the scene map, the virtual objects contained in the target area; and determining at least one target object satisfying the preset condition from the virtual objects contained in the target area.
In an optional embodiment, the step of determining the target area in the scene map based on the trigger position of the trigger operation in the scene map includes: determining, in the scene map, a circular area centered on the trigger position with a radius of a specified number of pixels as the target area.
In an optional embodiment, the step of determining at least one target object satisfying the preset condition from the virtual objects contained in the target area includes: determining the number of interactions between each virtual object contained in the target area and the controlled virtual object; and selecting, from the virtual objects contained in the target area, at least one target object whose interaction count satisfies the preset condition.
In an optional embodiment, the step of selecting at least one target object whose interaction count satisfies the preset condition from the virtual objects contained in the target area includes: sorting the virtual objects in the target area that have interacted with the controlled virtual object in descending order of interaction count to obtain a ranking result; and selecting a specified number of top-ranked target objects from the ranking result.
In an optional embodiment, the method further comprises: if the number of virtual objects in the ranking result that have interacted with the controlled virtual object is less than the specified number, determining the number of interactions between the virtual objects contained in the target area and virtual objects other than the controlled virtual object; sorting, in descending order of interaction count, the virtual objects in the target area that have interacted with virtual objects other than the controlled virtual object, and selecting a preset number of top-ranked target objects; the preset number plus the number of virtual objects in the ranking result that have interacted with the controlled virtual object equals the specified number.
In an optional embodiment, the step of determining the number of interactions between the virtual objects contained in the target area and the controlled virtual object includes: obtaining historical game data corresponding to the controlled virtual object within a specified game duration; and determining, based on the historical game data, the number of interactions between the controlled virtual object and the virtual objects contained in the target area.
In an optional embodiment, the trigger operation acting on the scene map includes clicking a trigger position in the scene map with the right mouse button.
In an optional embodiment, the virtual objects include non-player characters in the game.
In an optional embodiment, the step of controlling execution of the interaction operation related to the first object in response to the trigger operation acting on the interaction control of the first object among the target objects includes: in response to the trigger operation acting on the interaction control of the first object among the target objects, selecting the first object and controlling the controlled virtual object to move to the position of the first object.
In a second aspect, the present invention provides an in-game interaction control apparatus. A graphical user interface is provided through a terminal device, and a scene picture of a game scene is displayed in the graphical user interface. The apparatus includes: a map display module, configured to display at least a partial scene map of the game scene in the graphical user interface in response to a specified trigger operation, wherein object identifiers of virtual objects are displayed at specified positions in the scene map, and each specified position matches the position of the corresponding virtual object in the game scene; a target object determining module, configured to determine, in response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and to display an interaction control for each target object, the interaction control being used to control execution of an interaction operation related to the target object; and an interaction control module, configured to control execution of the interaction operation related to a first object among the target objects in response to a trigger operation acting on the interaction control of the first object.
In a third aspect, the present invention provides an electronic device, which includes a processor and a memory, where the memory stores machine executable instructions capable of being executed by the processor, and the processor executes the machine executable instructions to implement the above-mentioned in-game interaction control method.
In a fourth aspect, the present invention provides a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to implement the in-game interaction control method described above.
The embodiment of the invention has the following beneficial effects:
The invention provides an in-game interaction control method and apparatus, and an electronic device. In response to a specified trigger operation, at least a partial scene map of the game scene is displayed in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map and each specified position matches the position of the corresponding virtual object in the game scene. Then, in response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition is determined from the virtual objects based on the trigger position of the trigger operation in the scene map, and an interaction control for each target object is displayed, the interaction control being used to control execution of an interaction operation related to the target object. Finally, in response to a trigger operation acting on the interaction control of a first object among the target objects, the interaction operation related to the first object is executed. With this method, a player can determine the target objects among the virtual objects near a position simply by triggering that position in the scene map, so a target object can be selected quickly; moreover, the interaction controls of the target objects are displayed in the graphical user interface and the player can interact with a target object through its interaction control, which avoids accidental operations on other virtual objects.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention as set forth hereinafter.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flowchart of an in-game interaction control method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another method for controlling interaction in a game according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a display of an interactive control according to an embodiment of the present invention;
FIG. 4 is a flow chart of another method for controlling interaction in a game according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an in-game interaction control apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the related art, a player usually clicks the name of an NPC (non-player character) on a map to trigger automatic pathfinding to that NPC. However, a map typically contains many NPCs whose names overlap, and although a player usually visits only a small number of NPCs frequently, all NPCs are displayed on the map, which makes it difficult for the player to find the frequently visited NPCs.
Based on the above problems, embodiments of the present invention provide an in-game interaction control method and apparatus, and an electronic device. The technique is applied to target object selection scenarios, in particular to scenarios in which a target object is selected from among a large number of non-player characters.
The in-game interaction control method in one embodiment of the disclosure can run on a local terminal device or on a server. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, which includes the server and a client device.
In an optional embodiment, various cloud applications may run on the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the entity that runs the game program is separated from the entity that presents the game picture: the storage and execution of the in-game interaction control method are completed on the cloud game server, while the client device is used to receive and send data and to present the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer or a handheld computer, while the information processing is performed by the cloud game server in the cloud. During play, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the client device over the network; finally, the client device decodes the data and outputs the game picture.
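The round trip described above can be sketched, purely for illustration, as follows; all class and method names (CloudGameServer, ClientDevice, handle, run_game, encode, decode, present) are assumptions made for this sketch and are not taken from the patent:

```python
class CloudGameServer:
    """Runs the game program on the cloud side and returns encoded game pictures."""

    def handle(self, operation: dict) -> bytes:
        frame = self.run_game(operation)   # run the game according to the operation instruction
        return self.encode(frame)          # encode/compress the rendered game picture

    def run_game(self, operation: dict):
        raise NotImplementedError          # game logic lives on the cloud server

    def encode(self, frame) -> bytes:
        raise NotImplementedError          # e.g. video encoding before network transfer


class ClientDevice:
    """Receives player input, forwards it to the cloud, and presents the returned picture."""

    def __init__(self, server: CloudGameServer) -> None:
        self.server = server

    def on_player_input(self, operation: dict) -> None:
        encoded = self.server.handle(operation)  # send the operation instruction to the cloud
        picture = self.decode(encoded)           # decode on the client device
        self.present(picture)                    # output the game picture locally

    def decode(self, encoded: bytes):
        raise NotImplementedError

    def present(self, picture) -> None:
        raise NotImplementedError
```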
In an optional implementation, taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface, i.e. the game program is downloaded, installed and run on an electronic device in the conventional way. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, it may be rendered and displayed on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface and controlling the display of the graphical user interface on the display screen.
In a possible implementation, an embodiment of the present invention provides an in-game interaction control method, in which a graphical user interface is provided through a terminal device and a scene picture of a game scene is displayed in the graphical user interface. As shown in FIG. 1, the method comprises the following steps:
Step S102: in response to a specified trigger operation, displaying at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map; each specified position matches the position of the corresponding virtual object in the game scene.
The specified trigger operation may be an operation of clicking a map control displayed in the scene picture of the game scene, an operation of clicking a specified area, or an operation of dragging a certain object in the scene picture, which is not specifically limited here. Object identifiers of a plurality of virtual objects are displayed at specified positions in the scene map; a specified position may be the position where a virtual object is arranged in the scene map, and the specified positions may cover the whole scene map or only some fixed areas of it. Specifically, the object identifier of a virtual object may be the object name of the virtual object, a code number of the virtual object, or another unique identifier of the virtual object, set according to development requirements.
Step S104: in response to a trigger operation acting on the scene map, determining at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and displaying an interaction control for each target object; the interaction control is used to control execution of an interaction operation related to the target object.
The trigger operation acting on the scene map may be an operation of clicking a position in the scene map, and the position may be any area of the scene map. Specifically, to find the target object the player wants to interact with, the player usually performs the trigger operation at the approximate position of that target object; a plurality of virtual objects within a range near the trigger position (i.e. the approximate position of the target object) are then obtained, the target objects meeting the preset condition are selected from these virtual objects, and an interaction control for each target object is displayed in the graphical user interface. Generally, one target object corresponds to one interaction control.
In a specific implementation, the preset condition may be set according to development requirements. For example, the preset condition may be selecting, from the plurality of virtual objects, the virtual object with the largest number of interactions with the controlled virtual object controlled by the player, selecting virtual objects whose interaction count exceeds a preset threshold, or selecting virtual objects with a designated identifier. A virtual object may be a non-player character (NPC) in the game, or a scene element, a monster, a building or the like in the game.
Step S106: in response to a trigger operation acting on the interaction control of a first object among the target objects, controlling execution of the interaction operation related to the first object.
The trigger operation acting on the interaction control of the first object may be a click, slide or drag operation on the interaction control of the first object. The interaction operation related to the first object may be selecting the first object, moving the controlled virtual object controlled by the player to the position of the first object, having a dialogue with the first object, dressing up the first object, and so on; the specific interaction operation can be set according to development requirements and is not limited here.
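A minimal sketch of how step S104 could be wired together is given below, assuming a simple object model; the class VirtualObject, the function on_map_trigger and the passed-in predicates are illustrative names for this sketch, not terms used by the patent:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class VirtualObject:
    identifier: str                        # object identifier displayed in the scene map
    map_position: Tuple[float, float]      # specified position of the identifier in the scene map
    scene_position: Tuple[float, float]    # matching position of the object in the game scene

def on_map_trigger(
    trigger_position: Tuple[float, float],
    virtual_objects: List[VirtualObject],
    in_target_area: Callable[[Tuple[float, float], Tuple[float, float]], bool],
    satisfies_preset_condition: Callable[[VirtualObject], bool],
) -> List[VirtualObject]:
    """Step S104 (sketch): keep the objects near the trigger position that meet the
    preset condition; an interaction control would then be displayed for each of them."""
    candidates = [o for o in virtual_objects
                  if in_target_area(trigger_position, o.map_position)]
    return [o for o in candidates if satisfies_preset_condition(o)]
```

The target-area test and the preset condition are passed in as predicates because the patent leaves both configurable (circular or square areas, interaction-count thresholds, designated identifiers, and so on).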
The in-game interaction control method provided by this embodiment of the invention displays, in response to a specified trigger operation, at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map and each specified position matches the position of the corresponding virtual object in the game scene; then, in response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition is determined from the virtual objects based on the trigger position of the trigger operation in the scene map, and an interaction control for each target object is displayed, the interaction control being used to control execution of an interaction operation related to the target object; finally, in response to a trigger operation acting on the interaction control of a first object among the target objects, the interaction operation related to the first object is executed. With this method, a player can determine the target objects among the virtual objects near a position simply by triggering that position in the scene map, so a target object can be selected quickly; moreover, the interaction controls of the target objects are displayed in the graphical user interface and the player can interact with a target object through its interaction control, which avoids accidental operations on other virtual objects.
An embodiment of the present invention further provides another in-game interaction control method, implemented on the basis of the above embodiment. This method mainly describes the specific process of determining, in response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map (implemented by steps S204 to S208 below). As shown in FIG. 2, the method comprises the following steps:
step S202, responding to the appointed trigger operation, displaying at least part of scene map of the game scene in the graphical user interface; the method comprises the following steps that an object identifier of a virtual object is displayed at a specified position in a scene map; the specified location matches the location of the virtual object in the game scene.
Step S204: in response to the trigger operation acting on the scene map, determining a target area in the scene map based on the trigger position of the trigger operation in the scene map.
In a specific implementation, an area of the scene map within a specified range of the trigger position may be determined as the target area, and the rule for determining the target area can be set according to development requirements. Specifically, a circular area in the scene map centered on the trigger position with a radius of a specified number of pixels may be determined as the target area. The specified number of pixels may be, for example, 33 pixels, 35 pixels, or another number of pixels.
In some embodiments, a square area centered on the trigger position with a side length of a specified number of pixels may instead be determined as the target area; alternatively, a polygonal area of a specified size centered on the trigger position may be determined as the target area.
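A minimal sketch of such a target-area test follows, assuming map positions are given in pixels; the function names and the default radius and side length are illustrative only:

```python
import math
from typing import Tuple

def in_circular_target_area(
    trigger_position: Tuple[float, float],
    object_position: Tuple[float, float],
    radius_px: float = 33.0,   # the "specified number of pixels", e.g. 33 or 35
) -> bool:
    """True if the object's map position falls inside the circular target area
    centered on the trigger position."""
    dx = object_position[0] - trigger_position[0]
    dy = object_position[1] - trigger_position[1]
    return math.hypot(dx, dy) <= radius_px

def in_square_target_area(
    trigger_position: Tuple[float, float],
    object_position: Tuple[float, float],
    side_px: float = 66.0,     # alternative square area, side length in pixels
) -> bool:
    """Alternative: a square target area centered on the trigger position."""
    half = side_px / 2.0
    return (abs(object_position[0] - trigger_position[0]) <= half
            and abs(object_position[1] - trigger_position[1]) <= half)
```

Either predicate could serve as the in_target_area argument of the earlier on_map_trigger sketch.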
In practical applications, the invention can be applied to computer games, and the trigger operation acting on the scene map includes clicking a trigger position in the scene map with the right mouse button. In some embodiments, the trigger position in the scene map may also be triggered by long-pressing the right or left mouse button.
Step S206: identifying, in the scene map, the virtual objects contained in the target area.
Step S208: determining at least one target object satisfying a preset condition from the virtual objects contained in the target area.
In a specific implementation, step S208 can be implemented by the following steps 10 and 11:
Step 10: determining the number of interactions between each virtual object contained in the target area and the controlled virtual object.
Specifically, historical game data corresponding to the controlled virtual object within a specified game duration may be obtained first; the number of interactions between the controlled virtual object and the virtual objects contained in the target area is then determined based on the historical game data. The specified game duration may be, for example, the most recent 100 hours or the most recent 120 hours of play, and its specific length can be set according to development requirements. The historical game data records the number of interactions between the controlled virtual object operated by the player and the virtual objects in the scene map, the content of each interaction, and so on.
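A minimal sketch of this counting step follows, assuming each record of the historical game data is a (timestamp, object identifier) pair; the record format and all names are assumptions for illustration, since the patent does not specify the data layout:

```python
from collections import Counter
from typing import Iterable, Tuple

def interaction_counts(
    history: Iterable[Tuple[float, str]],   # assumed (timestamp_hours, object_identifier) records
    now_hours: float,
    window_hours: float = 100.0,            # the specified game duration, e.g. the last 100 hours
) -> Counter:
    """Count interactions of the controlled virtual object with each virtual object
    within the specified game duration."""
    counts: Counter = Counter()
    for timestamp, identifier in history:
        if now_hours - timestamp <= window_hours:   # keep only records inside the window
            counts[identifier] += 1
    return counts
```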
Step 11: selecting, from the virtual objects contained in the target area, at least one target object whose interaction count satisfies the preset condition.
The preset condition may be selecting the virtual objects with the highest interaction counts, or selecting virtual objects whose interaction count reaches a preset threshold. In a specific implementation, step 11 can be implemented by the following steps 20 and 21:
Step 20: sorting the virtual objects in the target area that have interacted with the controlled virtual object in descending order of interaction count to obtain a ranking result.
The ranking result contains all the virtual objects that have interacted with the controlled virtual object, sorted in descending order of interaction count.
Step 21: selecting a specified number of top-ranked target objects from the ranking result, i.e. selecting the specified number of virtual objects with the highest interaction counts as target objects. The specified number may be set according to development requirements, for example 3 or 4.
In practical applications, if the number of virtual objects in the ranking result that have interacted with the controlled virtual object is less than the specified number, the number of interactions between the virtual objects contained in the target area and virtual objects other than the controlled virtual object is determined; the virtual objects in the target area that have interacted with virtual objects other than the controlled virtual object are sorted in descending order of interaction count, and a preset number of top-ranked target objects are selected, where the preset number plus the number of virtual objects in the ranking result that have interacted with the controlled virtual object equals the specified number. In this case, the selected preset number of target objects and the target objects in the ranking result that have interacted with the controlled virtual object can all be displayed in the graphical user interface, together with the interaction control corresponding to each target object, as shown in the sketch below.
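Steps 20 and 21 together with the supplement rule could look roughly like the following sketch; the function and parameter names are illustrative, the specified number defaults to 3 only as an example, and the Counter inputs could come from a counting step such as the interaction_counts sketch above:

```python
from collections import Counter
from typing import List

def select_target_objects(
    area_object_ids: List[str],   # identifiers of the virtual objects inside the target area
    own_counts: Counter,          # interactions with the controlled virtual object
    other_counts: Counter,        # interactions with virtual objects other than the controlled one
    specified_number: int = 3,
) -> List[str]:
    """Rank by descending interaction count and, if fewer than the specified number
    interacted with the controlled object, top up using interactions with other objects."""
    interacted = [i for i in area_object_ids if own_counts.get(i, 0) > 0]
    interacted.sort(key=lambda i: own_counts[i], reverse=True)   # most interactions first
    targets = interacted[:specified_number]

    if len(targets) < specified_number:                          # supplement rule
        remaining = [i for i in area_object_ids
                     if i not in targets and other_counts.get(i, 0) > 0]
        remaining.sort(key=lambda i: other_counts[i], reverse=True)
        targets += remaining[:specified_number - len(targets)]   # preset number of extra targets

    return targets
```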
Step S210: displaying the interaction control corresponding to each target object; the interaction control is used to control execution of an interaction operation related to the target object.
In a specific implementation, a drop-down box may be placed at the trigger position, the interaction control corresponding to each target object is displayed in the drop-down box, and each interaction control is labeled with the object identifier of its target object. FIG. 3 is a schematic diagram of the display of interaction controls according to an embodiment of the present invention. In FIG. 3, each non-player character is represented by a small figure with an "NPC" label displayed above its head; the overhead label indicates the object identifier of the non-player character. FIG. 3 shows the interaction controls of three target objects, namely the interaction control corresponding to NPC1, the interaction control corresponding to NPC2, and the interaction control corresponding to NPC3; NPC1, NPC2 and NPC3 are the object identifiers of the virtual objects corresponding to these three interaction controls.
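A minimal sketch of building such a drop-down box of labeled interaction controls might look like the following; the InteractionControl class and build_drop_down function are illustrative names, not part of the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class InteractionControl:
    label: str               # named with the object identifier of its target object, e.g. "NPC1"
    target_identifier: str   # identifier of the target object the control acts on

def build_drop_down(
    trigger_position: Tuple[float, float],
    target_identifiers: List[str],
) -> Tuple[Tuple[float, float], List[InteractionControl]]:
    """Anchor a drop-down box at the trigger position with one interaction control
    per target object (e.g. NPC1, NPC2, NPC3 as in FIG. 3)."""
    controls = [InteractionControl(label=i, target_identifier=i) for i in target_identifiers]
    return trigger_position, controls
```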
Step S212: in response to the trigger operation acting on the interaction control of a first object among the target objects, controlling execution of the interaction operation related to the first object.
The in-game interaction control method described above adopts an agile interaction mode: it shortens the time the player spends searching for a target object, and a target object with a high interaction count can be selected simply by clicking within a rough area. Because the interaction controls of the target objects are displayed in the graphical user interface, the player can accurately select the desired target object without picking it out on a map where object identifiers are dense and overlapping, which greatly reduces accidental selections.
An embodiment of the present invention further provides another in-game interaction control method, implemented on the basis of the above embodiments. This method mainly describes interaction control in the case where the virtual objects are non-player characters in the game. As shown in FIG. 4, the method comprises the following steps:
Step S402: in response to a specified trigger operation, displaying at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of non-player characters are displayed at specified positions in the scene map; each specified position matches the position of the corresponding non-player character in the game scene.
Step S404: in response to a trigger operation acting on the scene map, determining at least one target object satisfying a preset condition from the non-player characters based on the trigger position of the trigger operation in the scene map, and displaying an interaction control for each target object.
For the specific implementation of steps S402 and S404, reference may be made to the above embodiments; the only difference is that the virtual objects in the above embodiments are replaced by non-player characters, so the operations are the same and are not repeated here.
Step S406: in response to the trigger operation acting on the interaction control of a first object among the target objects, selecting the first object and controlling the controlled virtual object to move to the position of the first object.
The trigger operation acting on the interaction control of the first object may be a click operation or a slide operation on the interaction control of the first object. When the interaction control of the first object is triggered, the controlled virtual object can be controlled to automatically path-find to the position corresponding to the first object.
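A minimal sketch of this control-trigger handling, under the assumption of a simple controlled-object model with illustrative names (ControlledVirtualObject, move_to, on_interaction_control_trigger), might be:

```python
from typing import Optional, Tuple

class ControlledVirtualObject:
    """Minimal stand-in for the player's controlled virtual object."""

    def __init__(self) -> None:
        self.selected_target: Optional[str] = None

    def move_to(self, scene_position: Tuple[float, float]) -> None:
        ...  # automatic pathfinding to the given game-scene position

def on_interaction_control_trigger(
    controlled: ControlledVirtualObject,
    first_object_id: str,
    first_object_scene_position: Tuple[float, float],
) -> None:
    """Step S406 (sketch): select the first object and move the controlled virtual
    object to the first object's position."""
    controlled.selected_target = first_object_id
    controlled.move_to(first_object_scene_position)
```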
With the in-game interaction control method described above, the non-player characters within a certain area of the scene map can be filtered so that the non-player characters with the highest interaction counts are determined, which shortens the time the player spends searching for a particular non-player character; a frequently visited non-player character can be selected by clicking within a rough area. At the same time, the player can accurately select a frequently visited non-player character through its interaction control without picking it out on a map where non-player characters are dense and overlapping, which greatly reduces accidental selections.
Corresponding to the above method embodiments, an embodiment of the present invention provides an in-game interaction control apparatus. A graphical user interface is provided through a terminal device, and a scene picture of a game scene is displayed in the graphical user interface. As shown in FIG. 5, the apparatus includes:
a map display module 50, configured to display at least a partial scene map of the game scene in the graphical user interface in response to a specified trigger operation, wherein object identifiers of virtual objects are displayed at specified positions in the scene map; each specified position matches the position of the corresponding virtual object in the game scene;
a target object determining module 51, configured to determine, in response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and to display an interaction control for each target object; the interaction control is used to control execution of an interaction operation related to the target object;
an interaction control module 52, configured to control execution of the interaction operation related to a first object among the target objects in response to a trigger operation acting on the interaction control of the first object.
The in-game interaction control apparatus first displays, in response to a specified trigger operation, at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map and each specified position matches the position of the corresponding virtual object in the game scene; then, in response to a trigger operation acting on the scene map, determines at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and displays an interaction control for each target object, the interaction control being used to control execution of an interaction operation related to the target object; and finally, in response to a trigger operation acting on the interaction control of a first object among the target objects, controls execution of the interaction operation related to the first object. With this apparatus, a player can determine the target objects among the virtual objects near a position simply by triggering that position in the scene map, so a target object can be selected quickly; moreover, the interaction controls of the target objects are displayed in the graphical user interface and the player can interact with a target object through its interaction control, which avoids accidental operations on other virtual objects.
Specifically, the target object determining module 51 includes: an area determining unit, configured to determine, in response to the trigger operation acting on the scene map, a target area in the scene map based on the trigger position of the trigger operation in the scene map; an object recognition unit, configured to identify, in the scene map, the virtual objects contained in the target area; and an object determining unit, configured to determine at least one target object satisfying a preset condition from the virtual objects contained in the target area.
Further, the area determining unit is configured to determine, in the scene map, a circular area centered on the trigger position with a radius of a specified number of pixels as the target area.
Further, the object determining unit is configured to determine the number of interactions between each virtual object contained in the target area and the controlled virtual object, and to select, from the virtual objects contained in the target area, at least one target object whose interaction count satisfies the preset condition.
In a specific implementation, the object determining unit is further configured to sort the virtual objects in the target area that have interacted with the controlled virtual object in descending order of interaction count to obtain a ranking result, and to select a specified number of top-ranked target objects from the ranking result.
Further, the apparatus further comprises an object supplementing module, configured to: if the number of virtual objects in the ranking result that have interacted with the controlled virtual object is less than the specified number, determine the number of interactions between the virtual objects contained in the target area and virtual objects other than the controlled virtual object; sort, in descending order of interaction count, the virtual objects in the target area that have interacted with virtual objects other than the controlled virtual object, and select a preset number of top-ranked target objects; the preset number plus the number of virtual objects in the ranking result that have interacted with the controlled virtual object equals the specified number.
In practical applications, the object determining unit is further configured to obtain historical game data corresponding to the controlled virtual object within a specified game duration, and to determine, based on the historical game data, the number of interactions between the controlled virtual object and the virtual objects contained in the target area.
In a specific implementation, the trigger operation acting on the scene map includes clicking a trigger position in the scene map with the right mouse button.
In an alternative embodiment, the virtual object comprises: a non-player character in the game.
Further, the interaction control module 52 is configured to: in response to the trigger operation acting on the interaction control of a first object among the target objects, select the first object and control the controlled virtual object to move to the position of the first object.
The implementation principle and technical effects of the in-game interaction control apparatus provided by this embodiment of the invention are the same as those of the above method embodiments. For brevity, where the apparatus embodiment does not mention a detail, reference may be made to the corresponding content of the method embodiments.
An embodiment of the present invention further provides an electronic device. As shown in FIG. 6, the electronic device includes a processor and a memory, the memory stores machine-executable instructions that can be executed by the processor, and the processor executes the machine-executable instructions to implement the in-game interaction control method described above.
Specifically, a graphical user interface is provided through a terminal device, and a scene picture of a game scene is displayed in the graphical user interface. The in-game interaction control method comprises: in response to a specified trigger operation, displaying at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map and each specified position matches the position of the corresponding virtual object in the game scene; in response to a trigger operation acting on the scene map, determining at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and displaying an interaction control for each target object, the interaction control being used to control execution of an interaction operation related to the target object; and in response to a trigger operation acting on the interaction control of a first object among the target objects, controlling execution of the interaction operation related to the first object.
With this in-game interaction control method, a player can determine the target objects among the virtual objects near a position simply by triggering that position in the scene map, so a target object can be selected quickly; moreover, the interaction controls of the target objects are displayed in the graphical user interface and the player can interact with a target object through its interaction control, which avoids accidental operations on other virtual objects.
In an optional embodiment, the step of determining, in response to the trigger operation acting on the scene map, at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map includes: in response to the trigger operation acting on the scene map, determining a target area in the scene map based on the trigger position of the trigger operation in the scene map; identifying, in the scene map, the virtual objects contained in the target area; and determining at least one target object satisfying the preset condition from the virtual objects contained in the target area.
In an optional embodiment, the step of determining the target area in the scene map based on the trigger position of the trigger operation in the scene map includes: determining, in the scene map, a circular area centered on the trigger position with a radius of a specified number of pixels as the target area.
In an optional embodiment, the step of determining at least one target object satisfying the preset condition from the virtual objects contained in the target area includes: determining the number of interactions between each virtual object contained in the target area and the controlled virtual object; and selecting, from the virtual objects contained in the target area, at least one target object whose interaction count satisfies the preset condition.
In an optional embodiment, the step of selecting at least one target object whose interaction count satisfies the preset condition from the virtual objects contained in the target area includes: sorting the virtual objects in the target area that have interacted with the controlled virtual object in descending order of interaction count to obtain a ranking result; and selecting a specified number of top-ranked target objects from the ranking result.
In an optional embodiment, the method further comprises: if the number of virtual objects in the ranking result that have interacted with the controlled virtual object is less than the specified number, determining the number of interactions between the virtual objects contained in the target area and virtual objects other than the controlled virtual object; sorting, in descending order of interaction count, the virtual objects in the target area that have interacted with virtual objects other than the controlled virtual object, and selecting a preset number of top-ranked target objects; the preset number plus the number of virtual objects in the ranking result that have interacted with the controlled virtual object equals the specified number.
In an optional embodiment, the step of determining the number of interactions between the virtual objects contained in the target area and the controlled virtual object includes: obtaining historical game data corresponding to the controlled virtual object within a specified game duration; and determining, based on the historical game data, the number of interactions between the controlled virtual object and the virtual objects contained in the target area.
In an optional embodiment, the trigger operation acting on the scene map includes clicking a trigger position in the scene map with the right mouse button.
In an optional embodiment, the virtual objects include non-player characters in the game.
In an optional embodiment, the step of controlling execution of the interaction operation related to the first object in response to the trigger operation acting on the interaction control of the first object among the target objects includes: in response to the trigger operation acting on the interaction control of the first object among the target objects, selecting the first object and controlling the controlled virtual object to move to the position of the first object.
Further, the electronic device shown in FIG. 6 also includes a bus 102 and a communication interface 103; the processor 101, the communication interface 103 and the memory 100 are connected through the bus 102.
The memory 100 may include a random access memory (RAM) and a non-volatile memory, such as at least one disk memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 103 (which may be wired or wireless), using the Internet, a wide area network, a local area network, a metropolitan area network, or the like. The bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one double-headed arrow is shown in FIG. 6, but this does not mean that there is only one bus or one type of bus.
The processor 101 may be an integrated circuit chip with signal processing capabilities. During implementation, the steps of the above method may be performed by integrated hardware logic circuits or by software instructions in the processor 101. The processor 101 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in connection with the embodiments of the present invention may be carried out directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may be located in RAM, flash memory, ROM, PROM, EPROM, registers or other storage media well known in the art. The storage medium is located in the memory 100; the processor 101 reads the information in the memory 100 and completes the steps of the above method in combination with its hardware.
An embodiment of the present invention further provides a computer-readable storage medium that stores computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the in-game interaction control method described above.
Specifically, a graphical user interface is provided through a terminal device, and a scene picture of a game scene is displayed in the graphical user interface. The in-game interaction control method comprises: in response to a specified trigger operation, displaying at least a partial scene map of the game scene in the graphical user interface, wherein object identifiers of virtual objects are displayed at specified positions in the scene map and each specified position matches the position of the corresponding virtual object in the game scene; in response to a trigger operation acting on the scene map, determining at least one target object satisfying a preset condition from the virtual objects based on the trigger position of the trigger operation in the scene map, and displaying an interaction control for each target object, the interaction control being used to control execution of an interaction operation related to the target object; and in response to a trigger operation acting on the interaction control of a first object among the target objects, controlling execution of the interaction operation related to the first object.
With this in-game interaction control method, a player can determine the target objects among the virtual objects near a position simply by triggering that position in the scene map, so a target object can be selected quickly; moreover, the interaction controls of the target objects are displayed in the graphical user interface and the player can interact with a target object through its interaction control, which avoids accidental operations on other virtual objects.
In an optional embodiment, the step of determining, in response to the trigger operation applied to the scene map, at least one target object satisfying a preset condition from the virtual objects based on a trigger position of the trigger operation in the scene map includes: responding to a trigger operation acting on the scene map, and determining a target area in the scene map based on a trigger position of the trigger operation in the scene map; in a scene map, identifying virtual objects contained in a target area; at least one target object meeting a preset condition is determined from the virtual objects contained in the target area.
In an optional embodiment, the step of determining the target area in the scene map based on the trigger position of the trigger operation in the scene map includes: in the scene map, a circular area with the triggering position as the center of a circle and the designated number of pixels as the radius is determined as a target area.
In an optional embodiment, the step of determining at least one target object satisfying a preset condition from among the virtual objects included in the target area includes: determining the interaction times of the virtual object contained in the target area and the controlled virtual object; and selecting at least one target object with the interaction times meeting preset conditions from the virtual objects contained in the target area.
In an optional embodiment, the step of selecting at least one target object whose interaction times satisfy a preset condition from the virtual objects included in the target area includes: sequencing virtual objects which have interaction with the controlled virtual object in the virtual objects contained in the target area according to the sequence of the interaction times from a few to obtain a sequencing result; and selecting a specified number of target objects ranked in the top from the ranking result.
In an alternative embodiment, the method further comprises: if the number of the virtual objects which have interaction with the controlled virtual objects in the sequencing result is less than the specified number, determining the interaction times of the virtual objects contained in the target area and the virtual objects except the controlled virtual objects; sequencing the virtual objects which are contained in the target area and interact with the virtual objects except the controlled virtual object according to the sequence of the interaction times, and selecting a preset number of target objects which are sequenced in the front; the preset number and the number of the virtual objects which have interaction with the controlled virtual objects in the sequencing result are equal to the specified number.
In an optional implementation manner, the step of determining the number of interactions between the virtual objects contained in the target area and the controlled virtual object includes: obtaining historical game data corresponding to the controlled virtual object within a specified game duration; and determining, based on the historical game data, the number of interactions between the controlled virtual object and the virtual objects contained in the target area.
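A sketch of deriving these counts from historical game data follows; the record layout (dicts with `timestamp`, `actor`, and `target` fields) is an assumption for illustration. Records older than the specified game duration are ignored, and the remaining records involving the controlled object are tallied per candidate object.

```python
# Sketch: count interactions between the controlled object and each candidate
# from timestamped history records. The record format is an assumption.
from collections import Counter

def count_interactions(history, controlled_id, candidate_ids, now, duration_s=3600):
    """history: iterable of dicts like
    {"timestamp": 1693465200, "actor": "player_1", "target": "npc_a"}."""
    cutoff = now - duration_s
    counts = Counter()
    for record in history:
        if record["timestamp"] < cutoff:
            continue  # outside the specified game duration
        if record["actor"] == controlled_id and record["target"] in candidate_ids:
            counts[record["target"]] += 1
    return counts

history = [
    {"timestamp": 1000, "actor": "player_1", "target": "npc_a"},
    {"timestamp": 1500, "actor": "player_1", "target": "npc_a"},
    {"timestamp": 1600, "actor": "player_1", "target": "npc_b"},
]
print(count_interactions(history, "player_1", {"npc_a", "npc_b"}, now=2000, duration_s=800))
# Counter({'npc_a': 1, 'npc_b': 1}) -- the record at t=1000 falls outside the 800 s window
```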
In an optional embodiment, the trigger operation acting on the scene map includes: clicking the trigger position in the scene map with the right mouse button.
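For a mouse-driven client this check is straightforward. As one possible sketch, in a pygame-based prototype (pygame is only an assumed input layer, not required by the method) a right-button click inside the scene map region could be detected as follows:

```python
# Sketch: treat a right mouse button click inside the scene map rectangle
# as the trigger operation. pygame is an assumed input layer.
import pygame

def poll_map_trigger(map_rect: pygame.Rect):
    """Return the trigger position in map-local coordinates, or None."""
    for event in pygame.event.get():
        if event.type == pygame.MOUSEBUTTONDOWN and event.button == 3:  # right button
            if map_rect.collidepoint(event.pos):
                # Convert window coordinates to map-local coordinates.
                return event.pos[0] - map_rect.x, event.pos[1] - map_rect.y
    return None
```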
In an alternative embodiment, the virtual object includes: a non-player character in the game.
In an alternative embodiment, the step of controlling the execution of the interaction operation related to the first object in response to the trigger operation of the interaction control acting on the first object among the target objects includes: in response to the trigger operation acting on the interaction control of the first object among the target objects, selecting the first object and controlling the controlled virtual object to move to the position of the first object.
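A sketch of this response follows; the object layout, movement speed, and per-frame stepping are assumptions used only to illustrate "move to the position of the first object".

```python
# Sketch: on triggering the first object's interaction control, mark it as
# selected and step the controlled object toward it each frame. All names
# and the constant speed are illustrative assumptions.
import math

def on_interaction_control_triggered(controlled, first_object):
    controlled["selected_target"] = first_object["id"]

def step_toward(controlled, first_object, speed=2.0):
    """Move the controlled object one step toward the first object's position."""
    dx = first_object["x"] - controlled["x"]
    dy = first_object["y"] - controlled["y"]
    dist = math.hypot(dx, dy)
    if dist <= speed:                      # close enough: snap to the target position
        controlled["x"], controlled["y"] = first_object["x"], first_object["y"]
    else:
        controlled["x"] += speed * dx / dist
        controlled["y"] += speed * dy / dist

controlled = {"id": "player_1", "x": 0.0, "y": 0.0}
npc = {"id": "npc_blacksmith", "x": 6.0, "y": 8.0}
on_interaction_control_triggered(controlled, npc)
for _ in range(5):
    step_toward(controlled, npc)
print(controlled)  # after five steps of length 2 the controlled object reaches the npc at (6.0, 8.0)
```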
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal device, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplification of description, but do not indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present invention, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and shall be construed as falling within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (13)

1. An interactive control method in a game is characterized in that a terminal device provides a graphical user interface, and a scene picture of a game scene is displayed in the graphical user interface; the method comprises the following steps:
in response to a specified trigger operation, displaying at least a partial scene map of the game scene in the graphical user interface; wherein an object identifier of a virtual object is displayed at a designated position in the scene map; the designated position matches the position of the virtual object in the game scene;
in response to a trigger operation acting on the scene map, determining at least one target object satisfying a preset condition from the virtual objects based on a trigger position of the trigger operation in the scene map, and displaying an interaction control of the target object; the interaction control is used for controlling the execution of the interaction operation related to the target object;
and in response to a trigger operation acting on the interaction control of a first object among the target objects, controlling the execution of the interaction operation related to the first object.
2. The method according to claim 1, wherein the step of determining at least one target object satisfying a preset condition from the virtual objects based on a trigger position of the trigger operation in the scene map in response to the trigger operation acting on the scene map comprises:
in response to a trigger operation acting on the scene map, determining a target area in the scene map based on a trigger position of the trigger operation in the scene map;
in the scene map, identifying a virtual object contained in the target area;
and determining at least one target object meeting preset conditions from the virtual objects contained in the target area.
3. The method of claim 2, wherein the step of determining a target area in the scene map based on the trigger location of the trigger operation in the scene map comprises:
determining, in the scene map, a circular area with the trigger position as its center and a designated number of pixels as its radius, and taking the circular area as the target area.
4. The method according to claim 2, wherein the step of determining at least one target object satisfying a preset condition from among the virtual objects included in the target area comprises:
determining the number of interactions between each virtual object contained in the target area and the controlled virtual object;
and selecting, from the virtual objects contained in the target area, at least one target object whose number of interactions satisfies the preset condition.
5. The method according to claim 4, wherein the step of selecting at least one target object whose number of interactions satisfies the preset condition from the virtual objects contained in the target area comprises:
sorting the virtual objects in the target area that have interacted with the controlled virtual object in descending order of the number of interactions to obtain a sorting result;
and selecting a specified number of top-ranked target objects from the sorting result.
6. The method of claim 5, further comprising:
if the number of virtual objects in the sorting result that have interacted with the controlled virtual object is less than the specified number, determining the number of interactions between the virtual objects contained in the target area and virtual objects other than the controlled virtual object;
and sorting the virtual objects in the target area that have interacted with virtual objects other than the controlled virtual object in descending order of the number of interactions, and selecting a preset number of top-ranked target objects; wherein the sum of the preset number and the number of virtual objects in the sorting result that have interacted with the controlled virtual object equals the specified number.
7. The method of claim 4, wherein the step of determining the number of interactions between the virtual objects contained in the target area and the controlled virtual object comprises:
obtaining historical game data corresponding to the controlled virtual object within a specified game duration;
and determining, based on the historical game data, the number of interactions between the controlled virtual object and the virtual objects contained in the target area.
8. The method according to claim 1 or 2, wherein the trigger operation acting on the scene map comprises: clicking the trigger position in the scene map with the right mouse button.
9. The method of claim 1, wherein the virtual object comprises: a non-player character in the game.
10. The method according to claim 9, wherein the step of controlling the execution of the interaction operation related to the first object in response to the trigger operation of the interaction control acting on the first object among the target objects comprises:
in response to the trigger operation acting on the interaction control of the first object among the target objects, selecting the first object and controlling the controlled virtual object to move to the position of the first object.
11. An interactive control device in a game is characterized in that a terminal device provides a graphical user interface, and a scene picture of a game scene is displayed in the graphical user interface; the device comprises:
a map display module, for displaying at least a partial scene map of the game scene in the graphical user interface in response to a specified trigger operation; wherein an object identifier of a virtual object is displayed at a designated position in the scene map; the designated position matches the position of the virtual object in the game scene;
a target object determination module, for determining, in response to a trigger operation acting on the scene map, at least one target object satisfying a preset condition from the virtual objects based on a trigger position of the trigger operation in the scene map, and displaying an interaction control of the target object; the interaction control is used for controlling the execution of the interaction operation related to the target object;
and an interaction control module, for controlling the execution of the interaction operation related to the first object in response to a trigger operation acting on the interaction control of a first object among the target objects.
12. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor, the processor executing the machine executable instructions to implement the in-game interaction control method of any one of claims 1 to 10.
13. A computer-readable storage medium, characterized in that it stores computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the in-game interaction control method of any one of claims 1 to 10.
CN202211055391.8A 2022-08-31 2022-08-31 Interactive control method and device in game and electronic equipment Pending CN115591231A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211055391.8A CN115591231A (en) 2022-08-31 2022-08-31 Interactive control method and device in game and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211055391.8A CN115591231A (en) 2022-08-31 2022-08-31 Interactive control method and device in game and electronic equipment

Publications (1)

Publication Number Publication Date
CN115591231A true CN115591231A (en) 2023-01-13

Family

ID=84843494

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211055391.8A Pending CN115591231A (en) 2022-08-31 2022-08-31 Interactive control method and device in game and electronic equipment

Country Status (1)

Country Link
CN (1) CN115591231A (en)

Similar Documents

Publication Publication Date Title
CN110841291A (en) Method and device for interacting shortcut messages in game and electronic equipment
CN111437598B (en) Interactive method and device for tactical plan in game and electronic equipment
CN112807686A (en) Game fighting method and device and electronic equipment
CN111054074A (en) Method and device for moving virtual object in game and electronic equipment
CN113082700A (en) Information interaction method and device and electronic equipment
CN112619124A (en) Control method and device for game object movement and electronic equipment
CN113546412B (en) Display control method and device in game and electronic equipment
CN115487498A (en) Game display control method and device and electronic equipment
CN113750522A (en) Game skill processing method and device and electronic equipment
CN115591231A (en) Interactive control method and device in game and electronic equipment
CN116966561A (en) Tactical command method and device in game and electronic equipment
CN115671735A (en) Object selection method and device in game and electronic equipment
CN115738230A (en) Game operation control method and device and electronic equipment
CN112494945B (en) Game scene conversion method and device and electronic equipment
CN111841003B (en) Information processing method and device in game and electronic equipment
CN109725809B (en) Information processing method, server, terminal and computer storage medium
CN115501604A (en) Game interaction method, device, equipment and storage medium
CN115721936A (en) Interface control method, device, equipment and storage medium
CN115501603A (en) Game interaction method, device, equipment and storage medium
CN116115999A (en) Land parcel searching method and device in game and electronic equipment
CN115382207A (en) Indication identifier display method and device and electronic equipment
CN115317903A (en) Interactive control method and device in game and electronic equipment
CN116421973A (en) Game path finding method, game path finding device, electronic equipment and machine-readable storage medium
CN117205552A (en) Interactive control method and device for game and electronic equipment
CN117244238A (en) Game update picture display control method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination