CN115317912A - Game control method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115317912A
Authority
CN
China
Prior art keywords
virtual
scene
area
materials
goods
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210968915.6A
Other languages
Chinese (zh)
Inventor
傅婧仪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210968915.6A priority Critical patent/CN115317912A/en
Priority to PCT/CN2022/128977 priority patent/WO2024031849A1/en
Publication of CN115317912A publication Critical patent/CN115317912A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5372 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets


Abstract

The application provides a game control method, a game control device, an electronic device, and a storage medium. The method includes: in response to a virtual battle task satisfying a preset condition, controlling virtual supplies to be released in a virtual scene; determining a target scene area in the virtual scene corresponding to the released virtual supplies; and, in response to a virtual object moving into the target scene area, displaying an orientation guide identifier on the graphical user interface and controlling the direction indicated by the identifier according to the target position of the virtual supplies in the virtual scene. Because the orientation guide identifier is displayed after the virtual object moves into the target scene area, the player can quickly locate the virtual supplies in the virtual scene by following the displayed identifier, which speeds up the search for virtual supplies and improves human-computer interaction efficiency.

Description

Game control method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a game control method and apparatus, an electronic device, and a storage medium.
Background
In some shooting games, a virtual vehicle (such as a helicopter) drops airdrop supplies into the virtual scene along a fixed route and at fixed time intervals. After landing, the airdropped supplies generate a special identifier in the virtual scene, and a position identifier corresponding to the position of the airdropped supplies is generated on the scene thumbnail map, so that a player can find and loot the airdropped supplies according to the position identifier and the special identifier in the virtual scene.
However, the scene thumbnail map displays not only the position identifiers corresponding to the airdropped supplies but also the object identifiers corresponding to a plurality of virtual objects. With so many identifiers crowded into the small scene thumbnail map, the player's view of the relative position between the identifier of the virtual object he or she controls and the position identifier of the airdropped supplies is easily blocked, and the display range of the special identifier of the airdropped supplies in the virtual scene is limited. As a result, the player has to check the position identifier displayed in the scene thumbnail map while controlling the virtual object to move toward the airdropped supplies, which is inconvenient to operate; because the view is easily blocked, the player may fail to steer the virtual object accurately to the supplies, which ultimately lowers the efficiency of finding airdropped supplies and reduces human-computer interaction efficiency.
Disclosure of Invention
In view of this, an object of the present application is to provide a game control method, apparatus, electronic device, and storage medium in which, after the virtual object moves into a target scene area, an orientation guide identifier is displayed on the graphical user interface. The displayed identifier helps the player quickly find the virtual supplies in the virtual scene, which speeds up the search for virtual supplies and in turn improves human-computer interaction efficiency.
In a first aspect, an embodiment of the present application provides a game control method, in which a terminal device provides a graphical user interface, and a virtual scene of a current virtual battle task is displayed on the graphical user interface, where the method includes:
in response to the virtual battle task satisfying a preset condition, controlling virtual supplies to be released in the virtual scene;
determining a target scene area in the virtual scene corresponding to the released virtual supplies;
and, in response to a virtual object moving into the target scene area, displaying an orientation guide identifier on the graphical user interface and controlling the direction indicated by the orientation guide identifier according to the target position of the virtual supplies in the virtual scene.
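The claimed control of the identifier's indication direction can be illustrated with a minimal vector computation. The sketch below is not part of the patent; the function name and the angle convention (degrees, 0 = east, counter-clockwise) are illustrative assumptions:

```python
import math

def guide_angle(object_pos, target_pos):
    """Direction (degrees, 0 = east, counter-clockwise) in which an
    orientation guide identifier should point, from the virtual object's
    current position toward the target position of the supplies."""
    dx = target_pos[0] - object_pos[0]
    dy = target_pos[1] - object_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

Recomputing this angle every frame as the virtual object moves keeps the identifier pointing at the supplies.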
In an alternative embodiment of the application, the target location is located within the target scene area.
In an optional embodiment of the present application, the step of displaying the orientation guide identifier on the graphical user interface in response to the virtual object moving into the target scene area includes:
in response to the virtual object moving into the target scene area, detecting the distance between the virtual object and the virtual supplies in the target scene area;
and, in response to the distance between the virtual object and the virtual supplies in the target scene area being smaller than a preset distance threshold, displaying the orientation guide identifier on the graphical user interface.
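A minimal sketch of this distance gate follows; the 50-unit default threshold and all names are illustrative assumptions, not values from the patent:

```python
def should_show_guide(object_pos, supply_pos, in_target_area, threshold=50.0):
    """Show the orientation guide identifier only once the virtual object
    has entered the target scene area AND is within `threshold` scene
    units of the supplies."""
    if not in_target_area:
        return False
    dx = supply_pos[0] - object_pos[0]
    dy = supply_pos[1] - object_pos[1]
    # Euclidean distance compared against the preset threshold
    return (dx * dx + dy * dy) ** 0.5 < threshold
```

The identifier is shown only when both conditions hold, so players outside the area, or still far from the supplies, keep an uncluttered interface.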
In an optional embodiment of the present application, the method further comprises:
displaying a supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area, within the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, displaying, in the map area corresponding to the target scene area within the scene thumbnail map displayed in the graphical user interface, the supply area identifier corresponding to the virtual supplies includes:
displaying, in an overlaid manner, a supply area identifier of a preset area corresponding to the virtual supplies in the map area corresponding to the target scene area within the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, the step of controlling virtual supplies to be released in the virtual scene includes:
generating the virtual supplies at a preset height in the virtual scene;
and controlling the virtual supplies to move from the preset height to the target position in the virtual scene.
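The two steps above can be sketched as a vertical descent interpolated over time; the linear motion and the 10-second fall duration are illustrative assumptions, not parameters given by the patent:

```python
def supply_position(spawn_height, target_pos, elapsed, fall_duration=10.0):
    """Position of an airdropped supply crate descending vertically from a
    preset spawn height to its landing point.  `target_pos` is the (x, y, z)
    landing position; the descent is a simple linear interpolation."""
    t = min(max(elapsed / fall_duration, 0.0), 1.0)  # clamp to [0, 1]
    x, y, z_ground = target_pos
    z = spawn_height + (z_ground - spawn_height) * t
    return (x, y, z)
```

Evaluating this each frame moves the crate from the spawn height down to the target position, after which it stays on the ground.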
In an optional embodiment of the present application, the step of displaying, in the map area corresponding to the target scene area within the scene thumbnail map displayed in the graphical user interface, the supply area identifier corresponding to the virtual supplies includes:
while the virtual supplies are being controlled to move from the preset height to the target position in the virtual scene, displaying the supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area within the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, the method further comprises:
in response to the virtual supplies having moved from the preset height to the target position in the virtual scene, controlling the supply area identifier to be updated to a supply location identifier displayed at the position in the scene thumbnail map corresponding to the target position.
In an optional embodiment of the present application, the method further comprises:
displaying, in a visually distinguished manner, the supply area identifiers corresponding to the virtual supplies shown in the map areas corresponding to different target scene areas in the scene thumbnail map, where different target scene areas are used for releasing virtual supplies of different grades.
In an optional embodiment of the present application, the method further comprises:
displaying a timing control on the graphical user interface, where the timing control performs a countdown display between the moment the virtual supplies are generated at the preset height of the virtual scene and the moment they reach the target position in the virtual scene;
and adjusting, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies shown in the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, adjusting, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies shown in the map area corresponding to the target scene area in the scene thumbnail map includes:
filling, with a target color and according to the countdown progress displayed in the timing control, the supply area identifier corresponding to the virtual supplies shown in the map area corresponding to the target scene area in the scene thumbnail map, where the countdown progress is positively correlated with the filled area of the supply area identifier, and the target color is determined according to the grade of the virtual supplies released in the target scene area.
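One way to realize this fill behavior is to map the elapsed share of the countdown to a fill fraction and pick the color by supply grade. The palette and names below are illustrative assumptions, not values from the patent:

```python
def area_marker_fill(time_left, total_time, grade):
    """Return (fill_fraction, color) for the supply area identifier.

    The fill fraction grows from 0 to 1 as the drop countdown elapses
    (positively correlated with countdown progress), and the fill color
    is chosen by the grade of the supplies in the target scene area."""
    colors = {1: "white", 2: "blue", 3: "gold"}  # assumed grade palette
    elapsed = max(total_time - time_left, 0.0)
    fraction = min(elapsed / total_time, 1.0)
    return fraction, colors.get(grade, "white")
```

At half the countdown a grade-3 marker would be half filled with the grade's color, reaching a full fill when the supplies land.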
In an optional embodiment of the present application, the method further comprises:
displaying a virtual object identifier corresponding to the virtual object in a scene thumbnail map displayed in the graphical user interface;
wherein the position of the virtual object identifier in the scene thumbnail map is updated in real time according to the movement of the virtual object in the virtual scene.
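The real-time marker update can be sketched as a world-to-minimap coordinate mapping evaluated each frame; the rectangle convention (origin plus size, in pixels) is an illustrative assumption:

```python
def to_minimap(world_pos, world_size, minimap_rect):
    """Map an (x, y) world-space position into minimap pixel coordinates
    so the virtual-object identifier can be redrawn each frame as the
    object moves through the virtual scene."""
    mx, my, mw, mh = minimap_rect      # minimap origin and size in pixels
    u = world_pos[0] / world_size[0]   # normalized 0..1 across the scene
    v = world_pos[1] / world_size[1]
    return (mx + u * mw, my + v * mh)
```

Calling this with the object's latest position keeps its identifier in the correct spot on the scene thumbnail map.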
In an optional embodiment of the present application, the method further comprises:
in response to the virtual supplies in the target scene area being picked up, controlling the supply area identifier corresponding to the virtual supplies to be deleted from the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, the preset condition includes one of the following:
the game progress of the virtual battle task reaching a preset time point;
a virtual object in the virtual battle task executing a virtual vehicle summoning event, where the virtual vehicle is used for dropping virtual supplies into the virtual scene;
the number of virtual objects surviving in the virtual battle task reaching a set number.
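A hedged sketch of checking these three example conditions follows; interpreting "reaches the set number" as the survivor count falling to or below a cap is an assumption, as are all the names:

```python
def drop_condition_met(elapsed, trigger_time, vehicle_summoned,
                       survivors, survivor_cap):
    """True when any of the three example preset conditions for releasing
    virtual supplies holds."""
    return (
        elapsed >= trigger_time       # game progress reaches a preset time point
        or vehicle_summoned           # a virtual vehicle summoning event occurred
        or survivors <= survivor_cap  # survivors fell to the set number (assumed <=)
    )
```

A game loop would evaluate this each tick and, on the first True result, trigger the supply release described above.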
In a second aspect, an embodiment of the present application provides a game control apparatus, which provides a graphical user interface through a terminal device, where a virtual scene of a virtual battle mission is displayed on the graphical user interface, and the apparatus includes:
a supply control module, configured to control virtual supplies to be released in the virtual scene in response to the virtual battle task satisfying a preset condition;
an area determining module, configured to determine a target scene area in the virtual scene corresponding to the released virtual supplies;
and an orientation guiding module, configured to display an orientation guide identifier on the graphical user interface in response to the virtual object moving into the target scene area, and to control the direction indicated by the orientation guide identifier according to the target position of the virtual supplies in the virtual scene.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is operating, the machine-readable instructions when executed by the processor performing the steps of the game control method as described above.
In a fourth aspect, the present application further provides a computer-readable storage medium, where a computer program is stored, and the computer program is executed by a processor to perform the steps of the game control method as described above.
The technical effects of the solutions provided in the embodiments of the present application include at least the following:
After virtual supplies are released into the virtual scene, the target scene area corresponding to the released supplies is determined, and once the virtual object moves into that area an orientation guide identifier is displayed on the graphical user interface to guide the virtual object from its current position to the virtual supplies. The player can therefore steer the virtual object directly and accurately toward the supplies by following the identifier, without having to check the position identifier in the scene thumbnail map while navigating. The operation is convenient, the player's view is not obscured by too many identifiers on the graphical user interface, and the efficiency of finding airdropped supplies improves. Because the player can locate the supplies via the guide identifier after entering the target scene area, the speed of searching for virtual supplies in the virtual scene increases and human-computer interaction efficiency is improved.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should therefore not be regarded as limiting its scope; for those skilled in the art, other related drawings can be obtained from them without inventive effort.
Fig. 1 is a flowchart of a game control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 3 is a flow chart of another game control method provided in the embodiments of the present application;
FIG. 4 is a schematic diagram of another graphical user interface provided by an embodiment of the present application;
FIG. 5 is a flow chart of another game control method provided in the embodiments of the present application;
FIG. 6 is a schematic structural diagram of a game control device according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of another game control apparatus according to an embodiment of the present disclosure;
FIG. 8 is a schematic structural diagram of another game control device provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, as generally described and illustrated in the figures herein, could be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. Every other embodiment that can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present application falls within the protection scope of the present application.
First, the terms used in the embodiments of the present application are briefly introduced:
(1) Terminal device
The terminal device referred to in the embodiments of the present application mainly means a terminal that provides a graphical user interface and can perform control operations on a virtual object. The terminal device may be the local terminal device mentioned below, or a client device in a cloud interaction system. The terminal device may include, but is not limited to, any one of the following: notebook computers, smart phones, tablet computers, desktop computers, game consoles, MP4 players, Personal Digital Assistants (PDAs), e-book readers, and the like. An application program supporting a game, such as a three-dimensional or two-dimensional game, is installed and runs on the terminal device. In the embodiments of the present application the application program is introduced as a game application; optionally, it may be a standalone application, such as a standalone 3D game program, or an online application.
(2) Graphic User Interface (GUI)
A graphical user interface is a display format for human-computer communication. It allows a user to manipulate icons, logos, or menu options on a screen with an input device such as a mouse or keyboard, or by touch operations on the touch screen of a touch terminal, in order to select a command, start a program, or perform some other task.
The graphical user interface in the embodiment of the application can provide or display the picture corresponding to the application program, and the picture comprises a UI (user interface) and a game picture for a player to interact. In alternative embodiments, game controls (e.g., skill controls, movement controls, functionality controls, etc.), indicators (e.g., directional indicators, character indicators, etc.), information presentation areas (e.g., number of clicks, game play time, etc.), or game setting controls (e.g., system settings, stores, coins, etc.) may be included in the UI interface. In an optional embodiment, the game screen is a display screen corresponding to a virtual scene displayed by the terminal device, and the game screen may include virtual objects such as a game character, an NPC character, and an AI character that execute a game logic in the virtual scene.
(3) Virtual scene
A virtual scene is a game scene that an application program displays (or provides) when running on a terminal or server, that is, the scene used during normal game play. In other words, the virtual scene is the virtual carrier that bears the virtual objects during the game; in it, a virtual object can be controlled by operation instructions issued by the user (i.e., the player) to the terminal device to perform actions such as moving and releasing skills. Optionally, the game scene may be a simulated environment of the real world, a semi-simulated semi-fictional virtual environment, or a purely fictional virtual environment. The game scene may be any one of a two-dimensional, 2.5-dimensional, or three-dimensional virtual scene, and the virtual environment may be sky, land, sea, and so on, where the land may include environmental elements such as deserts and cities. The game scene is the scene in which the user controls the virtual object through the complete game logic. Optionally, the game scene may also host a battle between at least two virtual objects, with virtual resources in the scene available for their use. Illustratively, a game scene may include any one or more of the following elements: game background elements, game object elements, game prop elements, game supply elements, and the like.
(4) Virtual object
A virtual object is a dynamic object that can be controlled in the virtual scene. Optionally, the dynamic object may be a virtual character, an animated character, or the like. A virtual object is either a character that a player controls through an input device, or a trained Artificial Intelligence (AI) placed in the virtual-environment battle. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a virtual-scene battle is preset, or determined dynamically according to the number of clients joining the battle, which is not limited in the embodiments of the present application. In one possible implementation, the user can control the virtual object to move in the virtual scene, e.g., to run, jump, or crawl, and can also control it to fight other virtual objects using the skills, virtual props, and so on provided by the application. Optionally, when the virtual environment is three-dimensional, each virtual object is a three-dimensional model with its own shape and volume, occupying part of the space of the three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional character built on three-dimensional human-skeleton technology that presents different appearances by wearing different skins. In some implementations, the virtual object may instead be implemented with a 2.5-dimensional or 2-dimensional model, which is not limited in the embodiments of the present application.
(5) Virtual battle mission
The virtual battle task includes a virtual scene, a virtual object controlled by the player in that scene, and other virtual objects. The virtual object is controlled by the player through a first terminal device, while the other virtual objects are controlled by other players taking part in the same current virtual battle task; the virtual object and the other virtual objects belong to the same virtual battle.
The game control method provided by the embodiment of the application can be operated on the local terminal equipment or the server. When the game control method is run on a server, the game control method can be implemented and executed based on a cloud interactive system, wherein the cloud interactive system comprises the server and the client device.
In an optional embodiment, various cloud applications, for example cloud games, may run under the cloud interaction system. Taking a cloud game as an example: a cloud game is a game mode based on cloud computing. In this mode, the body that runs the game program is separated from the body that presents the game picture; the storage and execution of the game control method are completed on a cloud game server, while the client device is used to receive and send data and to present the game picture. The client device may be a display device with data transmission capability close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer, but the game control is performed by the cloud game server. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to the instructions, encodes and compresses the game pictures and other data, and returns them over the network to the client device, which decodes them and outputs the game picture.
In an optional implementation, taking a game as an example, the local terminal device stores the game program and presents the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, it may be rendered on the display screen of the terminal, or provided to the player through holographic projection. Illustratively, the local terminal device may include a display screen for presenting the graphical user interface, including the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
Next, an application scenario to which the present application is applicable is described. The application can be applied to the technical field of games, for example shooting games. In the related games, players compete with each other within the limited map and resources of the virtual scene, collecting supplies such as various firearms and the scopes that upgrade them, equipment and medicine such as helmets, backpacks, and flash grenades, and airdropped supplies. Airdropped supplies can be obtained in two ways: first, a virtual vehicle (such as a helicopter) is summoned with a summoning prop equipped by the virtual object (such as a signal flare), and the vehicle drops the airdropped supplies at the virtual object's position; second, during the game the virtual vehicle drops airdropped supplies into the virtual environment along a fixed route and at fixed time intervals.
Illustratively, the related game may proceed as follows: in response to the number of players who have completed the game preparation stage reaching a first preset number, the virtual objects that completed the preparation stage are controlled to enter a first game stage; during the first game stage, in response to the interactive behavior of a virtual object in the virtual scene, the virtual object is controlled to move in the virtual scene and pick up supplies. The condition that the number of players completing the preparation stage reaches the first preset number controls the start of the current game round: a player sends a start request for the round to the server through the terminal device, the server checks whether the received request satisfies the start condition of the round, and if so, controls the round to start. The start conditions of the current round may include: the number of players participating in the round reaching the first preset number, for example 100, 120, or 140 players; and, if a player's online time exceeds the preset limit, the server forces that player to quit the game.
The interactive behavior refers to a behavior of a virtual object manipulated by a player performing an activity in a virtual scene, and exemplary behaviors of the virtual object performing the activity in the virtual scene may include, but are not limited to, at least one of the following: walking, running, jumping, climbing, lying down, attacking, skill releasing, material picking and message sending. Here, the virtual objects active in the virtual scene may include other player-manipulated virtual objects or other non-player-manipulated virtual objects in addition to the virtual object manipulated by the player.
Specifically, when a virtual vehicle (such as a helicopter) releases airdropped supplies into the virtual scene along a fixed route and at fixed time intervals, the airdropped supplies generate a special identifier in the virtual scene, and a position identifier corresponding to the position of the airdropped supplies in the virtual scene is generated on the scene thumbnail map, so that a player can find and loot the airdropped supplies according to the position identifier and the special identifier in the virtual scene.
However, research shows that, in addition to the position identifier corresponding to the airdropped supplies, object identifiers corresponding to a plurality of virtual objects are also displayed in the scene thumbnail map. With a large number of identifiers crowded into the relatively small thumbnail map, the player's view of the relative position between the identifier of the player-controlled virtual object and the position identifier of the airdropped supplies is easily blocked, and the display range of the special identifier of the airdropped supplies in the virtual scene is limited. Therefore, the player has to check the position identifier displayed in the scene thumbnail map and then control the virtual object to move to the position of the airdropped supplies, which is inconvenient to operate; and because the view is easily blocked, the player may fail to accurately steer the virtual object to the position of the airdropped supplies, which ultimately reduces the efficiency of searching for the airdropped supplies and thus the human-computer interaction efficiency.
Based on this, the embodiments of the present application provide a game control method and device, an electronic device, and a storage medium, with which a player can quickly find virtual goods and materials in a virtual scene according to a displayed direction guide identifier, increasing the speed at which the player searches for the virtual goods and materials in the virtual scene and thereby improving human-computer interaction efficiency.
Referring to fig. 1, fig. 1 is a flowchart of a game control method according to an embodiment of the present disclosure. As shown in fig. 1, a game control method provided in an embodiment of the present application includes:
S101, in response to the virtual battle mission meeting a preset condition, controlling to release virtual goods and materials in a virtual scene;
S102, determining a target scene area corresponding to the released virtual goods and materials in the virtual scene;
S103, in response to the virtual object moving into the target scene area, displaying a direction guide identifier on the graphical user interface, and controlling the indication direction of the direction guide identifier according to the target position of the virtual goods and materials in the virtual scene.
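The three steps above may be sketched as a single control tick (a minimal illustration only; the circular target area, the data structures, and all names are assumptions rather than the claimed implementation):

```python
import math
from dataclasses import dataclass

@dataclass
class Indicator:
    visible: bool = False
    angle_deg: float = 0.0   # indication direction toward the goods and materials

def bearing_deg(src, dst):
    """Angle from src to dst, in degrees, measured from the +x axis."""
    return math.degrees(math.atan2(dst[1] - src[1], dst[0] - src[0]))

def control_tick(condition_met, supplies_pos, area_center, area_radius,
                 player_pos, indicator):
    # S101/S102: when the preset condition holds, the goods and materials are
    # released and a target scene area (here: a circle) is determined for them.
    if not condition_met:
        return indicator
    dx, dy = player_pos[0] - area_center[0], player_pos[1] - area_center[1]
    # S103: show the indicator only after the virtual object enters the target
    # area, and keep its direction pointed at the target position.
    if math.hypot(dx, dy) <= area_radius:
        indicator.visible = True
        indicator.angle_deg = bearing_deg(player_pos, supplies_pos)
    return indicator
```

A game loop would call `control_tick` each frame with the current player position, updating the indicator as the player moves.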
According to the game control method, after virtual goods and materials are released into the virtual scene, the target scene area corresponding to the released goods and materials is determined; after the virtual object moves into the target scene area, a direction guide identifier is displayed on the graphical user interface to guide the virtual object from its current position to the virtual goods and materials. Therefore, after controlling the virtual object into the target scene area, the player can directly and accurately steer it toward the virtual goods and materials according to the direction guide identifier, without checking a position identifier displayed in the scene thumbnail map and navigating by it. The operation is convenient, the player's view is not obstructed by too many identifiers on the graphical user interface, and the efficiency of searching for airdropped goods and materials is improved. After entering the target scene area the player can find the virtual goods and materials according to the direction guide identifier, which increases the search speed and improves human-computer interaction efficiency.
To facilitate understanding of the present embodiment, the game control method provided in the embodiments of the present application is described below as applied to a terminal device, and the foregoing exemplary steps are described in turn.
In step S101, in response to that the virtual battle mission meets a preset condition, virtual materials are controlled to be released in the virtual scene.
In this step, the preset condition includes one of the following items:
(1) The game progress of the virtual battle mission reaches a preset time point.
(2) And executing a virtual carrier calling event by the virtual object in the virtual battle mission, wherein the virtual carrier is used for throwing virtual goods and materials to the virtual scene.
(3) The survival number of the virtual objects which survive in the virtual battle mission reaches the set number.
The following is a specific description of the preset conditions including the above three cases:
(1) The game progress of the virtual battle mission reaches a preset time point.
In one embodiment, the preset time point may be a fixed time point preset by the game system, and the fixed time point may be a time point uniformly set in the game system, for example, when the game progress of the virtual match-up task reaches 20 minutes, the virtual goods and materials are controlled to be released in the virtual scene.
Therefore, since the player knows in advance the fixed time point at which the virtual goods and materials are released into the virtual scene, the player can prepare to search for them ahead of time. This avoids the situation where a player misses the chance to find and loot the virtual goods and materials because the release time is unknown, accelerates the game process, and indirectly improves human-computer interaction efficiency.
In another embodiment, the preset time point may be a random time point preset by the game system, and the random time point may be a time point randomly set in the game mechanism, for example, when the game progress of the virtual match-up task reaches 10 minutes, 20 minutes, or 30 minutes, the virtual material is controlled to be released in the virtual scene.
Here, because the release time point in the virtual scene is random and the program code for generating a random time point is relatively simple, a balance can be struck between computer resources and the player's game experience: the game experience is improved and game balance is achieved without consuming a large amount of computer resources.
(2) And executing a virtual carrier calling event by the virtual object in the virtual battle mission, wherein the virtual carrier is used for throwing virtual goods and materials to the virtual scene.
The virtual vehicle calling event refers to calling a virtual vehicle (such as a helicopter) through a calling prop (such as a signal bomb) of virtual object equipment, and the virtual vehicle puts virtual goods and materials into a virtual scene in an air drop mode.
Here, the time at which the virtual goods and materials are released into the virtual scene can be controlled by the player; that is, the player decides when the virtual goods and materials appear in the virtual scene. The virtual object triggers the release by executing the virtual vehicle summoning event. Therefore, a virtual object that finds the summoning prop can summon the virtual goods and materials and search for them first. This not only motivates players to wait for airdropped goods and materials, but also to search the virtual scene for summoning props, diversifying how virtual goods and materials are released and making them more attractive. More players are thereby encouraged to compete for the virtual goods and materials, battles are bound to break out between different camps to eliminate some players, the number of remaining players drops sharply, the duration of a single match is shortened, and human-computer interaction efficiency is indirectly improved.
(3) The survival number of the virtual objects which survive in the virtual battle mission reaches the set number.
In this step, the set number serves as the trigger condition by which the game system decides to release virtual goods and materials into the virtual scene: during the virtual battle mission, if the number of surviving virtual objects reaches the set number, the virtual goods and materials are released into the virtual scene.
The set number can be determined by a game developer according to a value set by game design, or can be determined by intelligently analyzing historical data in a virtual battle task through an artificial intelligence technology.
Therefore, releasing the virtual goods and materials into the virtual scene when the number of surviving virtual objects reaches the set number gathers scattered players and increases the possibility of confrontation between different players, accelerating the elimination of virtual objects in the match, indirectly shortening the duration of a single match, and improving human-computer interaction efficiency.
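The three preset conditions discussed above may be combined into one trigger check, sketched below (the concrete threshold values and names are assumptions for illustration only):

```python
def should_release_supplies(elapsed_minutes, summon_event_fired, alive_count,
                            preset_minutes=20, alive_threshold=50):
    """True when any of the three preset conditions in the text holds:
    (1) the game progress reaches a preset time point;
    (2) a virtual vehicle summoning event was executed;
    (3) the number of surviving virtual objects reaches the set number.
    """
    return (elapsed_minutes >= preset_minutes      # condition (1)
            or summon_event_fired                  # condition (2)
            or alive_count <= alive_threshold)     # condition (3)
```

Whether condition (3) means the survivor count dropping to the set number (as assumed here) or some other comparison would depend on the game design.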
In step S102, a target scene region corresponding to the virtual material to be delivered in the virtual scene is determined. Here, the target scene area has at least two functions:
firstly, the target scene area can be used for releasing virtual goods and materials. In particular, the target scene area has a certain extent in the virtual scene: the virtual object needs to move some distance within it, or move for some time at its current speed. In one embodiment, when the system releases virtual goods and materials, an airdrop area is determined in advance as the target scene area, and the goods and materials are released at a random position within it. In another embodiment, the system first determines the exact release position of the virtual goods and materials, then determines the corresponding airdrop area according to a rule preset by the system; the system releases the goods and materials at the exact position, and the in-game scene thumbnail map generates a material area identifier corresponding to the airdrop area. In yet another embodiment, the system simultaneously determines both the target scene area and the exact release position within it, and the two appear together and complement each other.
In the embodiment of the application, the target scene area is used for launching the virtual goods, so that the player can control the virtual object to search the virtual goods in the target scene area after moving to the target scene area. The searching area of the virtual goods and materials is determined to be the target scene area, so that the player can know the searching area of the virtual goods and materials in advance, the moving direction of the virtual object can be adjusted accurately and timely, the virtual object can be guaranteed to be searched for the virtual goods and materials in the target scene area all the time, the searching efficiency of the player on the virtual goods and materials is effectively improved, and the game progress can be accelerated.
Secondly, the target scene area can be used as a trigger area for triggering the display of the azimuth guide mark on the graphical user interface. At this time, the target scene area may be independent of the precise location in the virtual scene for delivering the virtual goods. That is, when virtual materials are released in a virtual scene, a target scene area is determined synchronously, and the target scene area is not used for releasing the virtual materials. However, after the virtual goods and materials are put in the virtual scene, if the player moves to the target scene area, the direction guide identifier for guiding the virtual object to move to the virtual goods and materials is displayed on the graphical user interface.
For example, in the related game, the target scene area can be used as one of the links of the reward mechanism in the game, and the reward mechanism is designed according to the following idea: the player can provide clues for moving to the target scene area or move to the target scene area by luck after finishing the specified tasks, and when the player moves to the target scene area, the player can be triggered to display the direction guide identifier on the graphical user interface, so that the player can quickly search virtual goods in the virtual scene according to the direction guide identifier. By setting such a reward mechanism, the player can be promoted to actively complete a specified task or actively move in a virtual scene, which is helpful for accelerating the game process.
In the embodiment of the present application, since reaching the target scene area triggers the display of the direction guide identifier, a reward mechanism is set to encourage players to search for the target scene area in the virtual scene and obtain the corresponding virtual reward (the reward being the direction guide identifier displayed on the graphical user interface), so that virtual objects complete the search for the virtual goods and materials as soon as possible, enriching the game experience. Meanwhile, after moving into the target scene area, the player can quickly locate the virtual goods and materials according to the displayed direction guide identifier, which accelerates the game process and indirectly improves human-computer interaction efficiency.
In step S103, in response to the virtual object moving into the target scene area, the direction indicator is displayed on the graphical user interface, and the indication direction of the direction indicator is controlled according to the target position of the virtual material in the virtual scene.
In the above step, the direction guide identifier has a guide direction, and the guide direction is a direction in which the current position of the virtual object faces the target position of the virtual goods in the virtual scene, and further, the player can control the virtual object to move from the current position of the virtual object to the target position of the virtual goods in the virtual scene according to the direction guide identifier.
Specifically, the direction guide identifier may be two-dimensional or three-dimensional and may be displayed as, for example, a planar arrow or a solid arrow whose indication direction points to the target position of the virtual goods and materials in the virtual scene. In one embodiment, the identifier may be placed at the boundary of the graphical user interface to avoid occluding the portion of the virtual scene displayed there; in another embodiment, it may be placed above the virtual object, so that the controlling player can observe changes of the identifier in time and adjust the travel route accordingly.
For example, referring to fig. 2, fig. 2 is a schematic diagram of a graphical user interface provided in an embodiment of the present application, as shown in fig. 2, after a virtual object 201 enters a target scene area 202 of a virtual scene, an orientation indicator 203 is displayed on the graphical user interface, an indication direction of the orientation indicator 203 points to a target position of a virtual material in the virtual scene, as shown in fig. 2, an indication direction of the orientation indicator 203 is left of a current position of the virtual object 201, and a player can control the virtual object 201 to move to the target position of the virtual material according to the indication direction of the orientation indicator 203.
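For the boundary-placement layout described above, the on-screen position of the identifier can be computed by intersecting a ray from the screen centre with the inset screen rectangle; the sketch below is one possible scheme, with all names and the margin value assumed:

```python
import math

def indicator_screen_pos(screen_w, screen_h, angle_deg, margin=20):
    """Place the direction guide identifier on the GUI boundary, offset
    inward by `margin` pixels, along the direction `angle_deg` from the
    screen centre (degrees from the +x axis)."""
    cx, cy = screen_w / 2, screen_h / 2
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    half_w, half_h = cx - margin, cy - margin
    # Scale the unit direction so the point lands on the inset rectangle edge.
    t = min(half_w / abs(dx) if dx else float("inf"),
            half_h / abs(dy) if dy else float("inf"))
    return cx + dx * t, cy + dy * t
```

The arrow sprite would then be drawn at the returned point and rotated by the same angle so it visually points toward the target position.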
In addition, after the virtual object moves into the target scene area, a text prompt may also be displayed on the graphical user interface to indicate the distance between the virtual object and the target position of the virtual goods and materials in the virtual scene, for example as a pop-up window. Voice prompts may also be output: for example, a 'tick' sound is played as the distance to the target position keeps decreasing, or a voice broadcast announces 'your distance to the virtual goods and materials has reached XX', so that the player can find the virtual goods and materials as soon as possible under these prompts, saving the time spent searching the virtual scene.
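The distance prompt just described may be sketched as follows (an assumed scheme: the text threshold and the tick-rate formula are illustrative choices, not part of the embodiment):

```python
import math

def proximity_prompt(player_pos, target_pos, text_threshold=100.0):
    """Build the prompt described above: a textual distance message plus a
    'tick' cue whose rate rises as the distance shrinks."""
    d = math.hypot(target_pos[0] - player_pos[0],
                   target_pos[1] - player_pos[1])
    message = f"Distance to the supplies: {d:.0f} m"
    # No ticking until within the threshold; then tick faster as d shrinks.
    ticks_per_second = 0.0 if d >= text_threshold else (text_threshold - d) / 10.0
    return message, ticks_per_second
```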
In the embodiment of the application, the direction guide identifier is displayed on the graphical user interface, so that accurate direction guide can be provided for the virtual object to search virtual materials in the virtual scene, the virtual object can accurately and quickly move to the target position of the virtual materials in the virtual scene according to the indication direction provided by the direction guide identifier, the virtual object can be helped to quickly find the virtual materials in the virtual scene, and the time for the virtual object to search the virtual materials in the virtual scene is saved. In addition, a player does not need to check position identification corresponding to virtual goods and materials displayed in a scene thumbnail map and control the virtual object to move to a target position of the virtual goods and materials in a virtual scene, operation is convenient, sight lines of the player are not affected by too many identification displayed on a graphical user interface, efficiency of the player in searching the virtual goods and materials is improved, the player can find the virtual goods and materials as soon as possible according to the direction guide identification after entering a target scene area, speed of the player in searching the virtual scene for the virtual goods and materials is improved, and man-machine interaction efficiency is improved.
In one embodiment, the following specific description is made in the present application by taking an example of using a target scene area for launching virtual goods and materials:
when the target position of the virtual material in the virtual scene is located in the target scene area, in step S103, in response to the virtual object moving into the target scene area, the step of displaying the direction indicator on the graphical user interface includes: responding to the movement of the virtual object to the target scene area, and detecting the distance between the virtual object and the virtual goods and materials in the target scene area; and responding that the distance between the virtual object and the virtual goods and materials in the target scene area is smaller than a preset distance threshold value, and displaying the azimuth guide identification on the graphical user interface.
In this step, since the virtual materials are in the target scene area, after the virtual object moves to the target scene area, the direction guide identifier may be displayed on the graphical user interface in response to that the distance between the virtual object and the virtual materials is smaller than the preset distance threshold, and the player may control the virtual object to move toward the target position according to the direction guide identifier.
The target scene area may be set to be in a regular shape, such as a circle, a square, or the like, and if the target scene area is a circle, the preset distance threshold is smaller than the diameter of the circle area corresponding to the target scene area; if the target scene area is square, the preset distance threshold is smaller than the side length of the square area corresponding to the target scene area; the target scene area may also be set to an irregular shape, where the preset distance threshold may be set according to experience of the game designer.
Therefore, after the virtual object moves to the target scene area, if the direction guide identification is displayed on the graphical user interface, the virtual object can be represented to approach to the target position of the virtual goods, so that the player is reminded to adjust the moving direction of the virtual object controlled by the player in the target scene area in time, and the virtual object can move to the target position of the virtual goods as soon as possible, thereby saving the time for the player to search the virtual goods and improving the man-machine interaction efficiency.
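The distance-threshold trigger above may be sketched as follows (illustrative only: the half-diameter threshold is one assumed way to satisfy the "smaller than the diameter" constraint for a circular area):

```python
import math

def show_indicator_in_area(player_pos, supplies_pos, in_target_area,
                           area_diameter):
    """Per the scheme above: once inside the target scene area, display the
    direction guide identifier only when the distance between the virtual
    object and the virtual goods and materials is under a preset threshold,
    which must be smaller than the circular area's diameter."""
    threshold = area_diameter / 2.0
    d = math.hypot(supplies_pos[0] - player_pos[0],
                   supplies_pos[1] - player_pos[1])
    return in_target_area and d < threshold
```

For a square area the threshold would instead be bounded by the side length, and for an irregular area it would be a designer-tuned constant, as the text notes.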
In the related scheme, because players can predict in advance where airdropped goods and materials will land in the virtual scene, they control their virtual objects to move to the landing position ahead of time and then seize the goods. As a result, some veteran players camp at the landing position, while other players, such as new players, dare not go to the landing position to pick up the airdropped goods and materials. This prolongs the duration of a single match and reduces human-computer interaction efficiency.
To solve the above technical problem, an embodiment of the present application provides another game control method, please refer to fig. 3, and fig. 3 is a flowchart of another game control method provided in the embodiment of the present application. As shown in fig. 3, a game control method provided in an embodiment of the present application includes:
S301, in response to the virtual battle mission meeting a preset condition, controlling to release virtual goods and materials in a virtual scene;
S302, determining a target scene area corresponding to the released virtual goods and materials in the virtual scene;
S303, displaying a material area identifier corresponding to the virtual goods and materials in a map area corresponding to the target scene area in a scene thumbnail map displayed in the graphical user interface;
S304, in response to the virtual object moving into the target scene area, displaying a direction guide identifier on the graphical user interface, and controlling the indication direction of the identifier according to the target position of the virtual goods and materials in the virtual scene; wherein the target position is located within the target scene area.
For the explanation of step S301, step S302 and step S304, reference may be made to the explanation of step S101 to step S103, and repeated descriptions are omitted.
In step S303, a material area identifier corresponding to a virtual material is displayed in a map area corresponding to a target scene area in a scene thumbnail map displayed in the graphical user interface. Here, the target position is located within the target scene area.
In one embodiment, a map control is displayed on the graphical user interface, and the scene thumbnail map is displayed in response to a touch operation on the map control. The scene thumbnail map is a thumbnail of the entire virtual scene and includes a map area that corresponds one-to-one with the target scene area in the virtual scene. Since the target position of the virtual goods and materials lies within the target scene area, a material area identifier corresponding to the goods and materials is displayed on the map area, and the player can control the virtual object to move toward the target position according to the identifier. Here, the virtual scene, the target scene area, and the target position of the virtual goods and materials are displayed in the scene thumbnail map corresponding to the map control at a preset scale.
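The preset-scale mapping from the virtual scene to the thumbnail map may be sketched as follows (a minimal illustration assuming square world and map; the names are not from the embodiment):

```python
def world_to_minimap(world_pos, world_size, map_size):
    """Map a virtual-scene position to the scene thumbnail map using a
    uniform preset scale factor."""
    scale = map_size / world_size            # preset scale
    return world_pos[0] * scale, world_pos[1] * scale

def area_marker_rect(area_center, area_radius, world_size, map_size):
    """Bounding square of the material area identifier on the thumbnail,
    covering the whole target scene area rather than a single point."""
    cx, cy = world_to_minimap(area_center, world_size, map_size)
    r = area_radius * map_size / world_size
    return (cx - r, cy - r, cx + r, cy + r)
```

Because the identifier is drawn as a region rather than a coordinate point, its on-map footprint scales with the target scene area, matching the preset-proportion behaviour described below.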
In an optional embodiment, step S303 specifically includes: and overlapping and displaying a material area identifier with a preset area corresponding to the virtual material at a map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In the above embodiment, the material area identifier has a preset area; that is, the ratio of the area occupied by the identifier in the scene thumbnail map to the total area of the thumbnail map equals a preset proportion threshold. Here, since the target scene area has a certain extent in the virtual scene, the scale at which it is displayed is the same as the scale of the scene thumbnail map, and the size of the target scene area is positively correlated with the distance the virtual object can cover by moving continuously for a preset time. The material area identifier therefore occupies a definite area in the scene thumbnail map, from which it follows that the ratio of its occupied area to the total area of the thumbnail map equals the preset proportion threshold, where the threshold may be determined from the ratio of the actual area of the target scene area to the area of the entire virtual scene.
'Displaying, in a superimposed manner, a material area identifier having a preset area corresponding to the virtual goods and materials' can mean that the identifier floats over the map area. Here, the identifier may be floated over the map area in a semi-transparent form, so that the floating identifier does not occlude the map area and obstruct the player's view of the scene thumbnail map.
In the foregoing manner, since the material area identifier represents the orientation information of the target scene area in the virtual scene, the identifier in the embodiment of the present application is no longer a single coordinate point as in the prior art; instead, the player can locate the target scene area according to the identifier displayed in the scene thumbnail map, which provides a route reference for moving the player-controlled virtual object to the target scene area and helps the player reach it quickly.
In this way, by displaying the material area identifier in the map area corresponding to the target scene area in the scene thumbnail map, the embodiment of the present application lets the player steer the virtual object toward the target scene area according to the identifier. Since the exact position of the virtual goods and materials need not be displayed, its specific computation can be avoided, reducing the consumption of computer resources. Because the player cannot predict the actual position of the virtual goods and materials in the virtual scene, that position retains a degree of randomness, and individual differences between players are accounted for, improving game fairness. Veteran players can no longer camp near the virtual goods and materials in advance, so more new players are encouraged to compete for them, accelerating the development of new players; and because more players gather in the target scene area where the goods and materials are located, different camps are bound to fight and eliminate some players, shortening the duration of a single match.
In one embodiment, in step S301, the step of controlling to release the virtual goods and materials in the virtual scene further includes: generating virtual materials at a preset height of a virtual scene; and controlling the virtual material to move from the preset height to the target position of the virtual scene.
In the above steps, a period of time is provided for putting virtual goods and materials in the virtual scene, and in this period of time, the player can control the virtual object to move to the target scene area according to the goods and materials area identification displayed in the scene thumbnail map, so as to provide more deployment time for the player to search virtual goods and materials in the target scene area, and further improve the game experience of the player.
Further, step S303 specifically includes: in the process of controlling the virtual goods and materials to move from the preset height to the target position of the virtual scene, displaying goods and materials area identification corresponding to the virtual goods and materials in a map area corresponding to the target scene area in a scene thumbnail map displayed in a graphical user interface.
On the basis of the above scheme, the embodiment of the present application further includes: in response to the virtual goods and materials moving from the preset height to the target position of the virtual scene, controlling the material area identifier to be updated to a material place identifier displayed at the position of the scene thumbnail map corresponding to the target position.
In the above scheme, while the virtual goods and materials are being moved from the preset height to the target position of the virtual scene, the material area identifier corresponding to the goods and materials is displayed in the map area of the scene thumbnail map; when the goods and materials reach the target position, the material area identifier is controlled to be updated to the material place identifier. The material place identifier represents the specific target position of the virtual goods and materials in the virtual scene.
In the embodiment of the application, while the virtual materials are being delivered into the virtual scene, the material area identifier corresponding to the virtual materials is displayed in the map area of the scene thumbnail map; when the virtual materials move from the preset height to the target position in the virtual scene, in addition to the material place identifier corresponding to the virtual materials being displayed in the map area of the scene thumbnail map, an azimuth guide identifier is also displayed on the graphical user interface.
In this way, while the virtual materials are being delivered, the player controls the virtual object to move to the target scene area according to the position of the material area identifier on the scene thumbnail map. Because the material area identifier only represents the position range of the virtual materials in the virtual scene, the player cannot predict their actual position; experienced players therefore cannot camp near the virtual materials in advance, which encourages more new players to compete for the virtual materials and accelerates the development of new players. Moreover, because more players search for the virtual materials within a similar period, different camps inevitably fight and some players are eliminated, which shortens the duration of a single game. When the virtual materials move from the preset height to the target position in the virtual scene, the player can control the virtual object to move to the target position and search for the virtual materials according to the azimuth guide identifier and the material place identifier, which improves the player's search efficiency and accuracy, and further improves human-computer interaction efficiency.
In one embodiment, one target scene area for delivering virtual materials is generated in the virtual scene, and a material area identifier corresponding to the virtual materials is then displayed, in the scene thumbnail map displayed in the graphical user interface, in the map area corresponding to that target scene area. In another embodiment, a plurality of target scene areas for delivering virtual materials are generated in the virtual scene, and a plurality of material area identifiers corresponding to the virtual materials are then displayed in the map areas corresponding to those target scene areas. When a plurality of material area identifiers are displayed on the scene thumbnail map, the shape and/or color of each material area identifier may be set to be different in order to distinguish the identifiers corresponding to different target scene areas. Specifically, a material area identifier in the embodiment of the present application occupies a certain area, so different material area identifiers can be distinguished by filling them with different colors, different patterns, or different stripes.
In a specific embodiment, the embodiment of the present application further includes: distinguishing and displaying material area identifications corresponding to virtual materials displayed in map areas corresponding to different target scene areas in the scene thumbnail map; and different target scene areas are used for putting virtual materials of different grades.
Specifically, the display form of the differential display includes at least one of the following items: display brightness, display shape, fill color.
In the above manner, multiple display forms may be used to differentiate the material area identifiers displayed, in the scene thumbnail map, in the map areas corresponding to the target scene areas where virtual materials of different levels are located. This makes different material area identifiers distinguishable from one another and helps the player rapidly tell apart the identifiers of different levels displayed on the scene thumbnail map.
The cases in which the display form is display brightness, display shape, or fill color, and the case in which the display form combines at least two of them, are explained below:
firstly, when the display form is display brightness, the material area identifications in the map areas corresponding to the target scene areas where the virtual materials of different grades are located are displayed in a distinguishing mode through different display brightness.
Here, the display brightness refers to the luminous intensity of the material area identifier per unit projected area in a given direction, the given direction being perpendicular to the graphical user interface. The display brightness may vary from very dark (minimum brightness) to very bright (maximum brightness).
According to the above-described characteristics of the display luminance, the display luminance may be divided into a plurality of luminance levels, and the luminance levels of the display luminance correspond one-to-one to the levels of the virtual materials in the target scene area. Wherein, the grade of the virtual material and the brightness grade of the display brightness are in positive correlation.
For example, assume that the virtual materials delivered to the target scene areas include a plurality of levels, such as a low material level, a common material level, a rare material level, and a super material level. Display brightness of a different brightness level is set for each level of virtual materials, with the material area identifier associated with the super material level having the maximum brightness and the one associated with the low material level having the minimum brightness.
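The one-to-one, positively correlated mapping between material levels and brightness levels can be sketched as a simple linear interpolation. The level names and the brightness range are assumptions for illustration:

```python
# Illustrative sketch: map material levels to display brightness so that
# higher-level materials get brighter minimap markers (positive correlation).
# Level names and the brightness range (0.2 .. 1.0) are assumptions.

LEVELS = ["low", "common", "rare", "super"]  # ordered lowest to highest

def brightness_for(level, min_b=0.2, max_b=1.0):
    """Linearly interpolate brightness over the ordered level list."""
    i = LEVELS.index(level)
    if len(LEVELS) == 1:
        return max_b
    return min_b + (max_b - min_b) * i / (len(LEVELS) - 1)
```

With this mapping the lowest level gets the minimum brightness and the highest level the maximum, matching the positive correlation described above.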
Displaying the material area identifiers with different display brightness allows the player to distinguish the different levels of the virtual materials in different target scene areas, make reasonable tactical deployments according to the virtual object's current position in the virtual scene, and search for a target scene area suitable for looting.
And secondly, when the display form is a display shape, distinguishing and displaying the material area identifications in the map areas corresponding to the target scene areas where the virtual materials of different grades are located through different display shapes.
Here, the display shape refers to the form in which the material area identifier is presented in the map area of the scene thumbnail map; for example, the display shape may include a circle, a heart, a square, a pentagon, an irregular figure, or the like. Different display shapes can be used to differentiate the material area identifiers of the target scene areas to which virtual materials of different levels belong.
For example, assume again that the virtual materials delivered to the target scene areas include a plurality of levels, such as a low material level, a common material level, a rare material level, and a super material level. A circular icon may be used for the material area identifier corresponding to the target scene area of low-level virtual materials, and an irregular graphic icon for the one corresponding to the target scene area of super-level virtual materials.
Displaying the material area identifiers in the map areas corresponding to the target scene areas with virtual materials of different levels through different display shapes makes the different levels significantly distinguishable, because different display shapes differ greatly from one another; this allows the player to learn the level of the virtual materials in a target scene area in time.
And thirdly, when the display form is filling color, distinguishing and displaying the material area identifications in the map areas corresponding to the target scene areas where the virtual materials of different grades are located through different filling colors.
Here, the fill color includes red, yellow, green, blue, purple, and the like. Because the material area identification has an area, in order to distinguish the material area identification corresponding to the virtual materials of different grades, different colors can be filled in the material area identification in the map area corresponding to the target scene area where the virtual materials of different grades are located.
For example, the material area identifiers associated with the super material level are filled with red, and those associated with the low material level are filled with yellow.
Displaying the material area identifiers corresponding to the target scene areas with virtual materials of different levels through different fill colors likewise makes the different levels significantly distinguishable, because different fill colors differ greatly from one another; this allows the player to learn the levels of the virtual materials in different target scene areas in time.
In addition, any two of the display brightness, display shape, and fill color may be combined, or all three may be combined. For example, the material area identifiers corresponding to target scene areas with virtual materials of different levels may be differentiated by display brightness and fill color together; or by display brightness, display shape, and fill color together.
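The combination of display forms described above can be sketched as a per-level style table. The concrete shapes, colors, and brightness values are assumptions; the red/yellow choices follow the fill-color example given earlier:

```python
# Illustrative sketch: combine several display forms (shape + fill color +
# brightness) into one style per material level. All concrete values are
# assumptions for illustration.

LEVEL_STYLES = {
    "low":    {"shape": "circle",    "fill": "yellow", "brightness": 0.3},
    "common": {"shape": "square",    "fill": "green",  "brightness": 0.5},
    "rare":   {"shape": "pentagon",  "fill": "blue",   "brightness": 0.8},
    "super":  {"shape": "irregular", "fill": "red",    "brightness": 1.0},
}

def style_for(level):
    """Look up the combined display style for a material level."""
    return LEVEL_STYLES[level]
```

A renderer would apply all three properties at once when drawing the marker, so each level remains distinguishable even if one property is hard to perceive.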
It should be noted that, the specific contents related to each display form described in the above description are also applicable to the embodiment in which at least two display forms are combined, and are not described again here.
Therefore, by freely combining multiple display forms, different material area identifiers are displayed distinctly, so the player can easily perceive the differences between them, learn in time the levels of the virtual materials in the target scene areas corresponding to the different identifiers, make reasonable tactical deployments, and enjoy a better game experience. Moreover, offering multiple different display forms satisfies the demands and habits of different players and improves the player's visual experience.
In a related scheme, in order to enable a player to better observe a position relationship between the player and a target scene area, an embodiment of the present application further includes:
displaying virtual object identifications corresponding to the virtual objects in a scene thumbnail map displayed in a graphical user interface; wherein the position of the virtual object identifier in the scene thumbnail map is updated in real time according to the movement of the virtual object in the virtual scene.
In this step, the player may control the virtual object to move to the target scene area according to the relative position between the virtual object identifier and the material area identifier displayed in the scene thumbnail map, and thereby find the target scene area. The position at which the virtual object identifier is generated in the scene thumbnail map corresponds to the orientation information of the virtual object in the virtual scene.
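Mapping the virtual object's position in the scene to a marker position on the thumbnail map, updated as the object moves, can be sketched as a simple coordinate scaling. The scene and minimap dimensions are assumptions:

```python
# Illustrative sketch: scale a scene position into minimap pixel coordinates.
# Scene extent and minimap size are assumptions for illustration.

SCENE_W, SCENE_H = 1000.0, 1000.0   # virtual-scene extent in scene units
MINIMAP_W, MINIMAP_H = 200, 200     # thumbnail-map size in pixels

def minimap_position(scene_x, scene_y):
    """Map a scene position to the corresponding minimap pixel."""
    px = int(scene_x / SCENE_W * MINIMAP_W)
    py = int(scene_y / SCENE_H * MINIMAP_H)
    return px, py
```

Recomputing this each frame as the virtual object moves is what keeps the virtual object identifier's position on the scene thumbnail map updated in real time.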
The related games include a plurality of camps, each camp includes a plurality of virtual objects, and all virtual objects in the same camp are teammates. Specifically, on the scene thumbnail map, a virtual object can see the orientation information of its teammates in the virtual scene, so a plurality of virtual object identifiers can be displayed on the scene thumbnail map at the same time, where the virtual objects corresponding to the displayed virtual object identifiers belong to the same camp. When a virtual object is eliminated, the corresponding virtual object identifier disappears from the scene thumbnail map.
Illustratively, the virtual object identifier may include any one of: a letter identifier, a text identifier, a number identifier, a graphic identifier, or a special-symbol identifier. To help the player distinguish virtual object identifiers from material area identifiers, the two may be set as different kinds of identifiers; for example, different display shapes may distinguish different material area identifiers while different colors distinguish different virtual object identifiers, so that both kinds of identifiers are presented to the player more intuitively.
Therefore, while the material area identifier is displayed in the scene thumbnail map, the virtual object identifier is also displayed on the graphical user interface, so the player can control the virtual object to move to the target scene area according to the relative position between the two identifiers. This improves the speed and accuracy with which the virtual object moves to the target scene area, enables the player to reach it quickly without missing the opportunity to enter the target scene area and search for the virtual materials, and indirectly improves human-computer interaction efficiency.
For example, please refer to fig. 4, which is a schematic diagram of another graphical user interface provided in an embodiment of the present application. As shown in fig. 4, material area identifiers 205 and virtual object identifiers 206 are displayed in a scene thumbnail map 204, and the material area identifiers corresponding to target scene areas with virtual materials of different levels are displayed differently. The material area identifiers 205 include a first material area identifier 2051 (a circle filled completely with black), a second material area identifier 2052 (a circle filled partially with black), and a third material area identifier 2053 (a circle filled completely with a grid pattern). The completely filled first identifier 2051 indicates that the first virtual material has already moved to its target position in the virtual scene, while the partially filled second identifier 2052 indicates that the second virtual material is still moving from the preset height to its target position; the material levels of the first and second virtual materials are the same. The grid-filled third identifier 2053 indicates that the third virtual material has moved to its target position, and its material level differs from that of the first virtual material. The virtual object 201 is located in the virtual scene, and the player can control the virtual object 201 to move to the target scene area according to the relative position relationship between the material area identifiers 205 and the virtual object identifier 206.
In order to enable a player to reasonably arrange the time for moving a virtual object controlled by the player to a target scene area to search virtual materials, the embodiment of the application adopts the following scheme:
displaying a timing control on a graphical user interface; the timing control is used for performing countdown display according to the moment when the virtual goods and materials are put in the virtual scene and the moment when the virtual goods and materials appear in the target scene area; and adjusting the display form of the material area identifier corresponding to the virtual material displayed in the map area corresponding to the target scene area in the scene thumbnail map according to the countdown displayed in the timing control.
The timing control performs a countdown display between the moment when delivery of the virtual materials into the virtual scene starts and the moment when the virtual materials appear in the target scene area: the countdown starts when the delivery starts, and when the countdown ends the virtual materials are controlled to appear in the target scene area.
Therefore, the timing control can perform countdown display according to the moment when the virtual goods and materials are put in the virtual scene and the moment when the virtual goods and materials appear in the target scene area, so that a player can accurately master the initial time when the virtual goods and materials appear in the target scene area, the time when the player moves to the target scene area is reasonably arranged, the purpose of searching the virtual goods and materials in the target scene area in time is achieved, the information prompting mode in a game is optimized, and human-computer interaction can be promoted.
Specifically, as the countdown progress displayed in the timing control advances, the display form of the material area identifier in the scene thumbnail map changes accordingly.
Here, when the display form is display brightness, the display brightness of the material area identifier in the scene thumbnail map increases as the countdown displayed in the timing control progresses. When the display form is display shape, the display shape of the material area identifier grows (from absent to present) as the countdown progresses, and at the moment the countdown ends, the proportion of the area occupied by the display shape to the total area of the scene thumbnail map equals a preset proportion threshold. When the display form is fill color, the filled area of the material area identifier grows as the countdown progresses, and at the moment the countdown ends, the filled area equals the area occupied by the material area identifier in the scene thumbnail map. Colors, patterns, stripes, and the like can be adopted as fillers of the material area identifier.
Illustratively, according to the countdown progress displayed in the timing control, the material area identifier corresponding to the virtual materials, displayed in the map area corresponding to the target scene area in the scene thumbnail map, is filled with a target color; the countdown progress is in positive correlation with the filled area of the material area identifier, and the target color is determined according to the level of the virtual materials delivered in the target scene area.
Here, since the filling colors corresponding to the virtual materials of different levels are preset, the target color can be determined according to the level of the virtual materials put in the target scene area. In addition, the countdown time progress and the filling area of the material area identification are in positive correlation, so that the importance of distinguishing different material area identifications by highlighting the target color can be further highlighted.
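The positive correlation between countdown progress and filled area can be sketched as a clamped linear fraction that reaches 1.0 exactly when the countdown ends. The drop duration is an assumption:

```python
# Illustrative sketch: grow the marker's filled area in step with the
# countdown, so the fill is complete exactly when the materials appear.
# The drop duration is an assumption for illustration.

DROP_DURATION = 30.0  # seconds between release and arrival at the target

def fill_fraction(elapsed):
    """Fraction of the marker area filled, growing with countdown progress."""
    return min(max(elapsed / DROP_DURATION, 0.0), 1.0)
```

The renderer would fill `fill_fraction(elapsed)` of the identifier's occupied area with the target color each frame; at `elapsed >= DROP_DURATION` the identifier is completely filled.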
Therefore, through the dynamic change of the fill color on the material area identifier, the player can accurately grasp the initial time for searching for the virtual materials in the target scene area, avoid wasting time searching before the virtual materials are present in the target scene area, and improve search efficiency.
In another embodiment, the following specific description is made in the embodiment of the present application by taking the target scene area as a trigger area for triggering the display of the direction indicator on the graphical user interface as an example:
referring to fig. 5, fig. 5 is a flowchart of another game control method according to an embodiment of the present disclosure. As shown in fig. 5, a game control method provided in an embodiment of the present application includes:
s501, responding that the virtual battle mission meets a preset condition, and controlling to release virtual goods and materials in a virtual scene;
s502, determining a target scene area corresponding to the delivered virtual materials in the virtual scene;
s503, displaying material area identifications corresponding to virtual materials in a map area corresponding to a target scene area in a scene thumbnail map displayed in a graphical user interface;
s504, responding to the fact that the virtual object moves to the target scene area, displaying the direction guide identification on the graphical user interface, and controlling the indication direction of the direction guide identification according to the target position of the virtual material in the virtual scene; wherein the target location is not within the target scene area.
For the explanation of steps S501 to S503, reference may be made to the explanation of steps S301 to S303, and repeated descriptions are omitted.
In step S504, no virtual goods exist in the target scene area, and at this time, the target scene area is used as a trigger area for triggering the display of the direction indicator on the graphical user interface.
In the embodiment of the application, the player can search the virtual scene for the target scene area according to the material area identifier displayed on the graphical user interface. After the virtual object moves into the target scene area, an azimuth guide identifier for guiding the virtual object toward the virtual materials is displayed on the graphical user interface; that is, once the player finds the target scene area, direction information (the azimuth guide identifier) for finding the virtual materials is provided, so the player can control the virtual object to quickly search for the virtual materials in the virtual scene accordingly. This setting encourages the player to actively complete designated tasks or actively move through the virtual scene to find the target scene area; by providing the guide identifier only after the target scene area is found, it improves the player's search efficiency for the virtual materials, accelerates the game progress, and improves human-computer interaction efficiency.
In an optional embodiment, the embodiment of the present application further includes: and in response to the virtual goods and materials in the target scene area being picked up, controlling to delete the goods and materials area identification corresponding to the virtual goods and materials from the map area corresponding to the target scene area in the scene thumbnail map.
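Deleting a picked-up material's identifier from the thumbnail map can be sketched as a registry of markers keyed by material id. All names here are assumptions:

```python
# Illustrative sketch: a registry of minimap markers, with removal when the
# corresponding virtual materials are picked up. Names are assumptions.

markers = {}  # material_id -> minimap marker data

def add_marker(material_id, px, py):
    """Register a material area identifier at a minimap pixel position."""
    markers[material_id] = {"pos": (px, py)}

def on_materials_picked_up(material_id):
    """Delete the material area identifier once the materials are picked up."""
    markers.pop(material_id, None)

add_marker("drop_1", 100, 50)
add_marker("drop_2", 40, 160)
on_materials_picked_up("drop_1")
```

After the pickup event, only the markers for materials still present in the scene remain on the thumbnail map.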
Therefore, the player can change their travel route in time according to the disappearance of the material area identifier from the scene thumbnail map, avoiding the situation in which the virtual object reaches the target scene area only to find no materials left to pick up and is easily caught off guard by other players.
According to the game control method provided by the embodiment of the application, the player can control the virtual object to move toward the target scene area according to the relative position between the material area identifier and the virtual object identifier displayed in the scene thumbnail map; after the virtual object moves into the target scene area, the azimuth guide identifier is displayed on the graphical user interface to guide the virtual object from its current position toward the virtual materials. In this way, the accurate position of the virtual materials does not need to be displayed, so its specific calculation can be avoided and the consumption of computer resources reduced. Because the player cannot predict the actual position of the virtual materials in the virtual scene, their position has a certain randomness, which accounts for individual differences between players and improves game fairness. Experienced players cannot camp near the virtual materials in advance, which encourages more new players to compete for the virtual materials and accelerates their development; and because more players search for the virtual materials within a similar period, different camps inevitably fight and some players are eliminated, shortening the duration of a single game. After moving into the target scene area, the player can find the virtual materials as soon as possible according to the azimuth guide identifier, which increases the speed of searching for virtual materials in the virtual scene, helps shorten the duration of a single game, and improves human-computer interaction efficiency.
Based on the same inventive concept, the embodiment of the present application further provides a game control device corresponding to the game control method, and as the principle of solving the problem of the device in the embodiment of the present application is similar to the game control method in the embodiment of the present application, the implementation of the device can refer to the implementation of the method, and repeated details are not repeated.
Referring to fig. 6, 7 and 8, fig. 6 is a schematic structural diagram of a game control device according to an embodiment of the present application, fig. 7 is a schematic structural diagram of another game control device according to an embodiment of the present application, and fig. 8 is a schematic structural diagram of another game control device according to an embodiment of the present application. As shown in fig. 6, a game control apparatus 600 provided in an embodiment of the present application includes:
the material control module 601 is used for responding that the virtual battle mission meets the preset condition and controlling virtual materials to be released in the virtual scene;
a region determining module 602, configured to determine a target scene region corresponding to the delivered virtual goods and materials in the virtual scene;
and the azimuth guidance module 603 is configured to respond that the virtual object moves into the target scene area, display an azimuth guidance identifier on the graphical user interface, and control an indication direction of the azimuth guidance identifier according to a target position of the virtual material in the virtual scene.
In an alternative embodiment of the present application, the target location is located within a target scene area.
In an alternative embodiment of the present application, the position guidance module 603 is specifically configured to:
responding to the movement of the virtual object to the target scene area, and detecting the distance between the virtual object and the virtual goods and materials in the target scene area;
and responding that the distance between the virtual object and the virtual goods and materials in the target scene area is smaller than a preset distance threshold value, and displaying the azimuth guide identification on the graphical user interface.
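The two steps above — detecting the distance between the virtual object and the virtual materials, and displaying the azimuth guide identifier only when that distance falls below a preset threshold — can be sketched as follows. The threshold value and function names are assumptions:

```python
# Illustrative sketch: show the direction-guide marker only when the virtual
# object is within a preset distance of the materials, and point the marker
# at the materials' target position. Threshold and names are assumptions.
import math

DISTANCE_THRESHOLD = 50.0  # preset distance threshold, in scene units

def direction_guide(obj_pos, material_pos):
    """Return the guide angle (radians) if within the threshold, else None."""
    dx = material_pos[0] - obj_pos[0]
    dy = material_pos[1] - obj_pos[1]
    if math.hypot(dx, dy) >= DISTANCE_THRESHOLD:
        return None            # too far: no azimuth guide displayed yet
    return math.atan2(dy, dx)  # indication direction toward the materials
```

Returning `None` models the case where the virtual object has entered the target scene area but is not yet close enough for the azimuth guide identifier to appear.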
Further, as shown in fig. 7, the apparatus 600 further includes an identifier displaying module 604, where the identifier displaying module 604 is configured to:
and displaying material area identification corresponding to the virtual materials in a map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, the identifier displaying module 604 is specifically configured to:
and overlapping and displaying a material area identifier with a preset area corresponding to the virtual material at a map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, the material control module 601 is specifically configured to:
generating virtual materials at a preset height of a virtual scene;
and controlling the virtual material to move from the preset height to the target position of the virtual scene.
In an optional embodiment of the present application, the identifier displaying module 604 is further specifically configured to:
in the process of controlling the virtual goods and materials to move from the preset height to the target position of the virtual scene, displaying goods and materials area identification corresponding to the virtual goods and materials in a map area corresponding to a target scene area in a scene thumbnail map displayed in a graphical user interface.
In an optional embodiment of the present application, the apparatus further includes an identity updating module (not shown in the figure), and the identity updating module is configured to:
and responding to the virtual goods and materials moving from the preset height to the target position of the virtual scene, and controlling to update the goods and materials area identification to the goods and materials place identification displayed at the position, corresponding to the target position, of the scene thumbnail.
Further, as shown in fig. 8, the apparatus 600 further includes a distinguishing display module 605, where the distinguishing display module 605 is configured to:
display, in a distinguishable manner, the supply area identifiers corresponding to the virtual supplies displayed in map areas corresponding to different target scene areas in the scene thumbnail map, where different target scene areas are used for dropping virtual supplies of different grades.
In an optional embodiment of the present application, the apparatus further includes a timing display module (not shown in the figure), and the timing display module is configured to:
display a timing control on the graphical user interface, where the timing control performs a countdown display based on the moment when the virtual supplies are generated at the preset height of the virtual scene and the moment when the virtual supplies reach the target position in the virtual scene from the preset height;
and adjust, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, the timing display module is further specifically configured to:
fill, with a target color and according to the countdown progress displayed in the timing control, the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map, where the countdown progress is positively correlated with the filled area of the supply area identifier, and the target color is determined according to the grade of the virtual supplies dropped in the target scene area.
In an optional embodiment of the present application, the apparatus further includes an object display module (not shown in the figure), and the object display module is configured to:
display a virtual object identifier corresponding to the virtual object in the scene thumbnail map displayed in the graphical user interface,
where the position of the virtual object identifier in the scene thumbnail map is updated in real time according to the movement of the virtual object in the virtual scene.
In an optional embodiment of the present application, the apparatus further includes an identifier deleting module (not shown in the figure), and the identifier deleting module is configured to:
in response to the virtual supplies in the target scene area being picked up, delete the supply area identifier corresponding to the virtual supplies from the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, the preset condition includes one of the following:
the game progress of the virtual battle task reaches a preset time point;
a virtual object in the virtual battle task executes a virtual vehicle calling event, where the virtual vehicle is used for dropping virtual supplies into the virtual scene;
the number of virtual objects surviving in the virtual battle task reaches a set number.
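The three preset conditions listed above can be sketched as a simple check. This is an illustrative sketch only; the class, field, and threshold names are assumptions, not taken from the patent, and the comparison direction for the survivor count is an assumption as well.

```python
from dataclasses import dataclass

@dataclass
class BattleState:
    """Hypothetical snapshot of a virtual battle task."""
    elapsed_time: float        # seconds since the battle task started
    vehicle_call_issued: bool  # a virtual object called a supply vehicle
    surviving_objects: int     # virtual objects still alive

def should_drop_supplies(state: BattleState,
                         preset_time: float = 300.0,
                         survivor_threshold: int = 20) -> bool:
    """Return True when any one of the three preset conditions holds."""
    return (state.elapsed_time >= preset_time          # progress reached preset time point
            or state.vehicle_call_issued               # vehicle calling event executed
            or state.surviving_objects <= survivor_threshold)  # survivors reached set number
```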
With the game control apparatus provided by the embodiments of the present application, a player can control the virtual object to move toward the target scene area according to the relative position between the supply area identifier and the virtual object identifier displayed in the scene thumbnail map; after the virtual object moves into the target scene area, the direction guide identifier is displayed on the graphical user interface to guide the virtual object from its current position to the virtual supplies. In this way, the precise position of the virtual supplies need not be displayed, so the computation needed to mark that precise position can be avoided and the consumption of computer resources is reduced. Because the player cannot predict the actual position of the virtual supplies, their position in the virtual scene has a degree of randomness, which accounts for individual differences between players and improves the fairness of the game. Experienced players cannot camp near the virtual supplies in advance, so more new players are encouraged to compete for the supplies, speeding up their development; and because more players search for the virtual supplies within a similar period, different camps inevitably fight and some players are eliminated, shortening the duration of a single game. After moving into the target scene area, the player can quickly find the virtual supplies according to the direction guide identifier, which increases the speed at which the player searches for virtual supplies in the virtual scene, helps shorten the duration of a single game, and improves human-computer interaction efficiency.
Referring to fig. 9, fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 9, the electronic device 900 includes a processor 901, a memory 902, and a bus 903.
The memory 902 stores machine-readable instructions executable by the processor 901. When the electronic device 900 is operating, the processor 901 and the memory 902 communicate over the bus 903, such that the processor 901 executes the following instructions:
in response to the virtual battle task meeting a preset condition, controlling virtual supplies to be dropped in a virtual scene;
determining a target scene area corresponding to the dropped virtual supplies in the virtual scene;
and in response to a virtual object moving into the target scene area, displaying a direction guide identifier on the graphical user interface, where the pointing direction of the direction guide identifier is controlled according to the target position of the virtual supplies in the virtual scene.
In an optional embodiment of the present application, the target position is located within the target scene area.
In an optional embodiment of the present application, when displaying the direction guide identifier on the graphical user interface in response to the virtual object moving into the target scene area, the processor 901 specifically executes the following instructions:
in response to the virtual object moving into the target scene area, detecting the distance between the virtual object and the virtual supplies in the target scene area;
and in response to the distance between the virtual object and the virtual supplies in the target scene area being smaller than a preset distance threshold, displaying the direction guide identifier on the graphical user interface.
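The two-step check above — guide shown only inside the target scene area and only below a preset distance threshold — can be sketched as follows. This is an illustrative sketch only; the function name, the 2D-coordinate representation, and the default threshold are assumptions, not taken from the patent.

```python
import math

def maybe_show_direction_guide(object_pos, supply_pos, in_target_area,
                               threshold=50.0):
    """Return the screen bearing (degrees, counter-clockwise from +x) the
    direction guide identifier should point at, or None while it stays hidden.
    """
    if not in_target_area:
        return None  # the virtual object has not entered the target scene area
    dx = supply_pos[0] - object_pos[0]
    dy = supply_pos[1] - object_pos[1]
    if math.hypot(dx, dy) >= threshold:
        return None  # still farther than the preset distance threshold
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

Updating the returned bearing each frame as the virtual object moves would realize "controlling the indication direction according to the target position".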
In an optional embodiment of the present application, the processor 901 further executes the following instructions:
displaying a supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, when displaying, in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, the supply area identifier corresponding to the virtual supplies, the processor 901 specifically executes the following instructions:
overlaying, at the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, a supply area identifier of a preset size corresponding to the virtual supplies.
In an optional embodiment of the present application, when executing the step of controlling virtual supplies to be dropped in the virtual scene, the processor 901 specifically executes the following instructions:
generating the virtual supplies at a preset height of the virtual scene;
and controlling the virtual supplies to move from the preset height to the target position in the virtual scene.
In an optional embodiment of the present application, when executing the step of displaying, in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, the supply area identifier corresponding to the virtual supplies, the processor 901 specifically executes the following instructions:
while the virtual supplies are controlled to move from the preset height to the target position in the virtual scene, displaying the supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, the processor 901 further executes the following instructions:
in response to the virtual supplies moving from the preset height to the target position in the virtual scene, updating the supply area identifier to a supply location identifier displayed at the position of the scene thumbnail map corresponding to the target position.
In an optional embodiment of the present application, the processor 901 further executes the following instructions:
displaying, in a distinguishable manner, the supply area identifiers corresponding to the virtual supplies displayed in map areas corresponding to different target scene areas in the scene thumbnail map, where different target scene areas are used for dropping virtual supplies of different grades.
In an optional embodiment of the present application, the processor 901 further executes the following instructions:
displaying a timing control on the graphical user interface, where the timing control performs a countdown display based on the moment when the virtual supplies are generated at the preset height of the virtual scene and the moment when the virtual supplies reach the target position in the virtual scene from the preset height;
and adjusting, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, when adjusting, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map, the processor 901 specifically executes the following instructions:
filling, with a target color and according to the countdown progress displayed in the timing control, the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map, where the countdown progress is positively correlated with the filled area of the supply area identifier, and the target color is determined according to the grade of the virtual supplies dropped in the target scene area.
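The fill behavior described above — the filled portion of the area identifier growing with the countdown, in a color chosen by supply grade — can be sketched as follows. This is an illustrative sketch only; the function names, the grade names, and the color palette are assumptions, not taken from the patent.

```python
def fill_fraction(now, spawn_time, land_time):
    """Fraction of the supply area identifier to fill with the target color.

    Grows from 0.0 at the moment the supplies spawn at the preset height
    to 1.0 at the moment they reach the target position (the positive
    correlation described above), clamped to [0, 1].
    """
    total = land_time - spawn_time
    if total <= 0:
        return 1.0
    return min(1.0, max(0.0, (now - spawn_time) / total))

# Assumed grade-to-color palette; the patent only says the target color
# is determined by the grade of the dropped supplies.
GRADE_COLORS = {"common": "white", "rare": "blue", "epic": "gold"}

def marker_style(now, spawn_time, land_time, grade):
    """Display form of the area identifier at a given moment."""
    return {"color": GRADE_COLORS[grade],
            "filled": fill_fraction(now, spawn_time, land_time)}
```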
In an optional embodiment of the present application, the processor 901 further executes the following instructions:
displaying a virtual object identifier corresponding to the virtual object in the scene thumbnail map displayed in the graphical user interface,
where the position of the virtual object identifier in the scene thumbnail map is updated in real time according to the movement of the virtual object in the virtual scene.
In an optional embodiment of the present application, the processor 901 further executes the following instructions:
in response to the virtual supplies in the target scene area being picked up, deleting the supply area identifier corresponding to the virtual supplies from the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, the preset condition includes one of the following:
the game progress of the virtual battle task reaches a preset time point;
a virtual object in the virtual battle task executes a virtual vehicle calling event, where the virtual vehicle is used for dropping virtual supplies into the virtual scene;
the number of virtual objects surviving in the virtual battle task reaches a set number.
According to the game control method provided by the embodiments of the present application, a player can control the virtual object to move toward the target scene area according to the relative position between the supply area identifier and the virtual object identifier displayed in the scene thumbnail map; after the virtual object moves into the target scene area, the direction guide identifier is displayed on the graphical user interface to guide the virtual object from its current position to the virtual supplies. In this way, the precise position of the virtual supplies need not be displayed, so the computation needed to mark that precise position can be avoided and the consumption of computer resources is reduced. Because the player cannot predict the actual position of the virtual supplies, their position in the virtual scene has a degree of randomness, which accounts for individual differences between players and improves the fairness of the game. Experienced players cannot camp near the virtual supplies in advance, so more new players are encouraged to compete for the supplies, speeding up their development; and because more players search for the virtual supplies within a similar period, different camps inevitably fight and some players are eliminated, shortening the duration of a single game. After moving into the target scene area, the player can quickly find the virtual supplies according to the direction guide identifier, which increases the speed at which the player searches for virtual supplies in the virtual scene, helps shorten the duration of a single game, and improves human-computer interaction efficiency.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, performs the following instructions:
in response to the virtual battle task meeting a preset condition, controlling virtual supplies to be dropped in a virtual scene;
determining a target scene area corresponding to the dropped virtual supplies in the virtual scene;
and in response to a virtual object moving into the target scene area, displaying a direction guide identifier on the graphical user interface, where the pointing direction of the direction guide identifier is controlled according to the target position of the virtual supplies in the virtual scene.
In an optional embodiment of the present application, the target position is located within the target scene area.
In an optional embodiment of the present application, when displaying the direction guide identifier on the graphical user interface in response to the virtual object moving into the target scene area, the computer program specifically executes the following instructions:
in response to the virtual object moving into the target scene area, detecting the distance between the virtual object and the virtual supplies in the target scene area;
and in response to the distance between the virtual object and the virtual supplies in the target scene area being smaller than a preset distance threshold, displaying the direction guide identifier on the graphical user interface.
In an optional embodiment of the present application, the computer program further executes the following instructions:
displaying a supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, when displaying, in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, the supply area identifier corresponding to the virtual supplies, the computer program specifically executes the following instructions:
overlaying, at the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, a supply area identifier of a preset size corresponding to the virtual supplies.
In an optional embodiment of the present application, when executing the step of controlling virtual supplies to be dropped in the virtual scene, the computer program specifically executes the following instructions:
generating the virtual supplies at a preset height of the virtual scene;
and controlling the virtual supplies to move from the preset height to the target position in the virtual scene.
In an optional embodiment of the present application, when executing the step of displaying, in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, the supply area identifier corresponding to the virtual supplies, the computer program specifically executes the following instructions:
while the virtual supplies are controlled to move from the preset height to the target position in the virtual scene, displaying the supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
In an optional embodiment of the present application, the computer program further executes the following instructions:
in response to the virtual supplies moving from the preset height to the target position in the virtual scene, updating the supply area identifier to a supply location identifier displayed at the position of the scene thumbnail map corresponding to the target position.
In an optional embodiment of the present application, the computer program further executes the following instructions:
displaying, in a distinguishable manner, the supply area identifiers corresponding to the virtual supplies displayed in map areas corresponding to different target scene areas in the scene thumbnail map, where different target scene areas are used for dropping virtual supplies of different grades.
In an optional embodiment of the present application, the computer program further executes the following instructions:
displaying a timing control on the graphical user interface, where the timing control performs a countdown display based on the moment when the virtual supplies are generated at the preset height of the virtual scene and the moment when the virtual supplies reach the target position in the virtual scene from the preset height;
and adjusting, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, when adjusting, according to the countdown displayed in the timing control, the display form of the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map, the computer program specifically executes the following instructions:
filling, with a target color and according to the countdown progress displayed in the timing control, the supply area identifier corresponding to the virtual supplies displayed in the map area corresponding to the target scene area in the scene thumbnail map, where the countdown progress is positively correlated with the filled area of the supply area identifier, and the target color is determined according to the grade of the virtual supplies dropped in the target scene area.
In an optional embodiment of the present application, the computer program further executes the following instructions:
displaying a virtual object identifier corresponding to the virtual object in the scene thumbnail map displayed in the graphical user interface,
where the position of the virtual object identifier in the scene thumbnail map is updated in real time according to the movement of the virtual object in the virtual scene.
In an optional embodiment of the present application, the computer program further executes the following instructions:
in response to the virtual supplies in the target scene area being picked up, deleting the supply area identifier corresponding to the virtual supplies from the map area corresponding to the target scene area in the scene thumbnail map.
In an optional embodiment of the present application, the preset condition includes one of the following:
the game progress of the virtual battle task reaches a preset time point;
a virtual object in the virtual battle task executes a virtual vehicle calling event, where the virtual vehicle is used for dropping virtual supplies into the virtual scene;
the number of virtual objects surviving in the virtual battle task reaches a set number.
According to the game control method provided by the embodiments of the present application, a player can control the virtual object to move toward the target scene area according to the relative position between the supply area identifier and the virtual object identifier displayed in the scene thumbnail map; after the virtual object moves into the target scene area, the direction guide identifier is displayed on the graphical user interface to guide the virtual object from its current position to the virtual supplies. In this way, the precise position of the virtual supplies need not be displayed, so the computation needed to mark that precise position can be avoided and the consumption of computer resources is reduced. Because the player cannot predict the actual position of the virtual supplies, their position in the virtual scene has a degree of randomness, which accounts for individual differences between players and improves the fairness of the game. Experienced players cannot camp near the virtual supplies in advance, so more new players are encouraged to compete for the supplies, speeding up their development; and because more players search for the virtual supplies within a similar period, different camps inevitably fight and some players are eliminated, shortening the duration of a single game. After moving into the target scene area, the player can quickly find the virtual supplies according to the direction guide identifier, which increases the speed at which the player searches for virtual supplies in the virtual scene, helps shorten the duration of a single game, and improves human-computer interaction efficiency.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present application, intended to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may still modify the technical solutions described in the foregoing embodiments, or readily conceive of changes or equivalent substitutions for some of their technical features, within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and shall all be covered by its protection scope. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (17)

1. A game control method, wherein a graphical user interface is provided through a terminal device and a virtual scene of a current virtual battle task is displayed on the graphical user interface, the method comprising:
in response to the virtual battle task meeting a preset condition, controlling virtual supplies to be dropped in the virtual scene;
determining a target scene area corresponding to the dropped virtual supplies in the virtual scene;
and in response to a virtual object moving into the target scene area, displaying a direction guide identifier on the graphical user interface, wherein a pointing direction of the direction guide identifier is controlled according to a target position of the virtual supplies in the virtual scene.
2. The method of claim 1, wherein the target position is located within the target scene area.
3. The method of claim 2, wherein the displaying the direction guide identifier on the graphical user interface in response to the virtual object moving into the target scene area comprises:
in response to the virtual object moving into the target scene area, detecting a distance between the virtual object and the virtual supplies in the target scene area;
and in response to the distance between the virtual object and the virtual supplies in the target scene area being smaller than a preset distance threshold, displaying the direction guide identifier on the graphical user interface.
4. The method of claim 2, further comprising:
displaying a supply area identifier corresponding to the virtual supplies in a map area corresponding to the target scene area in a scene thumbnail map displayed in the graphical user interface.
5. The method of claim 4, wherein the displaying the supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface comprises:
overlaying, at the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface, a supply area identifier of a preset size corresponding to the virtual supplies.
6. The method of claim 4, wherein the step of controlling virtual supplies to be dropped in the virtual scene comprises:
generating the virtual supplies at a preset height of the virtual scene;
and controlling the virtual supplies to move from the preset height to the target position in the virtual scene.
7. The method of claim 6, wherein the step of displaying the supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface comprises:
while the virtual supplies are controlled to move from the preset height to the target position in the virtual scene, displaying the supply area identifier corresponding to the virtual supplies in the map area corresponding to the target scene area in the scene thumbnail map displayed in the graphical user interface.
8. The method of claim 7, further comprising:
in response to the virtual supplies moving from the preset height to the target position in the virtual scene, updating the supply area identifier to a supply location identifier displayed at the position of the scene thumbnail map corresponding to the target position.
9. The method of claim 4, further comprising:
displaying, in a distinguishable manner, the supply area identifiers corresponding to the virtual supplies displayed in map areas corresponding to different target scene areas in the scene thumbnail map, wherein different target scene areas are used for dropping virtual supplies of different grades.
10. The method of claim 7, further comprising:
displaying a timing control on the graphical user interface; the timing control is used for performing countdown display according to the moment when the virtual goods and materials are generated at the preset height of the virtual scene and the moment when the virtual goods and materials move to the target position of the virtual scene from the preset height;
and adjusting the display form of the material area identifier corresponding to the virtual material, which is displayed in the map area corresponding to the target scene area in the scene thumbnail map, according to the countdown displayed in the timing control.
11. The method of claim 10, wherein the adjusting the display form of the material area identifier corresponding to the virtual material displayed in the map area corresponding to the target scene area in the scene thumbnail map according to the countdown displayed in the timing control comprises:
filling material area identifications corresponding to the virtual materials, which are displayed in a map area corresponding to the target scene area in the scene thumbnail map, with target colors according to the countdown time progress displayed in the timing control; the countdown time progress is in a correlation relation with the filling area of the material area mark, and the target color is determined according to the level of the virtual materials put in the target scene area.
12. The method of claim 4, further comprising:
displaying a virtual object identifier corresponding to the virtual object in the scene thumbnail map displayed in the graphical user interface;
wherein the position of the virtual object identifier in the scene thumbnail map is updated in real time according to the movement of the virtual object in the virtual scene.
13. The method of claim 4, further comprising:
and in response to the virtual materials in the target scene area being picked up, controlling to delete the material area identifier corresponding to the virtual materials from the map area of the scene thumbnail map corresponding to the target scene area.
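The minimap behavior of claims 12 and 13 can be sketched as follows: a virtual object identifier tracks the object's scene position in real time, and a material area identifier is removed once the materials are picked up. The linear world-to-map transform and all names are assumptions, not details from the patent.

```python
def world_to_minimap(x: float, y: float, world_size: float, map_size: float):
    # Scale scene coordinates into scene-thumbnail-map pixel coordinates.
    return (x / world_size * map_size, y / world_size * map_size)

minimap_markers = {"material_area_7": (120, 80)}

# Real-time update of the virtual object identifier (claim 12).
obj_marker = world_to_minimap(500.0, 250.0, world_size=1000.0, map_size=200.0)

# Deletion of the material area identifier on pickup (claim 13).
def on_materials_picked_up(area_id: str) -> None:
    minimap_markers.pop(area_id, None)

on_materials_picked_up("material_area_7")
print(obj_marker, minimap_markers)  # (100.0, 50.0) {}
```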
14. The method according to claim 1, wherein the preset condition comprises one of:
the game progress of the virtual battle mission reaches a preset time point;
a virtual object in the virtual battle mission executes a virtual vehicle summoning event, wherein the virtual vehicle is used for dropping virtual materials into the virtual scene;
the number of virtual objects surviving in the virtual battle mission reaches a set number.
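The three alternative triggers of claim 14 amount to a simple predicate over the battle state. The dataclass, field names, and threshold values below are illustrative assumptions; only the three conditions themselves come from the claim.

```python
from dataclasses import dataclass

@dataclass
class BattleState:
    elapsed: float        # game progress in seconds
    vehicle_called: bool  # a virtual vehicle summoning event occurred
    survivors: int        # number of surviving virtual objects

def preset_condition_met(s: BattleState,
                         time_point: float = 300.0,
                         survivor_cap: int = 20) -> bool:
    return (s.elapsed >= time_point          # progress reaches a preset time point
            or s.vehicle_called              # vehicle summoning event executed
            or s.survivors <= survivor_cap)  # survivor count reaches the set number

print(preset_condition_met(BattleState(120.0, True, 50)))  # True
```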
15. A game control apparatus for providing a graphical user interface on which a virtual scene of a virtual battle mission is displayed through a terminal device, the apparatus comprising:
the material control module, configured to control virtual materials to be dropped into the virtual scene in response to the virtual battle mission meeting a preset condition;
the region determination module, configured to determine, in the virtual scene, a target scene area corresponding to the dropped virtual materials;
and the direction guidance module, configured to display a direction guidance identifier on the graphical user interface in response to the virtual object moving to the target scene area, and to control the pointing direction of the direction guidance identifier according to the target position of the virtual materials in the virtual scene.
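The direction guidance described in claim 15 reduces to deriving a pointing direction from the vector between the virtual object and the target position of the virtual materials. Using `atan2` for the angle is an implementation assumption, not a detail from the patent.

```python
import math

def guidance_angle(obj_pos, target_pos):
    # Vector from the virtual object to the material target position.
    dx = target_pos[0] - obj_pos[0]
    dy = target_pos[1] - obj_pos[1]
    # Quadrant-aware angle (degrees) that the guidance identifier points at.
    return math.degrees(math.atan2(dy, dx))

print(guidance_angle((0.0, 0.0), (10.0, 10.0)))  # 45.0
```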
16. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the game control method according to any one of claims 1 to 14.
17. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the game control method according to any one of claims 1 to 14.
CN202210968915.6A 2022-08-12 2022-08-12 Game control method and device, electronic equipment and storage medium Pending CN115317912A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210968915.6A CN115317912A (en) 2022-08-12 2022-08-12 Game control method and device, electronic equipment and storage medium
PCT/CN2022/128977 WO2024031849A1 (en) 2022-08-12 2022-11-01 Game control method and apparatus, and electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN115317912A (en) 2022-11-11

Family

ID=83923936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210968915.6A Pending CN115317912A (en) 2022-08-12 2022-08-12 Game control method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115317912A (en)
WO (1) WO2024031849A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6078724B2 (en) * 2012-05-16 2017-02-15 株式会社ユピテル System, electronic device and program
JP6877902B2 (en) * 2016-07-06 2021-05-26 任天堂株式会社 Game programming, game processing methods, and game systems
CN109876442A (en) * 2019-04-15 2019-06-14 网易(杭州)网络有限公司 Route indicating means, equipment and storage medium in game based on map
CN111760288B (en) * 2020-06-10 2024-03-12 网易(上海)网络有限公司 Method, device, terminal and storage medium for displaying direction in virtual three-dimensional scene
CN112044069B (en) * 2020-09-10 2023-02-03 腾讯科技(深圳)有限公司 Object prompting method, device, equipment and storage medium in virtual scene
CN113499586B (en) * 2021-07-08 2024-04-12 网易(杭州)网络有限公司 Information prompting method and device in game, electronic equipment and storage medium
CN115089959A (en) * 2022-06-21 2022-09-23 网易(杭州)网络有限公司 Direction prompting method and device in game and electronic terminal

Also Published As

Publication number Publication date
WO2024031849A1 (en) 2024-02-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination