CN113559508B - Method, device, equipment and storage medium for prompting azimuth of virtual object - Google Patents

Method, device, equipment and storage medium for prompting azimuth of virtual object

Info

Publication number
CN113559508B
CN113559508B (application CN202110853128.2A)
Authority
CN
China
Prior art keywords
virtual object
current
azimuth
virtual
target virtual
Prior art date
Legal status
Active
Application number
CN202110853128.2A
Other languages
Chinese (zh)
Other versions
CN113559508A (en)
Inventor
王翌希
胡志鹏
程龙
刘勇成
袁思思
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110853128.2A
Publication of CN113559508A
Application granted
Publication of CN113559508B
Legal status: Active (current)
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, for prompting the player, e.g. by displaying a game menu
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen, for displaying an additional top view, e.g. radar screens or maps
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/847 Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a method, a device, equipment and a storage medium for prompting the azimuth of a virtual object, wherein the method comprises the following steps: receiving an azimuth viewing instruction for a target virtual object; responding to the azimuth viewing instruction, and acquiring relative azimuth information of the target virtual object and a current virtual object in a virtual scene, wherein the current virtual object is a virtual object under the control of a current terminal; and prompting the relative azimuth information. According to the application, the relative azimuth information between the target virtual object and the current virtual object is obtained in real time in response to the viewing instruction and is prompted, so that a player can view the relative azimuth between himself and his teammates in real time.

Description

Method, device, equipment and storage medium for prompting azimuth of virtual object
Technical Field
The application relates to the technical field of electronic games, in particular to a method, a device, equipment and a storage medium for prompting the azimuth of a virtual object.
Background
MMO (Massively Multiplayer Online) games generally refer to online games in which any single server can host a large number of players (e.g., about 1,000 players) online at the same time, and are also called massively multiplayer online games. In a team-based game a player can form a team with other players, and during play the player usually needs to check on his teammates. The common practice is to click a teammate's status bar: if the teammate is close enough to the player, the player can see the distance to that teammate.
This way of checking teammate status is better suited to the ordinary activities of an MMO game. In a special game mode such as the "chicken-eating" (battle royale) mode, once teammates are beyond the viewable distance the player can no longer see the distance between himself and his teammates. When an emergency requires the team to regroup, it is difficult to learn the accurate relative positions of the other teammates, so the corresponding tactical response cannot be made quickly, which adversely affects the game result.
Disclosure of Invention
The embodiment of the application aims to provide a method, a device, equipment and a storage medium for prompting the azimuth of a virtual object, which are used for acquiring the relative azimuth information between a target virtual object and a current virtual object in real time by responding to a checking instruction and giving prompts so that a player can check the relative azimuth between the player and teammates in real time.
The first aspect of the embodiment of the application provides a method for prompting the azimuth of a virtual object, which comprises the following steps: receiving an azimuth checking instruction for the target virtual object; responding to the azimuth checking instruction, and acquiring relative azimuth information of the target virtual object and a current virtual object in a virtual scene, wherein the current virtual object is a virtual object under the control of a current terminal; and prompting the relative azimuth information.
In an embodiment, the responding to the direction viewing instruction, obtaining the relative direction information of the target virtual object and the current virtual object in the virtual scene includes: and responding to the azimuth checking instruction, and calculating the relative azimuth information of the target virtual object and the current virtual object in a virtual map.
In an embodiment, the calculating the relative orientation information of the target virtual object and the current virtual object in the virtual map includes: and calculating the relative position information of the target virtual object and the current virtual object in the virtual map, and calculating the travel time required by the current virtual object to reach the position of the target virtual object from the current position in the virtual map.
In an embodiment, the prompting the relative orientation information includes: and generating a prompt element pointing to the target virtual object from the current virtual object in the virtual scene, and displaying the prompt element.
In one embodiment, the method further comprises: and when the target virtual objects are multiple, prompting relative azimuth information between every two of the multiple target virtual objects.
In one embodiment, the receiving the direction viewing instruction for the target virtual object includes: and receiving a direction viewing instruction of the target virtual object through a shortcut key or a virtual trigger button on the current display interface.
A second aspect of an embodiment of the present application provides an azimuth indicating device for a virtual object, including: the receiving module is used for receiving an azimuth checking instruction of the target virtual object; the response module is used for responding to the azimuth checking instruction and obtaining the relative azimuth information of the target virtual object and the current virtual object in the virtual scene, wherein the current virtual object is a virtual object under the control of the current terminal; and the prompting module is used for prompting the relative azimuth information.
In an embodiment, the response module is configured to: and responding to the azimuth checking instruction, and calculating the relative azimuth information of the target virtual object and the current virtual object in a virtual map.
In an embodiment, the calculating the relative orientation information of the target virtual object and the current virtual object in the virtual map includes: and calculating the relative position information of the target virtual object and the current virtual object in the virtual map, and calculating the travel time required by the current virtual object to reach the position of the target virtual object from the current position in the virtual map.
In one embodiment, the prompting module is configured to: and generating a prompt element pointing to the target virtual object from the current virtual object in the virtual scene, and displaying the prompt element.
In an embodiment, the prompting module is further configured to: and when the target virtual objects are multiple, prompting relative azimuth information between every two of the multiple target virtual objects.
In an embodiment, the receiving module is configured to: and receiving a direction viewing instruction of the target virtual object through a shortcut key or a virtual trigger button on the current display interface.
A third aspect of an embodiment of the present application provides an electronic device, including: a memory for storing a computer program; a processor for executing the computer program to implement the method of the first aspect of the embodiment of the present application and any of the embodiments thereof.
A fourth aspect of an embodiment of the present application provides a non-transitory electronic device readable storage medium, comprising: a program which, when run by an electronic device, causes the electronic device to perform the method of the first aspect of the embodiments of the application and any of its embodiments.
According to the azimuth prompting method, device, equipment and storage medium for the virtual object, provided by the application, the azimuth checking instruction of a player to the target virtual object is received in real time, the azimuth checking instruction is responded in real time, the relative azimuth information between the target virtual object and the current virtual object is obtained, and the information is prompted, so that the player can check the relative azimuth between the player and teammates in real time.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should not be considered as limiting the scope; other related drawings can be derived from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the application;
FIG. 2 is a schematic diagram of a virtual scene according to an embodiment of the application;
FIG. 3 is a flowchart illustrating a method for prompting an azimuth of a virtual object according to an embodiment of the application;
FIG. 4 is a flowchart illustrating a method for prompting an azimuth of a virtual object according to an embodiment of the application;
FIG. 5A is a schematic diagram of an interactive interface of a method for prompting the azimuth of a virtual object according to an embodiment of the application;
FIG. 5B is a schematic diagram of an interactive interface of a method for prompting the azimuth of a virtual object according to an embodiment of the application;
fig. 6 is a schematic structural diagram of an azimuth indicating device for a virtual object according to an embodiment of the application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. In the description of the present application, the terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
As shown in fig. 1, the present embodiment provides an electronic apparatus 1 including: at least one processor 11 and a memory 12, one processor being exemplified in fig. 1. The processor 11 and the memory 12 are connected by a bus 10. The memory 12 stores instructions executable by the processor 11, which are executed by the processor 11 to enable the electronic device 1 to perform all or part of the methods in the embodiments described below to implement prompting the relative orientation information of the current virtual object and the target virtual object in the virtual scene.
In an embodiment, the electronic device 1 may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, or the like.
For a more clear description of the solution of this embodiment, the terms that will be referred to will now be interpreted as follows:
Virtual scene: a scene that an application displays (or provides) while running on a terminal. The virtual scene can be a simulated environment of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene; the following embodiments take a three-dimensional virtual scene as an example, but are not limited thereto.
In one embodiment, as shown in fig. 2, the virtual scene 200 may be generated by an application program in a computer device such as the electronic device 1 and displayed based on hardware (such as a screen) in the electronic device 1.
In an embodiment, a virtual map 201 may be disposed in the virtual scene 200, and virtual geographic location information of the virtual scene 200 is configured in the virtual map 201. As shown in fig. 2, a zoomed out virtual map 201 may be displayed on the interactive interface 20.
Virtual object: refers to movable objects and non-movable objects in the virtual scene 200. A movable object may be at least one of a virtual character, a virtual animal, and a virtual vehicle. A non-movable object may be a virtual plant, a virtual mountain forest, a virtual stream, or the like. The virtual object A (player A) and the virtual object B (player B) shown in fig. 2 are exemplified as virtual characters, and the scene may further include virtual mountain forests.
In one embodiment, when the virtual scene 200 is a three-dimensional virtual scene, the virtual objects are three-dimensional stereoscopic models created based on skeletal animation techniques. Each virtual object has its own shape, volume, and orientation in the three-dimensional virtual scene 200 and occupies a portion of the space in it.
In an embodiment, the processor 11 of the electronic device 1 may generate the virtual scene 200 by executing or calling the program code and data stored in the memory 12, and present the generated virtual scene 200 through an external output/input device. In the process of displaying the virtual scene 200, the capacitive touch system can be used for detecting the touch operation executed when the user interacts with the virtual scene 200, and the external keyboard can also be used for detecting the interaction operation between the user and the virtual scene 200.
Taking a multiplayer chicken-eating electronic game scene as an example: the chicken-eating game is a large-scale battle-royale game mode. Its play is simple, it blends exploration and collection elements, and it is a survival game in which the team that holds out to the last round wins. The game places a large number of players on a virtual map 201 with very large and complex terrain. In terms of team formation, a player may choose single-player mode, or form a team of two or four players. Players survey the route at the start, select a landing point, then pick up resources (such as armor, firearms, ammunition, medical kits, magnifying scopes, and the like), and use the terrain either to defend or to attack, while constantly facing survival pressure such as the shrinking poison circle or random bombardment; the player's team obtains the chicken-eating achievement after it survives long enough to eliminate the last opponent.
In the above-mentioned chicken-eating game scenario, when a player selects a multi-player team, for example a team of 3 to 5 players, the players need to cooperate in combat, and the resources in any one area are limited, so teammates often scatter to collect resources. When an emergency requires the team to regroup, in the prior art it is difficult for the player to learn the relative positions of the other teammates, so the corresponding tactical response cannot be made quickly, which affects the game result. The embodiment of the application can solve the problem that players cannot learn each other's positions, helping a player quickly learn the relative positions of himself and his teammates so as to respond to the emergency more quickly and correctly.
Please refer to fig. 3, which is a method for prompting the azimuth of a virtual object according to an embodiment of the present application, wherein the method can be executed by the electronic device 1 shown in fig. 1 as a current terminal, and can be applied to the virtual scene 200 of the chicken-eating electronic game shown in fig. 2, so as to prompt the relative azimuth information of the current virtual object and the target virtual object in the virtual scene 200. The method comprises the following steps:
Step 301: and receiving an azimuth viewing instruction for the target virtual object.
In this step, the virtual scene 200 may include a plurality of virtual objects. A virtual object may be a virtual character object under the control of a user terminal, a virtual vehicle in the virtual scene 200, or a virtual object in any other form controlled by a terminal, such as a virtual animal. Taking a virtual character as the example of a virtual object, the target virtual object can be a teammate of the current player; when the player needs to view a teammate during the game, a corresponding azimuth viewing instruction can be triggered for that specific teammate, and the terminal receives the azimuth viewing instruction in real time.
In one embodiment, the terminal may receive the azimuth viewing instruction for the target virtual object through a shortcut key or a virtual trigger button on the current display interface, and there may be a plurality of target virtual objects. For example, three players can form a team together: a shortcut key F3 corresponding to the current virtual object A controlled by the current terminal can be preconfigured when the team is formed, and the remaining two teammates can correspond to the shortcut keys F1 and F2 respectively. The player can press F1 to trigger an azimuth viewing instruction for teammate No. 1 (a target virtual object), and press F1 again to cancel azimuth viewing. If F2 is pressed after F1, azimuth viewing instructions for teammate No. 1 and teammate No. 2 are triggered respectively.
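As an illustration only, the following minimal sketch (in Python; the key names, class, and data layout are assumptions made for explanation and do not come from the patent) shows how such press-to-view, press-again-to-cancel shortcut handling could be modelled:

```python
# Hypothetical sketch: toggling azimuth viewing of teammates via shortcut keys.
# The key bindings and class below are illustrative assumptions, not the patent's implementation.

class AzimuthViewState:
    def __init__(self, key_to_teammate):
        # e.g. {"F1": "teammate_1", "F2": "teammate_2"}
        self.key_to_teammate = key_to_teammate
        self.active_targets = set()  # teammates whose azimuth is currently being shown

    def on_shortcut(self, key):
        """First press triggers azimuth viewing for the bound teammate; pressing again cancels it."""
        teammate = self.key_to_teammate.get(key)
        if teammate is None:
            return self.active_targets
        if teammate in self.active_targets:
            self.active_targets.discard(teammate)  # second press cancels viewing
        else:
            self.active_targets.add(teammate)      # first press starts viewing
        return self.active_targets


state = AzimuthViewState({"F1": "teammate_1", "F2": "teammate_2"})
print(state.on_shortcut("F1"))   # {'teammate_1'}
print(state.on_shortcut("F2"))   # {'teammate_1', 'teammate_2'}
print(state.on_shortcut("F1"))   # {'teammate_2'}  (viewing of teammate No. 1 cancelled)
```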
Step 302: in response to the direction viewing instruction, the relative direction information of the target virtual object and the current virtual object in the virtual scene 200 is acquired.
In this step, the current virtual object is the virtual object under the control of the current terminal, and the current virtual object and the target virtual object may be teammates in the same team. When an azimuth viewing instruction for a target virtual object is received from the player, the terminal responds to the instruction and acquires the relative azimuth information of the target virtual object and the current virtual object in the virtual scene 200 in real time. No limitation is placed here on the distance between the virtual objects. The relative azimuth information may be the relative distance between the two virtual objects and the relative direction of deviation.
Step 303: the relative orientation information is prompted.
In this step, after the relative azimuth information is obtained, it needs to be prompted so that the player can view the distance and position between himself and his teammates in real time; this helps the player quickly learn the relative positions of his teammates and respond to an emergency more quickly and correctly.
In an alternative embodiment, the relative azimuth information comprises direction indication information, such as direction indication information characterized by an arrow or a path-finding route. In another alternative embodiment, the relative azimuth information comprises both direction indication information, such as an arrow or a path-finding route, and distance indication information expressed numerically.
In an alternative embodiment, the target virtual objects corresponding to the azimuth viewing instruction include a first virtual object and a second virtual object. Compared with the other embodiments, the relative azimuth information then also includes distance indication information between the first virtual object and the second virtual object. In one embodiment, step 303 may include: generating a prompt element 202 in the virtual scene 200 that points from the current virtual object to the target virtual object, and displaying the prompt element 202. The prompt element 202 may be one or more of voice, text, or icon symbols; the icon symbols may be indicator lines, arrows, and the like.
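As a rough sketch of such a prompt element (the PromptElement structure and helper below are hypothetical names invented for illustration; the patent does not prescribe any particular data layout), the element carries the text and the geometry of the indicator line pointing from the current object to the target object:

```python
import math
from dataclasses import dataclass

@dataclass
class PromptElement:
    text: str            # e.g. "about 23 feet (about 6 s) from teammate B"
    origin: tuple        # map position of the current virtual object
    target: tuple        # map position of the target virtual object
    style: str = "dashed_indicator_line"

def indicator_angle_deg(origin, target):
    """Direction (in degrees) the indicator line or arrow should point, from origin toward target."""
    dx, dy = target[0] - origin[0], target[1] - origin[1]
    return math.degrees(math.atan2(dy, dx))

element = PromptElement("about 23 feet (about 6 s) from teammate B", (0.0, 0.0), (15.0, 17.0))
print(indicator_angle_deg(element.origin, element.target))  # angle used to draw the dashed line
```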
According to the azimuth prompting method of the virtual object, the azimuth checking instruction of the player on the target virtual object is received in real time, the azimuth checking instruction is responded in real time, the relative azimuth information between the target virtual object and the current virtual object is obtained, and the prompt is given, so that the player can check the relative azimuth between the player and teammates in real time.
Please refer to fig. 4, which is a method for prompting the azimuth of a virtual object according to an embodiment of the present application, wherein the method can be executed by the electronic device 1 shown in fig. 1 as a current terminal, and can be applied to the virtual scene 200 of the chicken-eating electronic game shown in fig. 2, so as to prompt the relative azimuth information of the current virtual object and the target virtual object in the virtual scene 200. The method comprises the following steps:
Step 401: and receiving an azimuth viewing instruction for the target virtual object. See the description of step 301 in the above embodiments for details.
Step 402: in response to the direction viewing instruction, a virtual map 201 of the virtual scene 200 is acquired in accordance with the direction viewing instruction.
In this step, the virtual map 201 includes geographical distribution information of various virtual objects in the virtual scene 200, such as geographical distribution of mountains, rivers, geographical distribution of virtual characters, virtual buildings, virtual vehicles, virtual plants, and virtual animals, and the like.
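Purely as an illustrative sketch (the dictionary layout and field names are assumptions, not the patent's data model), such a virtual map could be represented as a lookup from object names to map coordinates:

```python
# Hypothetical sketch of a virtual map holding geographic distribution information.
virtual_map = {
    "terrain": {
        "mountains": [(120.0, 340.0), (410.0, 95.0)],
        "rivers": [[(0.0, 200.0), (150.0, 230.0), (300.0, 210.0)]],
    },
    "objects": {
        "virtual_object_A": {"position": (0.0, 0.0), "type": "character"},
        "virtual_object_B": {"position": (15.0, 17.0), "type": "character"},
        "building_1": {"position": (200.0, 180.0), "type": "building"},
        "vehicle_1": {"position": (90.0, 60.0), "type": "vehicle"},
    },
}

def object_position(vmap, name):
    """Look up the map coordinates of a named virtual object."""
    return vmap["objects"][name]["position"]

print(object_position(virtual_map, "virtual_object_B"))  # (15.0, 17.0)
```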
Step 403: the relative position information of the target virtual object and the current virtual object in the virtual map 201 is calculated.
In this step, geographical distribution position information of the current virtual object and the target virtual object in the virtual map 201 may be acquired based on the virtual map 201, and then relative orientation information between the two may be calculated.
In one embodiment, step 403 may specifically include: calculating the relative position information of the target virtual object and the current virtual object in the virtual map 201. The relative position information may be the position coordinate information of the current virtual object and the target virtual object in the virtual map 201, for example the center-point coordinates of the two virtual characters, and the coordinate distance between the two can be calculated from these two sets of position coordinates.
In one embodiment, step 403 may specifically include: calculating the travel time required for the current virtual object to reach the target virtual object from its current position in the virtual map 201. In an actual game scene, a player not only wants to know the distance to a teammate but usually also wants to know quickly how long it takes to reach the teammate's location, especially in an urgent situation. For example, if teammate B sends a distress signal and the player can quickly learn the time required to reach teammate B, the player can make a better rescue plan. Thus, the relative position information may further include the travel time required for the current virtual object to reach the target virtual object from its current position.
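A minimal numeric sketch of both calculations in step 403, under simple assumptions (straight-line distance on the map and a constant movement speed; a real game might instead use the length of a pathfinding route, and all names and figures here are illustrative):

```python
import math

def relative_position_info(current_pos, target_pos, move_speed):
    """Distance, bearing, and estimated travel time from the current object to the target object.

    Assumes straight-line travel at a constant speed; this is an illustrative
    simplification, not the patent's prescribed algorithm.
    """
    dx = target_pos[0] - current_pos[0]
    dy = target_pos[1] - current_pos[1]
    distance = math.hypot(dx, dy)                # coordinate distance between the center points
    bearing = math.degrees(math.atan2(dy, dx))   # relative direction of deviation toward the target
    travel_time = distance / move_speed if move_speed > 0 else float("inf")
    return {"distance": distance, "bearing_deg": bearing, "travel_time_s": travel_time}

# With these made-up coordinates, teammate B is ~23 units away and reachable in ~6 s at 4 units/s.
print(relative_position_info((0.0, 0.0), (15.0, 17.0), move_speed=4.0))
```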
Step 404: a hint element 202 is generated in the virtual scene 200 that points from the current virtual object to the target virtual object, and the hint element 202 is displayed.
In this step, the prompt element 202 may be one or more of voice, text, or icon symbols; the icon symbols may be indicator lines, arrows, and the like. The prompt element 202 may be embedded in the virtual scene 200 to indicate the azimuth information, and may be displayed on the current user interface as interactive information, so that the player can conveniently view the relative azimuth information between the current virtual object and the target virtual object in real time, which improves the player's interactive experience.
In one embodiment, as shown in FIG. 5A, the prompt element 202 may take the form of text plus an icon symbol. Assuming that the virtual object A is the current virtual object, the player of the current terminal presses the shortcut key F2 to trigger an azimuth viewing instruction for the target virtual object B (teammate B), and the acquired relative azimuth information is: the distance between the virtual object A and the virtual object B is 23 feet; at this moment the virtual object B may not be within the currently displayed interactive interface 20, and the virtual object A may need about 6 seconds to reach the virtual object B. A text element of the azimuth information, "about 23 feet (about 6 s) from teammate B", may be generated together with an azimuth indication dashed line B, and the indication dashed line may be displayed both in the zoomed-out virtual map 201 and in the virtual scene 200 for the user to view more conveniently.
In an embodiment, when there are a plurality of target virtual objects, step 404 may further include prompting the relative azimuth information between every two of the plurality of target virtual objects. In an actual game scene there are often several teammates in a team. When a player selects team members, a shortcut key corresponding to the azimuth-viewing function of each member can be preconfigured; for example, each member is given a different number, and the number is bound to that member's shortcut key. When the player presses a teammate's shortcut key, he can view his own azimuth relative to that teammate as well as the azimuth information of the other teammates, which helps the player devise a better coordination strategy.
In one embodiment, as shown in FIG. 5B, assume a chicken-eating game is being played and the current player selects a team of three; after landing together, the team members travel to different locations in the virtual scene 200 to collect resources. If the current virtual object A of the current player encounters an enemy and wants to seek help from the other teammates, the current player can trigger the shortcut keys for viewing the other teammates' azimuths, for example triggering azimuth viewing instructions for the target virtual object B and the target virtual object C respectively. After receiving the viewing instructions, the terminal responds: it acquires the relative azimuth information between the current virtual object A and the target virtual object B, between the current virtual object A and the target virtual object C, and between the target virtual object B and the target virtual object C, and generates prompt elements 202 to prompt each of them. As shown in FIG. 5B, the prompt element 202 shown as dashed line B indicates the position between the current virtual object A and the target virtual object B and displays the text element "about 23 feet (about 6 s) from teammate B". The prompt element 202 shown as dashed line C indicates the position between the current virtual object A and the target virtual object C and displays the text element "about 44 feet (about 11 s) from teammate C". The text element "teammate B is about 46 feet from teammate C (about 13 s)" can also be displayed, meaning that if one of teammate B and teammate C stays still, the other needs about 13 seconds to reach him. Meanwhile, the above indication information may also be displayed in the zoomed-out virtual map 201, so that the user can view the azimuth information more intuitively.
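The pairwise case of FIG. 5B could be sketched as follows (again an assumption-laden illustration: the coordinates, speed, and "feet" unit are made up so that the output roughly matches the distances quoted above, and the helper names are not from the patent):

```python
import math
from itertools import combinations

def distance_and_eta(pos1, pos2, move_speed):
    """Straight-line map distance between two points and the time needed to cover it."""
    d = math.hypot(pos2[0] - pos1[0], pos2[1] - pos1[1])
    return d, (d / move_speed if move_speed > 0 else float("inf"))

def pairwise_orientation(positions, move_speed):
    """Relative distance and travel time between every two of the tracked virtual objects."""
    results = {}
    for (name1, pos1), (name2, pos2) in combinations(positions.items(), 2):
        results[(name1, name2)] = distance_and_eta(pos1, pos2, move_speed)
    return results

positions = {"A": (0.0, 0.0), "B": (23.0, 0.0), "C": (8.0, 43.0)}  # hypothetical map coordinates
for (a, b), (dist, eta) in pairwise_orientation(positions, move_speed=4.0).items():
    print(f"{a} is about {dist:.0f} feet (about {eta:.0f} s) from {b}")
```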
In an embodiment, the target virtual object may not be within the currently displayed interactive interface 20; when the player chooses to advance toward the position of one of the target virtual objects, that target virtual object gradually enters the currently displayed interactive interface 20 as the distance between the two becomes shorter.
With the above-mentioned azimuth prompting method for a virtual object, the player can quickly learn the relative distances between himself and the other teammates in the virtual scene 200, for example quickly check which teammate is closest and how far away he is, thereby helping the player make more accurate decisions such as fleeing or holding position, and improving the interactive experience during the game.
Referring to fig. 6, an azimuth indicating device 600 for a virtual object according to an embodiment of the application is applicable to the electronic device 1 shown in fig. 1 and can be applied to the virtual scene 200 of the chicken-eating electronic game shown in fig. 2 to prompt the relative azimuth information of a current virtual object and a target virtual object in the virtual scene 200. The device comprises a receiving module 601, a response module 602 and a prompting module 603, whose relationship in principle is as follows:
The receiving module 601 is configured to receive an azimuth view instruction for a target virtual object. See the description of step 301 in the above embodiments for details.
The response module 602 is configured to obtain, in response to the direction viewing instruction, relative direction information of the target virtual object and the current virtual object in the virtual scene 200, where the current virtual object is a virtual object under control of the current terminal. See the description of step 302 in the above embodiments for details.
A prompting module 603, configured to prompt the relative orientation information. See for details the description of step 303 in the above embodiments.
In one embodiment, the response module 602 is configured to: in response to the bearing view instruction, relative bearing information of the target virtual object and the current virtual object in the virtual map 201 is calculated.
In one embodiment, calculating the relative position information of the target virtual object and the current virtual object in the virtual map 201 includes: the relative position information of the target virtual object and the current virtual object in the virtual map 201 is calculated, and the travel time required for the current virtual object to reach the position of the target virtual object from the current position in the virtual map 201 is calculated.
In one embodiment, the prompting module 603 is configured to: a hint element 202 is generated in the virtual scene 200 that points from the current virtual object to the target virtual object, and the hint element 202 is displayed.
In one embodiment, the prompting module 603 is further configured to: when the target virtual objects are multiple, prompting relative azimuth information between every two of the multiple target virtual objects.
In one embodiment, the receiving module 601 is configured to: and receiving an azimuth viewing instruction of the target virtual object through the shortcut key or the virtual trigger button on the current display interface.
For a detailed description of the above-mentioned orientation prompting device 600 of the virtual object, please refer to the description of the related method steps in the above-mentioned embodiment.
The embodiment of the invention also provides a non-transitory electronic device readable storage medium, which comprises: a program which, when run on an electronic device, causes the electronic device to perform all or part of the flow of the method in the above-described embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like. The storage medium may also comprise a combination of the above kinds of memories.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations are within the scope of the invention as defined by the appended claims.

Claims (9)

1. A method for prompting the azimuth of a virtual object, comprising:
receiving an azimuth checking instruction for the target virtual object;
responding to the azimuth checking instruction, and acquiring relative azimuth information of the target virtual object and a current virtual object in a virtual scene, wherein the current virtual object is a virtual object under the control of a current terminal;
Prompting the relative azimuth information;
The prompting the relative orientation information includes:
generating a prompt element pointing to the target virtual object from the current virtual object in the virtual scene, and displaying the prompt element; the prompt element comprises: a text used for indicating the distance between the current virtual object and the target virtual object, and an icon symbol comprising a direction indication line.
2. The method of claim 1, wherein the obtaining, in response to the direction view instruction, relative direction information of the target virtual object and the current virtual object in the virtual scene comprises:
And responding to the azimuth checking instruction, and calculating the relative azimuth information of the target virtual object and the current virtual object in a virtual map.
3. The method of claim 2, wherein the calculating relative position information of the target virtual object and the current virtual object in a virtual map comprises:
And calculating the relative position information of the target virtual object and the current virtual object in the virtual map, and calculating the travel time required by the current virtual object to reach the position of the target virtual object from the current position in the virtual map.
4. The method as recited in claim 1, further comprising:
And when the target virtual objects are multiple, prompting relative azimuth information between every two of the multiple target virtual objects.
5. The method of claim 1, wherein receiving an orientation view instruction for a target virtual object comprises:
And receiving a direction viewing instruction of the target virtual object through a shortcut key or a virtual trigger button on the current display interface.
6. An azimuth indicating device for a virtual object, comprising:
The receiving module is used for receiving an azimuth checking instruction of the target virtual object;
The response module is used for responding to the azimuth checking instruction and obtaining the relative azimuth information of the target virtual object and the current virtual object in the virtual scene, wherein the current virtual object is a virtual object under the control of the current terminal;
The prompting module is used for prompting the relative azimuth information;
The prompt module is specifically configured to generate a prompt element pointing to the target virtual object from the current virtual object in the virtual scene, and display the prompt element; the prompt element comprises: a text used for indicating the distance between the current virtual object and the target virtual object, and an icon symbol comprising a direction indication line.
7. The apparatus of claim 6, wherein the response module is configured to:
And responding to the azimuth checking instruction, and calculating the relative azimuth information of the target virtual object and the current virtual object in a virtual map.
8. An electronic device, comprising:
A memory for storing a computer program;
A processor for executing the computer program to implement the method of any one of claims 1 to 5.
9. A non-transitory electronic device-readable storage medium, comprising: program which, when run by an electronic device, causes the electronic device to perform the method of any one of claims 1 to 5.
CN202110853128.2A 2021-07-27 2021-07-27 Method, device, equipment and storage medium for prompting azimuth of virtual object Active CN113559508B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110853128.2A CN113559508B (en) 2021-07-27 2021-07-27 Method, device, equipment and storage medium for prompting azimuth of virtual object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110853128.2A CN113559508B (en) 2021-07-27 2021-07-27 Method, device, equipment and storage medium for prompting azimuth of virtual object

Publications (2)

Publication Number Publication Date
CN113559508A CN113559508A (en) 2021-10-29
CN113559508B (en) 2024-05-28

Family

ID=78168205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110853128.2A Active CN113559508B (en) 2021-07-27 2021-07-27 Method, device, equipment and storage medium for prompting azimuth of virtual object

Country Status (1)

Country Link
CN (1) CN113559508B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108525300A (en) * 2018-04-27 2018-09-14 腾讯科技(深圳)有限公司 Position indication information display methods, device, electronic device and storage medium
CN108619721A (en) * 2018-04-27 2018-10-09 腾讯科技(深圳)有限公司 Range information display methods, device and computer equipment in virtual scene
JP6545345B1 (en) * 2018-10-22 2019-07-17 グリー株式会社 Terminal device, control method and control program
CN112090069A (en) * 2020-09-17 2020-12-18 腾讯科技(深圳)有限公司 Information prompting method and device in virtual scene, electronic equipment and storage medium
CN112121422A (en) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Interface display method, device, equipment and storage medium
CN112546627A (en) * 2020-12-22 2021-03-26 网易(杭州)网络有限公司 Route guiding method, device, storage medium and computer equipment
CN113082714A (en) * 2021-03-01 2021-07-09 上海硬通网络科技有限公司 Game role moving path determining method and device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019193742A (en) * 2018-05-02 2019-11-07 任天堂株式会社 Information processing program, information processor, information processing system, and method of processing information

Also Published As

Publication number Publication date
CN113559508A (en) 2021-10-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant