CN113546419A - Game map display method, device, terminal and storage medium

Game map display method, device, terminal and storage medium

Info

Publication number
CN113546419A
Authority
CN
China
Prior art keywords
game
map
virtual object
target
scene
Prior art date
Legal status
Granted
Application number
CN202110871986.XA
Other languages
Chinese (zh)
Other versions
CN113546419B (en)
Inventor
黄智行
郝亭翰
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110871986.XA priority Critical patent/CN113546419B/en
Publication of CN113546419A publication Critical patent/CN113546419A/en
Application granted granted Critical
Publication of CN113546419B publication Critical patent/CN113546419B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a game map display method, a game map display device, a terminal and a storage medium. The method can display a game interface of a target virtual object, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; in response to a triggering operation on the scene thumbnail, a game map is displayed on the game interface in a preset map orientation, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and in response to a navigation trigger instruction, the target orientation indicator is controlled to always point to a preset direction in the game interface, and the game map is rotated in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene.

Description

Game map display method, device, terminal and storage medium
Technical Field
The application relates to the field of computer technology, and in particular to a game map display method, a game map display device, a terminal and a storage medium.
Background
During a game, a game user needs to view the game map frequently in order to control a virtual object to move through multiple virtual scenes. In the related art, one approach is to display a tactical map on the game interface in a standard orientation (north up, south down), but a game user with a poor sense of direction finds it difficult to identify the correct direction from such a map. The other approach is for the user to select a player-perspective option in the settings and switch, once and for all, to a map display centered on the user; however, this imposes a certain setup threshold, and when the map is large the game user cannot clearly recognize the direction of landmark locations.
Accordingly, there is a need in the art for improvements.
Disclosure of Invention
The embodiment of the application provides a game map display method, a game map display device, a terminal and a storage medium, which can meet a game user's need for quick map guidance while moving across the map.
The embodiment of the application provides a game map display method, which comprises the following steps:
displaying a game interface of a target virtual object, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene;
responding to the triggering operation of the scene thumbnail, displaying a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene;
and responding to a navigation trigger instruction, controlling the target direction indicator to always point to a preset direction in the game interface, and rotating the game map in real time to enable the target direction indicator to always point to the orientation of the target virtual object in the game scene in the game map.
In an optional embodiment, the step of controlling the target direction indicator to always point to a preset direction in the game interface and rotating the game map in real time so that the target direction indicator always points to the orientation of the target virtual object in the game scene in the game map comprises:
acquiring orientation information of the target virtual object;
determining a rotation direction of the game map according to the orientation information of the target virtual object;
rotating the game map according to a rotation direction of the game map such that the target orientation indicator always points in the game map to an orientation of the target virtual object in the game scene.
In an optional embodiment, before the controlling the target direction indicator to always point to a preset direction in the game interface, the method further includes:
determining the angle difference between the current direction pointed by the target direction indicator in the game interface and the preset direction;
rotating the game map based on the angle difference such that the rotation angle of the game map is the same as the angle difference.
In an optional embodiment, before the responding to the navigation trigger instruction, the method further includes:
receiving touch operation for controlling the virtual object to move;
generating the navigation trigger instruction in response to the touch point of the touch operation meeting a preset condition, wherein the preset condition comprises at least one of the following conditions: the touch point moves to a preset area; the duration of the touch point meets a preset duration; and the pressure value of the touch point meets a preset pressure threshold value.
In an optional embodiment, before the responding to the navigation triggering instruction, the method further includes:
acquiring a first moving speed of the target virtual object;
and if the first moving speed reaches a preset speed and the duration of the target virtual object moving at the first moving speed reaches a first preset time, generating the navigation trigger instruction.
In an optional embodiment, after the controlling, in response to the navigation trigger instruction, the target direction indicator to always point to a preset direction in the game interface and rotating the game map in real time so that the target direction indicator always points, in the game map, to the orientation of the target virtual object in the game scene, the method further includes:
acquiring a second moving speed of the target virtual object;
and if the second moving speed is less than the preset speed and the duration of the movement of the target virtual object at the second moving speed reaches a second preset time, displaying the game map in the direction of the preset map.
In an optional embodiment, the method further comprises:
and responding to a navigation trigger instruction, and adjusting the transparency of the game map so that a game picture in a display area of the game map is in a visible state.
In an optional embodiment, the game map includes a map base map and a map interface base plate, and the adjusting the transparency of the game map includes:
presetting a first transparency and a second transparency;
adjusting the transparency of the map base map to the first transparency, and adjusting the transparency of the map interface backplane to the second transparency.
In an alternative embodiment, the game interface includes interface interaction elements, the method further comprising:
and responding to a navigation trigger instruction, zooming the game map, and controlling the interface interaction element to be displayed on the upper layer of the zoomed game map.
In an optional embodiment, the game interface comprises a skill control, and after the game map is displayed on the game interface in the preset map orientation, the method further comprises:
and closing the game map in response to the triggering operation of the skill control.
In an optional embodiment, the displaying a game map in a preset map orientation on the game interface in response to the triggering operation on the scene thumbnail comprises:
in response to a viewing operation on the scene thumbnail, enlarging the scene thumbnail;
and displaying the enlarged scene thumbnail on the game interface in the preset map orientation.
In an optional embodiment, the method further comprises:
displaying a first direction indicator corresponding to a first virtual object in the game map, wherein the first direction indicator is used for indicating the position and the orientation of the first virtual object in the game scene, and the first virtual object and the target virtual object are in the same game;
in response to the selection operation of the first virtual object, displaying navigation information on the game interface, wherein the navigation information is used for indicating route information of the target virtual object to the first virtual object.
An embodiment of the present application further provides a game map display device, including:
the first display unit is used for displaying a game interface of a target virtual object, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene;
the second display unit is used for responding to the trigger operation of the scene thumbnail and displaying a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene;
and the rotating unit is used for responding to a navigation trigger instruction, controlling the target direction indicator to always point to a preset direction in the game interface, and rotating the game map in real time to enable the target direction indicator to always point to the orientation of the target virtual object in the game scene in the game map.
The embodiment of the application also provides a terminal, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the steps of the game map display method are realized when the processor executes the computer program.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, wherein the computer program realizes the steps of the game map display method when being executed by a processor.
The embodiment of the application provides a game map display method, a game map display device, a terminal and a storage medium. The method can display a game interface of a target virtual object, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; in response to a triggering operation on the scene thumbnail, a game map is displayed on the game interface in a preset map orientation, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and in response to a navigation trigger instruction, the target orientation indicator is controlled to always point to a preset direction in the game interface, and the game map is rotated in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene. In this way, the state of the game map can be changed to meet the game user's need to view the map; after the navigation state is entered, the game map is rotated in real time according to the orientation of the virtual object controlled by the game user in the game scene, which meets the need for quick map guidance and provides a better navigation experience without adding any entity control to the user interface.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a system diagram of a game map display device according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a method for displaying a game map according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of a game interface provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a game map displayed on a game interface according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a game map provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a rotated game map provided by an embodiment of the present application;
FIG. 7 is a schematic view of a rotated game map displayed in a game interface according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a game map display device according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a game map display method, a game map display device, a terminal and a storage medium. Specifically, the present embodiment provides a game map display method suitable for a game map display apparatus that can be integrated in a computer device.
The computer device may be a terminal device, such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and the terminal may further include a client, which may be a game application client, a browser client carrying a game program, or an instant messaging client. The computer device may also be a device such as a server; the server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN, and big data and artificial intelligence platforms, but is not limited thereto.
For example, when the game map display method is applied to a terminal, the terminal is deployed with a target game application, and a game interface is provided when the terminal runs the target game application. The manner in which the terminal device provides the game interface to the user may include a variety of ways, for example, the game interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a game interface and receiving operation instructions generated by a user acting on the game interface, and a processor for executing the game, generating the game interface, responding to the operation instructions, and controlling display of the game interface on the touch display screen.
For example, when the game map display method is executed on a server, it may be applied to a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the running mode of a cloud game, the body that runs the game application program is separated from the body that presents the game picture, and the storage and running of the game map display method are completed on the cloud game server. The game picture is presented at a cloud game client, which is mainly used for receiving and sending game data and presenting the game picture; for example, the cloud game client may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palm computer, a personal digital assistant, and the like, but the terminal device that performs game data processing is the cloud game server at the cloud end. When a game is played, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game picture.
Referring to fig. 1, fig. 1 is a system schematic diagram of a game map display device according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play the multiplayer game online.
The embodiment of the application provides a game map display method, which can be executed by a terminal or a server. In the embodiment of the present application, the game map display method is described as an example executed by a terminal, where the terminal is deployed with a target game application and provides a game interface when the terminal runs the target game application. The game map display method comprises: displaying a game interface of a target virtual object, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; responding to the triggering operation of the scene thumbnail, displaying a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and responding to a navigation trigger instruction, controlling the target direction indicator to always point to a preset direction in the game interface, and rotating the game map in real time to enable the target direction indicator to always point, in the game map, to the orientation of the target virtual object in the game scene.
The application does not limit the type of the target game application, as long as the virtual object can be moved in the game. The target virtual object is controlled by the user of the terminal; if one or more other virtual objects exist in the game, these other virtual objects may be controlled by other game users in the current game, or by a computer, such as a robot using an Artificial Intelligence (AI) algorithm, so as to realize a man-machine battle mode.
The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
An embodiment of the present application provides a game map display method, and as shown in fig. 2, a flow of the game map display method according to the embodiment of the present application may be as follows:
201. displaying a game interface of a target virtual object, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene.
In the embodiments of the present application, in some games the virtual object needs to be moved in the virtual game scene to complete game tasks. For example, in some battle games, the terminal may provide a tactical map, and by viewing the tactical map the game user may control the corresponding virtual object to move to a selected location along a route in the tactical map. After a target game user enters a target game, the terminal can display the game interface where the target virtual object is located, wherein the game interface comprises a game picture and a scene thumbnail, and the game picture comprises at least part of the virtual game scene and at least part of the virtual objects located in the virtual game scene. For example, in some multiplayer games, the virtual game scene of the same game contains a target virtual object and a first virtual object, where the first virtual object may be a teammate or an enemy of the target virtual object.
It should be noted that, in a first-person perspective, the target virtual object is not displayed in the game interface, while in a third-person perspective it is; whether the target virtual object controlled by the game user is displayed in the game interface can also be set according to the actual situation, which is not limited in the embodiments of the present application. During the game, because the target virtual object and the first virtual object are in motion, their positions in the game scene and the relative distance between them change constantly, and the first virtual objects within the visual field of the target virtual object may also change; that is, not all first virtual objects are located within the visual field of the target virtual object (some first virtual objects are located outside it).
Referring to fig. 3, fig. 3 is a schematic view of a game interface according to an embodiment of the present application. As shown in fig. 3, in a displayed game interface 301, a game screen 302, a scene thumbnail 303, a movement rocker 304, a skill control 305, and a target virtual object 306 are included. Through the triggering operation of the game user on the scene thumbnail 303, the terminal responds to the triggering operation on the scene thumbnail 303, and can open the game map. Optionally, the scene thumbnail shown in fig. 3 may also be a map control, and the game map is opened by viewing the map control.
202. Responding to the triggering operation of the scene thumbnail, displaying a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene.
In the embodiments of the present application, the terminal responds to the trigger operation on the scene thumbnail and displays the game map on the game interface. The trigger operation may be a sliding operation or a clicking operation, and specifically may be a long press, a drag, a slide, a click, a double click, or the like. For example, after a game user clicks the scene thumbnail, the game map is displayed on the game interface in a preset map orientation, and the game map may be displayed in any area of the game interface, which is not limited in the present application. In some games, the game map is typically displayed in the game interface in the traditional north-up, south-down orientation. The preset map orientation of the application may likewise be a north-up, south-down orientation, in which the direction of each virtual location in the game map is consistent with its direction in the virtual game scene; for example, if a landmark location is set at the northernmost position of the virtual game scene and that location appears at the top of the displayed game map, the game map can be considered to be displayed on the game interface in a north-up orientation.
The game map includes a direction indicator corresponding to the target virtual object, referred to as the target direction indicator, which can indicate the position and the orientation of the target virtual object in the virtual game scene. By viewing the game map, a game user can obtain the position and the orientation of the controlled virtual object in the game scene from the position and the direction of the corresponding direction indicator on the game map, and can form a clear sense of direction with respect to landmark locations in the game scene.
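As an illustrative sketch only (not part of the claimed method), the following shows how an indicator's on-map position and rotation could be derived from a virtual object's scene position and heading when the map is in the north-up preset orientation; the coordinate conventions and names (scene_to_map, yaw, square bounds) are assumptions.
```python
def scene_to_map(scene_x, scene_y, scene_size, map_size):
    """Map a scene-space position onto a north-up game map (assumed square bounds)."""
    u = scene_x / scene_size * map_size
    v = scene_y / scene_size * map_size
    return u, v

def indicator_state(obj_pos, obj_yaw_deg, scene_size, map_size):
    """Return the map position and rotation of an orientation indicator.

    obj_yaw_deg is the object's heading in the scene, measured clockwise from north.
    In the preset north-up orientation the indicator is simply rotated by the yaw.
    """
    x, y = scene_to_map(obj_pos[0], obj_pos[1], scene_size, map_size)
    return {"map_pos": (x, y), "indicator_rotation_deg": obj_yaw_deg % 360.0}

# Example: an object at the scene centre facing east shows an indicator rotated 90 degrees.
print(indicator_state((500.0, 500.0), 90.0, scene_size=1000.0, map_size=256.0))
```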
Referring to fig. 4, fig. 4 is a schematic view illustrating a game map displayed on a game interface according to an embodiment of the present application. As shown in fig. 4, in the game interface 401, a game screen 402, a game map 403, a skill control 404, a map closing control 405, and an object orientation indicator 406 are included. The game map 403 is displayed on the game interface 401 in a north-up, south-down map orientation. The area of the game map displayed on the game interface shields part of the game picture, and the display transparency of the game map can be adjusted.
In the embodiments of the present application, in response to the trigger operation on the scene thumbnail, the specific steps of displaying the game map in the preset map orientation on the game interface are as follows: in response to a viewing operation on the scene thumbnail, the scene thumbnail is enlarged, and the enlarged scene thumbnail is displayed on the game interface in the preset map orientation. Optionally, if the scene thumbnail is a map control, the map control may include a minimap display control, where the minimap display control displays at least part of the game map of the virtual game scene, and at least part of the game map is displayed on the game interface in the preset map orientation in response to a viewing operation on the minimap display control. The viewing operation may be a sliding operation or a clicking operation, and specifically may be a long press, a drag, a slide, a click, a double click, or the like. Taking a north-up, south-down preset map orientation as an example, after the game user clicks the minimap control, at least part of the game map displayed by the minimap control can be enlarged, and the enlarged part is displayed on the game interface in the north-up, south-down orientation.
In the embodiments of the present application, after the game map is displayed, the terminal responds to a triggering operation of the game user on the map closing control and directly closes the game map. During the game, the terminal can also respond to a triggering operation of the game user on the skill control to close the game map. The skill control can be an attack skill control, such as a fire button; after the game user clicks the fire button, the game map is closed and the target virtual object is controlled to fire. As shown in fig. 4, in the state where the game map 403 is displayed, the game user can close the game map 403 by triggering the map closing control 405; alternatively, if the game user triggers the skill control 404, the game map 403 is also directly closed.
203. And responding to a navigation trigger instruction, controlling the target direction indicator to always point to a preset direction in the game interface, and rotating the game map in real time to enable the target direction indicator to always point to the orientation of the target virtual object in the game scene in the game map.
In the embodiments of the present application, after the game map is displayed on the game interface, the state of the current target virtual object is detected; if it is determined that the target virtual object needs navigation and a navigation trigger instruction is responded to, the game map is rotated on the game interface so that the target direction indicator corresponding to the target virtual object always points to a preset direction in the game interface. At this point the game map can be considered to have entered the navigation state, and the game map is rotated in real time so that the target direction indicator always points, in the game map, to the orientation of the target virtual object in the game scene.
In the embodiments of the present application, the preset direction may be the direction in which north points when the game map is displayed on the game interface in the preset map orientation. Taking a north-up, south-down preset map orientation as an example, in the rotated game map the direction pointed to by the target direction indicator on the game interface is upward, i.e. the direction in which north pointed before the rotation.
Referring to fig. 5, fig. 5 is a schematic view of a game map according to an embodiment of the present application. As shown in fig. 5, the game map 501 includes an object orientation indicator 502, a first orientation indicator 503, a map base map 504, and a map interface base plate 505. The target orientation indicator indicates a position and an orientation of the target virtual object in the virtual game scene, and the first orientation indicator indicates a position and an orientation of the first virtual object in the virtual game scene. In some multiplayer battle games, a virtual object in the same game as the target virtual object is set as the first virtual object. The first virtual object may be a teammate or an enemy of the target virtual object in the game, and generally, a position indicator corresponding to the teammate is displayed on a game map, and the position of the teammate can be known by looking at the map. The arrangement mode of the orientation indicator is not limited in the application.
In an embodiment of the present application, the step of controlling the target direction indicator to always point to a preset direction in the game interface and rotating the game map in real time so that the target direction indicator always points, in the game map, to the orientation of the target virtual object in the game scene specifically includes: obtaining orientation information of the target virtual object, determining a rotation direction of the game map according to the orientation information, and rotating the game map in that rotation direction so that the target direction indicator always points, in the game map, to the orientation of the target virtual object in the game scene. The orientation of the virtual object is adjusted in real time through a touch operation acting on an adjustment area, and the rotation direction of the game map is determined according to the direction in which the orientation changes, so that the game map is rotated accordingly. The adjustment area may be a moving rocker area, and the orientation of the virtual object is adjusted by controlling the moving rocker.
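As a hedged illustration of this step, the sketch below rotates the map opposite to the change in the object's yaw so that the indicator keeps pointing at the preset (upward) direction; the per-update structure and the names used are assumptions, not the patent's implementation.
```python
def update_navigation_map(map_rotation_deg, prev_yaw_deg, curr_yaw_deg):
    """Rotate the map by the opposite of the object's yaw change.

    If the object turns right (yaw increases), the map turns left by the same
    amount, so the indicator stays pointing up while still reflecting the
    object's orientation in the scene.
    """
    delta = curr_yaw_deg - prev_yaw_deg
    # Wrap the change into (-180, 180] so the map takes the shorter rotation.
    delta = (delta + 180.0) % 360.0 - 180.0
    return (map_rotation_deg - delta) % 360.0

# The object turns 30 degrees to the right; the map rotates 30 degrees to the left.
print(update_navigation_map(0.0, 90.0, 120.0))  # -> 330.0
```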
In the embodiments of the present application, when it is determined that the target virtual object needs navigation, before the game map is rotated so that the target direction indicator always points to the preset direction in the game interface, the angle difference between the direction currently pointed to by the target direction indicator in the game interface and the preset direction needs to be determined; the game map is then rotated based on the angle difference, so that the rotation angle of the game map is the same as the angle difference. Referring to fig. 6, fig. 6 is a schematic view of a rotated game map according to an embodiment of the present application. As shown in fig. 6, the game map 501 includes an object orientation indicator 502, a first orientation indicator 503, a map base map 504, and a map interface base plate 505. The target direction indicator 502 in the game map 501 always points upward in the game interface, that is, the preset direction points upward in the game interface, and the rotation angle 601 of the game map is the same as the angle difference 506 between the preset direction and the direction pointed to by the target direction indicator 502 in fig. 5.
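A small sketch of the initial alignment described above, assuming directions are expressed in degrees measured clockwise from the preset (upward) direction; these names and conventions are illustrative only.
```python
def initial_alignment_angle(indicator_dir_deg, preset_dir_deg=0.0):
    """Signed angle difference between the indicator's current direction and the preset direction.

    Rotating the map by this amount makes the indicator point at the preset
    direction, matching the relationship between rotation angle 601 and angle
    difference 506 in Figs. 5 and 6.
    """
    return (preset_dir_deg - indicator_dir_deg + 180.0) % 360.0 - 180.0

print(initial_alignment_angle(45.0))    # map must rotate -45 degrees
print(initial_alignment_angle(-150.0))  # map must rotate 150 degrees
```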
In the embodiments of the present application, the game map is rotated in response to the navigation trigger instruction, and the game map enters the navigation state once the direction pointed to by the target direction indicator in the game interface is the preset direction. From then on, as the target user controls the target virtual object to move in the virtual game scene and the orientation of the target virtual object changes, the target direction indicator keeps pointing to the preset direction in the game interface, and its orientation in the virtual game scene is represented by rotating the map. Alternatively, the target direction indicator may be kept at the middle position of the game map by moving the game map. For example, if the target virtual object is controlled to turn right by sliding the moving rocker, the target direction indicator would also need to turn right in the game map; but because the direction pointed to by the target direction indicator on the game interface does not change, i.e. it always points upward, only the game map can be rotated so that, in the game map, the target direction indicator still points to the orientation of the target virtual object in the game scene, and the rotation direction of the game map is to the left. In the embodiments of the present application, after the navigation trigger instruction is responded to, the game map switches directly to the perspective of the virtual object, so there is no need to add a user-interface entity control for resetting the perspective, which is more convenient.
In the embodiments of the present application, before responding to the navigation trigger instruction, the method further includes: receiving a touch operation for controlling the virtual object to move; and generating the navigation trigger instruction in response to the touch point of the touch operation meeting a preset condition, wherein the preset condition comprises at least one of the following: the touch point moves to a preset area; the duration of the touch point meets a preset duration; the pressure value of the touch point meets a preset pressure threshold value. For example, the target user controls the target virtual object to move through the moving rocker, and when the touch point on the moving rocker is dragged to a specific area, or the duration of touching the moving rocker reaches a time T, or the terminal detects that the pressure value of the touch point meets a pressure threshold P, the navigation trigger instruction is generated.
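The following is a minimal sketch of how the three touch preconditions could be checked; the TouchPoint fields, the threshold values, and the idea of evaluating them on each input update are assumptions for illustration.
```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float
    y: float
    duration_s: float   # how long the touch has been held
    pressure: float     # normalized pressure reported by the touch screen

def should_trigger_navigation(tp, preset_area, preset_duration_s, pressure_threshold):
    """Generate the navigation trigger if any one preset condition is met."""
    in_area = (preset_area[0] <= tp.x <= preset_area[2] and
               preset_area[1] <= tp.y <= preset_area[3])
    return (in_area
            or tp.duration_s >= preset_duration_s
            or tp.pressure >= pressure_threshold)

touch = TouchPoint(x=120.0, y=640.0, duration_s=2.5, pressure=0.4)
print(should_trigger_navigation(touch, preset_area=(100, 600, 200, 700),
                                preset_duration_s=3.0, pressure_threshold=0.8))
```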
In the embodiments of the present application, the target virtual object may be determined to need navigation when its moving speed satisfies a certain condition. The specific steps are as follows: obtaining the moving speed of the target virtual object, denoted as the first moving speed; if the first moving speed reaches a preset speed and the duration for which the target virtual object moves at the first moving speed reaches a first preset time, determining that the target virtual object needs navigation; and when it is determined that the target virtual object needs navigation, generating the navigation trigger instruction, in response to which the terminal rotates the game map so that the direction pointed to by the target direction indicator in the game interface is the preset direction. For example, assuming the preset speed is V0, when the continuous moving speed of the target virtual object reaches V0 and the continuous moving time reaches T0, the game map is rotated. The preset speed and the first preset time can be set in a user-defined manner.
In the embodiments of the present application, the moving speed of the target virtual object is obtained again and denoted as the second moving speed; if the second moving speed is less than the preset speed and the duration for which the target virtual object moves at the second moving speed reaches a second preset time, the game map is displayed in the preset map orientation again. For example, when the moving speed of the target virtual object is detected to be less than V0 and the continuous moving time reaches T1, the game map is rotated back to the preset map orientation. Optionally, the game map may also be displayed in the preset map orientation again when the target virtual object stops moving.
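Combining the two speed conditions above, here is a hedged sketch of a small state tracker that enters the navigation state after sustained fast movement and leaves it after sustained slow movement; V0, T0 and T1 follow the examples in the text, and the update loop is an assumption.
```python
class NavigationSpeedTracker:
    """Enter navigation after moving at >= V0 for T0 seconds; leave after < V0 for T1 seconds."""

    def __init__(self, v0, t0, t1):
        self.v0, self.t0, self.t1 = v0, t0, t1
        self.fast_time = 0.0
        self.slow_time = 0.0
        self.navigating = False

    def update(self, speed, dt):
        if speed >= self.v0:
            self.fast_time += dt
            self.slow_time = 0.0
            if not self.navigating and self.fast_time >= self.t0:
                self.navigating = True   # generate the navigation trigger instruction
        else:
            self.slow_time += dt
            self.fast_time = 0.0
            if self.navigating and self.slow_time >= self.t1:
                self.navigating = False  # restore the preset map orientation
        return self.navigating

tracker = NavigationSpeedTracker(v0=5.0, t0=2.0, t1=1.5)
for _ in range(30):                      # 3 seconds of movement above V0
    tracker.update(speed=6.0, dt=0.1)
print(tracker.navigating)  # True after sustained fast movement
```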
In the embodiments of the present application, when it is determined that the target virtual object needs navigation, the transparency of the game map is adjusted in response to the navigation trigger instruction, so that the game picture within the display area of the game map is in a visible state. The specific steps of adjusting the transparency of the game map include: presetting a first transparency and a second transparency; adjusting the transparency of the map base map in the game map to the first transparency, and adjusting the transparency of the map interface base plate to the second transparency. For example, the map interface base plate can be made fully transparent, that is, its transparency is adjusted to the highest value, while the map base map is made semi-transparent, that is, the transparency of the map base map is lower than that of the base plate.
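A minimal sketch of the two-layer transparency adjustment; the layer dictionary and the transparency convention (0.0 opaque, 1.0 fully transparent) are assumptions rather than a particular engine's API.
```python
def apply_navigation_transparency(map_layers, first_transparency=0.5, second_transparency=1.0):
    """Set the map base map and the map interface base plate to preset transparencies.

    With the base plate fully transparent and the base map semi-transparent, the
    game picture underneath the map's display area stays visible.
    """
    map_layers["map_base_map"]["transparency"] = first_transparency
    map_layers["map_interface_base_plate"]["transparency"] = second_transparency
    return map_layers

layers = {"map_base_map": {"transparency": 0.0},
          "map_interface_base_plate": {"transparency": 0.0}}
print(apply_navigation_transparency(layers))
```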
In the embodiments of the present application, the game interface comprises interface interaction elements; when the game map is rotated in response to the navigation trigger instruction, the game map is zoomed, and the interface interaction elements are controlled to be displayed on the layer above the zoomed game map. The zoom percentage of the game base map can be set by the user and is not limited in the present application. Referring to fig. 7, fig. 7 is a schematic view illustrating a rotated game map displayed in a game interface according to an embodiment of the present application. As shown in fig. 7, the game interface 701 includes a game screen 702 and a game map 703; the interface interaction elements may include a skill control 704 and a moving rocker 705, and the skill control 704 and the moving rocker 705 are displayed on the layer above the game map. Because the display area of the game map is see-through, the game map does not occlude the game scene; with interface interaction elements such as the skill control and the moving rocker placed on the layer above the game map, the game user can more conveniently operate the target virtual object to move or trigger skills in the virtual game scene.
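As an illustration of the zoom-and-layering step, the sketch below scales the map and raises the interface interaction elements above it by draw order; the scene-graph fields (scale, z_order) are assumed for the example.
```python
def enter_navigation_layout(game_map, interaction_elements, zoom=0.6):
    """Zoom the game map and keep interaction elements (skill control, moving rocker)
    drawn on the layer above it."""
    game_map["scale"] = zoom
    top_of_map = game_map["z_order"] + 1
    for element in interaction_elements:
        element["z_order"] = max(element["z_order"], top_of_map)
    return game_map, interaction_elements

game_map = {"scale": 1.0, "z_order": 10}
elements = [{"name": "skill_control", "z_order": 5},
            {"name": "moving_rocker", "z_order": 5}]
print(enter_navigation_layout(game_map, elements))
```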
In the embodiments of the present application, a first orientation indicator corresponding to a first virtual object is displayed in the game map, and the first orientation indicator is used for indicating the position and the orientation of the first virtual object in the virtual game scene, wherein the first virtual object and the target virtual object are in the same game; in response to a selection operation on the first virtual object, navigation information is displayed on the game interface, and the navigation information is used for indicating route information from the target virtual object to the first virtual object. For example, a route from the target virtual object to the first virtual object may be marked in the game map, and the game user controls the target virtual object to move along this route to reach the location of the first virtual object. Alternatively, the route may be marked in the virtual game scene where the target virtual object is located, and the game user directly controls the target virtual object to move along the route.
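To ground the route display step, here is a hedged sketch that produces a straight-line route from the target virtual object to a selected first virtual object; a real game would typically substitute a pathfinding query, which the text does not specify, so the straight-line interpolation is purely illustrative.
```python
def route_to_teammate(target_pos, teammate_pos, steps=8):
    """Return a simple list of waypoints from the target virtual object to the
    selected first virtual object (straight line; stands in for real pathfinding)."""
    waypoints = []
    for i in range(1, steps + 1):
        t = i / steps
        waypoints.append((target_pos[0] + (teammate_pos[0] - target_pos[0]) * t,
                          target_pos[1] + (teammate_pos[1] - target_pos[1]) * t))
    return waypoints

# Navigation information shown on the game interface: waypoints toward the teammate.
print(route_to_teammate((100.0, 100.0), (400.0, 260.0)))
```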
In the game map display method provided by the embodiment of the application, a game interface of a target virtual object can be displayed, wherein the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual objects located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; in response to a triggering operation on the scene thumbnail, a game map is displayed on the game interface in a preset map orientation, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and in response to a navigation trigger instruction, the target orientation indicator is controlled to always point to a preset direction in the game interface, and the game map is rotated in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene. Therefore, this scheme can meet a game user's need for quick map guidance while moving across the map, and provides a better navigation experience without adding any entity control to the user interface.
In order to better implement the method, correspondingly, the embodiment of the application also provides a game map display device, and the game map display device can be specifically integrated in a computer device, for example, in the form of a terminal.
Referring to fig. 8, the game map display apparatus includes a first display unit 801, a second display unit 802, and a rotation unit 803 as follows:
the first display unit 801 is configured to display a game interface of a target virtual object, where the game interface includes a game screen and a scene thumbnail, the game screen includes at least a part of a game scene and at least a part of a virtual object located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene;
a second display unit 802, configured to display, in response to a trigger operation on the scene thumbnail, a game map in a preset map orientation on the game interface, where the game map includes a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used to indicate a position and an orientation of the target virtual object in the game scene;
a rotating unit 803, configured to, in response to a navigation trigger instruction, control the target direction indicator to always point to a preset direction in the game interface, and rotate the game map in real time so that the target direction indicator always points to the orientation of the target virtual object in the game scene in the game map.
In an optional embodiment, the rotation unit 803 further includes:
acquiring orientation information of the target virtual object;
determining a rotation direction of the game map according to the orientation information of the target virtual object;
rotating the game map according to a rotation direction of the game map such that the target orientation indicator always points in the game map to an orientation of the target virtual object in the game scene.
In an optional embodiment, the rotation unit 803 further includes:
determining the angle difference between the current direction pointed by the target direction indicator in the game interface and the preset direction;
rotating the game map based on the angle difference such that the rotation angle of the game map is the same as the angle difference.
In an optional embodiment, the rotation unit 803 further includes:
receiving touch operation for controlling the virtual object to move;
and generating a navigation trigger instruction in response to the touch point of the touch operation meeting a preset condition, wherein the preset condition comprises at least one of the following conditions: the touch point moves to a preset area; the duration of the touch point meets the preset duration; and the pressure value of the touch point meets a preset pressure threshold value.
In an optional embodiment, the rotation unit 803 further includes:
acquiring a first moving speed of the target virtual object;
and if the first moving speed reaches a preset speed and the duration of the target virtual object moving at the first moving speed reaches a first preset time, generating a navigation trigger instruction.
In an optional embodiment, the rotation unit 803 further includes:
acquiring a second moving speed of the target virtual object;
and if the second moving speed is less than the preset speed and the duration of the movement of the target virtual object at the second moving speed reaches a second preset time, displaying the game map in the direction of the preset map.
In an optional embodiment, the rotation unit 803 further includes:
and responding to a navigation trigger instruction, and adjusting the transparency of the game map so that a game picture in a display area of the game map is in a visible state.
In an optional embodiment, the game map includes a map base map and a map interface base plate, and the rotation unit 803 includes:
presetting a first transparency and a second transparency;
adjusting the transparency of the map base map to the first transparency, and adjusting the transparency of the map interface backplane to the second transparency.
In an alternative embodiment, the game interface includes an interface interaction element, and the rotation unit 803 further includes:
and responding to a navigation trigger instruction, zooming the game map, and controlling the interface interaction element to be displayed on the upper layer of the zoomed game map.
In an alternative embodiment, the game interface includes a skill control, and the second display unit 802 further includes:
and closing the game map in response to the triggering operation of the skill control.
In an optional embodiment, the second display unit 802 further includes:
in response to a viewing operation on the scene thumbnail, enlarging the scene thumbnail;
and displaying the enlarged scene thumbnail on the game interface in the preset map orientation.
In an optional embodiment, the second display unit 802 further includes:
displaying a first direction indicator corresponding to a first virtual object in the game map, wherein the first direction indicator is used for indicating the position and the orientation of the first virtual object in the game scene, and the first virtual object and the target virtual object are in the same game;
in response to the selection operation of the first virtual object, displaying navigation information on the game interface, wherein the navigation information is used for indicating route information of the target virtual object to the first virtual object.
Correspondingly, the embodiment of the present application further provides a terminal, where the terminal may be a computer device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game console, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like. As shown in fig. 9, fig. 9 is a schematic structural diagram of a terminal 900 according to an embodiment of the present application. The terminal 900 includes a processor 901 with one or more processing cores, a memory 902 with one or more computer-readable storage media, and a computer program stored in the memory 902 and executable on the processor. The processor 901 is electrically connected to the memory 902. Those skilled in the art will appreciate that the terminal 900 configuration shown in the figure is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The processor 901 is a control center of the terminal 900, connects various parts of the entire terminal 900 by various interfaces and lines, performs various functions of the terminal 900 and processes data by running or loading software programs and/or modules stored in the memory 902 and calling data stored in the memory 902, thereby monitoring the entire terminal 900.
In this embodiment of the present application, the processor 901 in the terminal 900 loads instructions corresponding to processes of one or more application programs into the memory 902 according to the following steps, and the processor 901 runs the application programs stored in the memory 902, thereby implementing various functions:
the game interface is used for displaying a target virtual object and comprises a game picture and a scene thumbnail, wherein the game picture comprises at least part of game scene and at least part of virtual object positioned in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; responding to the triggering operation of the scene thumbnail, displaying a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and responding to a navigation trigger instruction, controlling the target direction indicator to always point to a preset direction in the game interface, and rotating the game map in real time to enable the target direction indicator to always point to the orientation of the target virtual object in the game scene in the game map.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Optionally, as shown in fig. 9, the terminal 900 further includes: a touch display screen 903, a radio frequency circuit 904, an audio circuit 905, an input unit 906, and a power supply 907. The processor 901 is electrically connected to the touch display screen 903, the radio frequency circuit 904, the audio circuit 905, the input unit 906, and the power supply 907. Those skilled in the art will appreciate that the terminal structure shown in fig. 9 does not constitute a limitation of the terminal, and the terminal may include more or fewer components than those shown, combine some components, or arrange the components differently.
The touch display screen 903 may be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 903 may include a display panel and a touch panel. The display panel may be used to display information input by or provided to the user and the various graphical user interfaces of the terminal, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, which in turn trigger the corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 901, and it can also receive and execute commands sent by the processor 901. The touch panel may cover the display panel; when the touch panel detects a touch operation on or near it, the touch panel transmits the operation to the processor 901 to determine the type of the touch event, and the processor 901 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 903 to realize the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. In that case, the touch display screen 903 may also serve as a part of the input unit 906 to implement an input function.
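Purely as an illustration of the touch-panel flow described above (a contact is detected, converted into touch point coordinates, and handed to the processor-side handler that determines the event type), the following Python sketch models that pipeline; the class and method names are invented for this example and do not describe the actual hardware of the terminal 900.

```python
from typing import Callable, Tuple

TouchPoint = Tuple[float, float]  # normalized (x, y) touch point coordinates


class TouchPanel:
    """Illustrative stand-in for the touch detection device plus touch controller."""

    def __init__(self, on_touch: Callable[[TouchPoint], None]) -> None:
        # on_touch plays the role of the processor-side handler that determines
        # the touch event type and drives the visual output on the display panel.
        self._on_touch = on_touch

    def raw_signal(self, raw_x: int, raw_y: int, width: int, height: int) -> None:
        # Convert the raw signal produced by a contact into normalized touch
        # point coordinates before forwarding them to the handler.
        self._on_touch((raw_x / width, raw_y / height))


if __name__ == "__main__":
    TouchPanel(lambda p: print("touch at", p)).raw_signal(540, 960, 1080, 1920)
```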
The radio frequency circuit 904 may be configured to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another terminal and to exchange signals with the network device or the other terminal.
The audio circuit 905 may be used to provide an audio interface between the user and the terminal through a speaker and a microphone. The audio circuit 905 can convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 905 and converted into audio data; the audio data is then output to the processor 901 for processing and may be transmitted to another terminal via the radio frequency circuit 904, or output to the memory 902 for further processing. The audio circuit 905 may also include an earphone jack to allow a peripheral headset to communicate with the terminal.
The input unit 906 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 907 is used to supply power to the various components of the terminal 900. Optionally, the power supply 907 may be logically connected to the processor 901 through a power management system, so that functions such as charging, discharging, and power consumption management are handled by the power management system. The power supply 907 may also include one or more direct-current or alternating-current power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other such components.
Although not shown in fig. 9, the terminal 900 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the terminal provided in this embodiment may display a game interface of a target virtual object, where the game interface includes a game screen and a scene thumbnail, the game screen includes at least a part of a game scene and at least a part of a virtual object located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; in response to a trigger operation on the scene thumbnail, display a game map in a preset map orientation on the game interface, where the game map includes a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and in response to a navigation trigger instruction, control the target orientation indicator to always point to a preset direction in the game interface, and rotate the game map in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene. This solution meets a player's need for quick map guidance while the virtual character is running across the map, and provides a better navigation experience without adding extra controls to the user interface.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium in which a plurality of computer programs are stored. The computer programs can be loaded by a processor to perform the steps of any of the game map display methods provided by the embodiments of the present application. For example, a computer program may perform the following steps:
displaying a game interface of a target virtual object, wherein the game interface includes a game screen and a scene thumbnail, the game screen includes at least a part of a game scene and at least a part of a virtual object located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene; in response to a trigger operation on the scene thumbnail, displaying a game map in a preset map orientation on the game interface, wherein the game map includes a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene; and in response to a navigation trigger instruction, controlling the target orientation indicator to always point to a preset direction in the game interface, and rotating the game map in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
The game map display method, apparatus, terminal, and storage medium provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, those skilled in the art may, according to the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (15)

1. A game map display method, comprising:
displaying a game interface of a target virtual object, wherein the game interface comprises a game screen and a scene thumbnail, the game screen comprises at least a part of a game scene and at least a part of a virtual object located in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene;
in response to a trigger operation on the scene thumbnail, displaying a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene;
and in response to a navigation trigger instruction, controlling the target orientation indicator to always point to a preset direction in the game interface, and rotating the game map in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene.
2. The game map display method according to claim 1, wherein the step of controlling the target orientation indicator to always point to a preset direction in the game interface, and rotating the game map in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene, comprises:
acquiring orientation information of the target virtual object;
determining a rotation direction of the game map according to the orientation information of the target virtual object;
rotating the game map according to the rotation direction of the game map, so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene.
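As a non-limiting illustration of determining the rotation direction recited in claim 2, the Python sketch below chooses the shorter angular path between the map's current rotation and the rotation implied by the target virtual object's orientation information; the function name and the degree convention are assumptions for this example only.

```python
def rotation_step(current_map_deg: float, desired_map_deg: float) -> float:
    """Return a signed per-frame rotation (in degrees): the sign encodes the
    rotation direction, chosen as the shorter path toward the rotation implied
    by the target virtual object's orientation information."""
    # Wrap the difference into (-180, 180] so the sign gives the direction.
    return (desired_map_deg - current_map_deg + 180.0) % 360.0 - 180.0


if __name__ == "__main__":
    print(rotation_step(350.0, 10.0))  # 20.0  -> rotate forward
    print(rotation_step(10.0, 350.0))  # -20.0 -> rotate backward
```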
3. The game map display method according to claim 1, further comprising, before the controlling the target orientation indicator to always point to a preset direction in the game interface:
determining an angle difference between the direction to which the target orientation indicator currently points in the game interface and the preset direction;
rotating the game map based on the angle difference, so that the rotation angle of the game map is the same as the angle difference.
4. The game map display method according to claim 1, further comprising, before the responding to the navigation trigger instruction:
receiving a touch operation for controlling the virtual object to move;
generating the navigation trigger instruction in response to a touch point of the touch operation meeting a preset condition, wherein the preset condition comprises at least one of the following: the touch point moves to a preset area; a duration of the touch point reaches a preset duration; and a pressure value of the touch point reaches a preset pressure threshold.
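The check recited in claim 4 can be pictured with the short Python sketch below; the threshold values and field names are invented for illustration, and the claim itself only requires "preset" values.

```python
from dataclasses import dataclass


@dataclass
class TouchState:
    in_preset_area: bool  # the touch point has moved into the preset area
    duration_s: float     # how long the touch point has lasted, in seconds
    pressure: float       # reported pressure value of the touch point


# Illustrative thresholds only; the claim just calls for preset values.
PRESET_DURATION_S = 1.0
PRESET_PRESSURE = 0.8


def should_generate_navigation_trigger(t: TouchState) -> bool:
    """Generate the navigation trigger instruction when at least one of the
    preset conditions is met."""
    return (
        t.in_preset_area
        or t.duration_s >= PRESET_DURATION_S
        or t.pressure >= PRESET_PRESSURE
    )


if __name__ == "__main__":
    print(should_generate_navigation_trigger(TouchState(False, 1.2, 0.1)))  # True
```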
5. The game map display method according to claim 1, further comprising, before the responding to the navigation trigger instruction:
acquiring a first moving speed of the target virtual object;
and if the first moving speed reaches a preset speed and the duration for which the target virtual object moves at the first moving speed reaches a first preset time, generating the navigation trigger instruction.
6. The game map display method according to claim 5, wherein after the controlling, in response to the navigation trigger instruction, the target orientation indicator to always point to a preset direction in the game interface and rotating the game map in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene, the method further comprises:
acquiring a second moving speed of the target virtual object;
and if the second moving speed is less than the preset speed and the duration for which the target virtual object moves at the second moving speed reaches a second preset time, displaying the game map in the preset map orientation.
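One way to picture the enter/exit behavior of claims 5 and 6 is the Python sketch below; the numeric thresholds and parameter names are assumptions for illustration, not values given by the application.

```python
def navigation_active(speed: float, duration_s: float, navigating: bool,
                      preset_speed: float = 5.0,
                      first_preset_s: float = 2.0,
                      second_preset_s: float = 1.0) -> bool:
    """Return whether the navigation display should be active.

    Enter: the moving speed reaches the preset speed and is held for the
    first preset time. Exit: the speed drops below the preset speed and is
    held for the second preset time, after which the game map is shown in
    the preset map orientation again.
    """
    if not navigating:
        return speed >= preset_speed and duration_s >= first_preset_s
    return not (speed < preset_speed and duration_s >= second_preset_s)


if __name__ == "__main__":
    print(navigation_active(6.0, 2.5, navigating=False))  # True  -> start navigating
    print(navigation_active(3.0, 1.5, navigating=True))   # False -> restore preset orientation
```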
7. The game map display method according to claim 1, further comprising:
and in response to the navigation trigger instruction, adjusting the transparency of the game map so that a game screen in a display area of the game map is in a visible state.
8. The game map display method according to claim 7, wherein the game map comprises a map base map and a map interface backplane, and the adjusting the transparency of the game map comprises:
presetting a first transparency and a second transparency;
adjusting the transparency of the map base map to the first transparency, and adjusting the transparency of the map interface backplane to the second transparency.
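For illustration only, the Python sketch below applies the two preset transparencies of claim 8 to the two map layers; the class, field names, and numeric values are assumptions made for this example.

```python
from dataclasses import dataclass


@dataclass
class GameMapLayers:
    base_map_alpha: float = 1.0             # opacity of the map base map (1.0 = opaque)
    interface_backplane_alpha: float = 1.0  # opacity of the map interface backplane


def apply_navigation_transparency(layers: GameMapLayers,
                                  first_transparency: float = 0.6,
                                  second_transparency: float = 0.3) -> None:
    """Apply the preset transparencies so the game screen under the map's
    display area remains visible; the numeric defaults are illustrative."""
    layers.base_map_alpha = first_transparency
    layers.interface_backplane_alpha = second_transparency


if __name__ == "__main__":
    layers = GameMapLayers()
    apply_navigation_transparency(layers)
    print(layers)
```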
9. The game map display method of claim 1, wherein the game interface comprises interface interaction elements, the method further comprising:
and in response to the navigation trigger instruction, zooming the game map, and controlling the interface interaction elements to be displayed on an upper layer of the zoomed game map.
10. The game map display method of claim 1, wherein the game interface comprises a skill control, and further comprising, after displaying the game map on the game interface in the preset map orientation:
and closing the game map in response to the triggering operation of the skill control.
11. The game map display method according to claim 1, wherein the displaying a game map in a preset map orientation on the game interface in response to the trigger operation on the scene thumbnail includes:
in response to a viewing operation on the scene thumbnail, enlarging the scene thumbnail;
and displaying the enlarged scene thumbnail on the game interface in the preset map orientation.
12. The game map display method according to claim 1, further comprising:
displaying a first direction indicator corresponding to a first virtual object in the game map, wherein the first direction indicator is used for indicating the position and the orientation of the first virtual object in the game scene, and the first virtual object and the target virtual object are in the same game;
and in response to a selection operation on the first virtual object, displaying navigation information on the game interface, wherein the navigation information is used for indicating route information from the target virtual object to the first virtual object.
13. A game map display device, comprising:
the game system comprises a first display unit, a second display unit and a third display unit, wherein the first display unit is used for displaying a game interface of a target virtual object, the game interface comprises a game picture and a scene thumbnail, the game picture comprises at least part of a game scene and at least part of the virtual object positioned in the game scene, and the scene thumbnail is a thumbnail corresponding to the game scene;
a second display unit, configured to display, in response to a trigger operation on the scene thumbnail, a game map in a preset map orientation on the game interface, wherein the game map comprises a target orientation indicator corresponding to the target virtual object, and the target orientation indicator is used for indicating the position and the orientation of the target virtual object in the game scene;
and a rotating unit, configured to, in response to a navigation trigger instruction, control the target orientation indicator to always point to a preset direction in the game interface, and rotate the game map in real time so that the target orientation indicator always points, in the game map, to the orientation of the target virtual object in the game scene.
14. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the computer program implements the steps of the game map display method according to any one of claims 1 to 12.
15. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, carries out the steps of the game map display method according to any one of claims 1 to 12.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant