CN113426124B - Display control method and device in game, storage medium and computer equipment - Google Patents


Info

Publication number
CN113426124B
Authority
CN
China
Prior art keywords
scene
dimensional game
ground
map
virtual character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110736323.7A
Other languages
Chinese (zh)
Other versions
CN113426124A (en)
Inventor
卢振宇
胡志鹏
程龙
刘勇成
袁思思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority claimed from CN202110736323.7A
Publication of CN113426124A
Application granted
Publication of CN113426124B
Legal status: Active


Classifications

    • A63F: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/837: Shooting of targets
    • A63F 2300/8076: Shooting

Abstract

Embodiments of the present application disclose an in-game display control method and apparatus, a storage medium, and a computer device. The game includes a three-dimensional game scene and a virtual character located in the three-dimensional game scene, and the method includes: controlling the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; controlling the line-of-sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character; determining the game screen content displayed in a graphical user interface of the computer device based on the position and the line-of-sight direction; and, in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, where the first designated area is determined on the scene ground according to the position of the virtual character and/or the line-of-sight direction of the virtual character.

Description

Display control method and device in game, storage medium and computer equipment
Technical Field
The present invention relates to the field of computers, and in particular, to a method and apparatus for controlling display in a game, a computer readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, applications that provide a virtual environment have become widespread, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, and multiplayer online battle arena (MOBA) games.
In the related art, taking an FPS game as an example, a user generally needs to trigger the display of a map to view his or her own location, indicate enemy positions to teammates, and so on. When a user-triggered map display operation is received, a map display interface can be shown in the current user interface, and the map can then be displayed on that interface. However, in the related art, the map occludes too much of the current user interface when displayed.
Disclosure of Invention
The embodiment of the application provides a display control method and device in a game, a computer readable storage medium and computer equipment, which can improve the flexibility of map display.
In order to solve the technical problems, the embodiment of the application provides the following technical scheme:
a display control method in a game including a three-dimensional game scene and a virtual character located in the three-dimensional game scene, the method comprising:
controlling the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character;
controlling the line-of-sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character;
determining the game screen content displayed in a graphical user interface of the computer device according to the position and the line-of-sight direction;
and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the line-of-sight direction of the virtual character.
A display control device in a game, the game including a three-dimensional game scene and a virtual character located in the three-dimensional game scene, the device comprising:
a first control module, configured to control the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character;
a second control module, configured to control the line-of-sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character;
a determining module, configured to determine the game screen content displayed in a graphical user interface of the computer device according to the position and the line-of-sight direction;
and a first display module, configured to display a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the line-of-sight direction of the virtual character.
In some embodiments, the first display module includes:
and the first display sub-module is used for responding that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets the preset condition, no interactable virtual object exists in the first appointed area of the scene ground, and the scene map of the three-dimensional game scene is displayed in the first appointed area of the scene ground.
In some embodiments, the first display module includes:
and the second display sub-module is used for responding that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets the preset condition and receiving a scene map display instruction, and displaying a scene map of the three-dimensional game scene in a first appointed area of the scene ground.
In some embodiments, the apparatus further comprises:
the detection module is used for detecting whether the included angle between the changed sight direction and the scene ground of the three-dimensional game scene meets a preset condition or not when the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character in the three-dimensional game scene are changed;
and the second display module is used for displaying the scene map of the three-dimensional game scene in a second designated area on the scene ground if the virtual character is changed, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed sight direction of the virtual character.
In some embodiments, the apparatus further comprises:
and the first dismissal module is used for dismissing and displaying the scene map of the three-dimensional game scene if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene does not meet the preset condition.
In some embodiments, the first display module includes:
a first acquisition sub-module, configured to acquire a target angle between the line-of-sight direction and the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition;
a determining sub-module, configured to determine size information of the scene map according to the target angle;
and a third display sub-module, configured to display the scene map of the three-dimensional game scene in the first designated area of the scene ground according to the size information.
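The size-from-angle determination performed by the determining sub-module can be sketched as a simple interpolation. All numeric values here (the size bounds and the angle range) are illustrative assumptions, not values from the patent:

```python
def map_size_from_angle(angle_deg, min_size=100, max_size=400,
                        min_angle=30.0, max_angle=90.0):
    """Map display size grows as the player looks more steeply at the ground.

    Linear interpolation between min_size and max_size over the angle
    range; a steeper downward gaze yields a larger on-ground map.
    """
    t = (angle_deg - min_angle) / (max_angle - min_angle)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return min_size + t * (max_size - min_size)
```

Looking straight down (90 degrees) then gives the largest map, while a shallow glance at the ground gives the smallest.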
In some embodiments, the first display module includes:
a second acquisition sub-module, configured to acquire parameter information of the orientation control instruction, the parameter information including at least one of a control speed, a control direction, and a dwell time after the control is completed;
and a fourth display sub-module, configured to display the scene map of the three-dimensional game scene in a first designated area of the scene ground when the parameter information meets a preset condition and the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meets the predetermined condition.
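A minimal sketch of the parameter-information check, assuming hypothetical thresholds for control speed and dwell time (neither threshold value appears in the patent):

```python
def parameter_info_ok(control_speed, dwell_time,
                      max_speed=200.0, min_dwell=0.5):
    """Hypothetical preset condition on the orientation-control parameters:
    the turn was deliberate (not too fast) and the view rested on the
    ground long enough after the control completed."""
    return control_speed <= max_speed and dwell_time >= min_dwell
```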
In some embodiments, the apparatus further comprises:
and the second dismissal module is used for dismissing and displaying the scene map of the three-dimensional game scene in response to the closing instruction aiming at the scene map.
In some embodiments, the apparatus further comprises:
the acquisition module is used for responding to the clicking instruction aiming at the scene map and acquiring the clicking position of the clicking instruction in the scene map;
The first marking module is used for marking the clicking position in the scene map and generating prompt information;
and the third display module is used for displaying the marks and broadcasting and/or displaying the prompt information.
In some embodiments, the apparatus further comprises:
a control module, configured to control the line-of-sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character, to obtain an adjusted line-of-sight direction;
a second marking module, configured to, when the landing point of the adjusted line-of-sight direction falls on the scene map, mark the position corresponding to the landing point and generate prompt information based on the landing point;
and a fourth display module, configured to display the mark and broadcast and/or display the prompt information.
In some embodiments, the apparatus further comprises:
and the adjusting module is used for responding to the size adjusting operation aiming at the scene map and adjusting the display size of the scene map.
In some embodiments, the first designated area is an area including the line-of-sight position where the line-of-sight direction intersects the scene ground; or the first designated area is an area including any position whose distance from the line-of-sight position is within a preset distance range.
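Both variants of the first designated area depend on the line-of-sight position, i.e. the point where the sight ray meets the scene ground. Assuming the ground is the Z = 0 plane of the world coordinate system of Fig. 1d, that point and the distance-based membership test can be sketched as:

```python
def sight_ground_intersection(camera_pos, sight_dir):
    """Point where the line of sight meets the scene ground (Z = 0 plane).

    Returns None when the sight direction does not point toward the
    ground or the camera is not above it.
    """
    px, py, pz = camera_pos
    dx, dy, dz = sight_dir
    if dz >= 0 or pz <= 0:
        return None
    t = -pz / dz  # ray parameter where the ray crosses Z = 0
    return (px + t * dx, py + t * dy, 0.0)

def in_first_designated_area(point, sight_pos, preset_distance=5.0):
    """Membership test for the second variant of the first designated area:
    any position within a preset distance of the line-of-sight position.
    The default radius is an illustrative assumption."""
    dx = point[0] - sight_pos[0]
    dy = point[1] - sight_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= preset_distance
```

For example, a camera two units above the origin looking down at 45 degrees along the X-axis hits the ground two units ahead of it.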
A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform steps in a display control method in a game as described above.
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing steps in a display control method in a game as described above when the program is executed.
In the embodiments of the present application, the position of the virtual character in the three-dimensional game scene is controlled in response to a movement control instruction for the virtual character; the line-of-sight direction of the virtual character in the three-dimensional game scene is controlled in response to an orientation control instruction for the virtual character; the game screen content displayed in a graphical user interface of the computer device is determined according to the position and the line-of-sight direction; and a scene map of the three-dimensional game scene is displayed in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, the first designated area being determined on the scene ground according to the position and/or line-of-sight direction of the virtual character. Because the scene map is displayed in a designated area on the scene ground, occlusion of other areas of the displayed three-dimensional game scene is avoided, so that the graphical user interface is not excessively blocked.
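As a concrete illustration of the trigger condition above, the following Python sketch checks whether the line of sight points down at the scene ground steeply enough for the map to be displayed. The convention that the X- and Y-axes span the ground with Z pointing up follows Fig. 1d, but the 60-degree threshold is an illustrative assumption, not a value from the patent:

```python
import math

def line_of_sight_ground_angle(sight_dir):
    """Angle (degrees) between a line-of-sight vector (x, y, z) and the
    X-Y scene ground of the world coordinate system."""
    x, y, z = sight_dir
    horiz = math.hypot(x, y)               # length of the ground-plane component
    return math.degrees(math.atan2(abs(z), horiz))

def should_show_scene_map(sight_dir, threshold_deg=60.0):
    """Hypothetical predetermined condition: the character is looking
    down (negative Z) at the ground at or above the threshold angle."""
    return sight_dir[2] < 0 and line_of_sight_ground_angle(sight_dir) >= threshold_deg
```

With this condition, glancing toward the horizon leaves the screen unchanged, while looking steeply down triggers the on-ground map display.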
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1a is a schematic system diagram of a map display system according to an embodiment of the present application.
Fig. 1b is a schematic flow chart of a first method for controlling display in a game according to an embodiment of the present application.
Fig. 1c is a first schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1d is a schematic diagram of a world coordinate system in a three-dimensional game scene according to an embodiment of the present application.
Fig. 1e is a rotation schematic diagram of a rotation angle of view of a camera model according to an embodiment of the present application.
Fig. 1f is a schematic view of a scenario provided in an embodiment of the present application.
Fig. 1g is a second schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1h is a schematic diagram of a scene map provided in an embodiment of the present application.
Fig. 2 is a schematic diagram of a second flow of a display control method in a game according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a display control device in a game according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiments of the present application provide an in-game display control method and apparatus, a storage medium, and a computer device. Specifically, the in-game display control method of the embodiments of the present application may be executed by a computer device, which may be a terminal, a server, or a similar device. The terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and the terminal device may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data, and artificial intelligence platforms.
For example, when the in-game display control method is run on the terminal, the terminal device stores a game application and presents a part of the game scene in the game through the display component. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when the display control method in the game is run on a server, it may be a cloud game. Cloud gaming refers to a game style based on cloud computing. In the running mode of the cloud game, a running main body of the game application program and a game picture presentation main body are separated, and storage and running of a display control method in the game are completed on a cloud game server. The game image presentation is completed at a cloud game client, which is mainly used for receiving and sending game data and presenting game images, for example, the cloud game client may be a display device with a data transmission function, such as a mobile terminal, a television, a computer, a palm computer, a personal digital assistant, etc., near a user side, but a terminal device executing a display control method in a game is a cloud game server in a cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server, the cloud game server runs the game according to the operation instruction, codes and compresses data such as game pictures and the like, returns the data to the cloud game client through a network, and finally decodes the data through the cloud game client and outputs the game pictures.
Referring to fig. 1a, fig. 1a is a schematic system diagram of a display control device in a game according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. Terminal 1000 held by a user may be connected to servers of different games through network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining input from a user through touch or slide operations performed at multiple points of one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different terminals 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 so as to be connected via an appropriate network and synchronized with each other to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 while different users play the multiplayer game online.
The embodiment of the application provides a display control method in a game, which can be executed by a terminal or a server. The embodiment of the present application will be described with an example in which a display control method in a game is executed by a terminal. The terminal comprises a display component and a processor, wherein the display component is used for presenting a graphical user interface and receiving operation instructions generated by a user acting on the display component. When a user operates the graphical user interface through the display component, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the user-generated operational instructions for the graphical user interface include instructions for launching the gaming application, and the processor is configured to launch the gaming application after receiving the user-provided instructions for launching the gaming application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch-sensitive display screen. A touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously by a plurality of points on the screen. The user performs touch operation on the graphical user interface by using a finger, and when the graphical user interface detects the touch operation, the graphical user interface controls different virtual objects in the graphical user interface of the game to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role playing game, a strategy game, a sports game, an educational game, a first person shooter game (First person shooting game, FPS), and the like. 
Wherein the game may comprise a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by a user (or player) may be included in the virtual scene of the game. In addition, one or more obstacles, such as rails, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual object, e.g., to limit movement of the one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, energy, etc., to provide assistance to the player, provide virtual services, increase scores related to the player's performance, etc. In addition, the graphical user interface may also present one or more indicators to provide indication information to the player. For example, a game may include a player controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using an Artificial Intelligence (AI) algorithm, implementing a human-machine engagement mode. For example, virtual objects possess various skills or capabilities that a game player uses to achieve a goal. For example, the virtual object may possess one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by the player of the game using one of a plurality of preset touch operations with the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of the user.
It should be noted that, the system schematic diagram of the map display system shown in fig. 1a is only an example, and the map display system and the scene described in the embodiments of the present application are for more clearly describing the technical solutions of the embodiments of the present application, and do not constitute a limitation on the technical solutions provided in the embodiments of the present application, and those skilled in the art can know that, with the evolution of the map display system and the appearance of a new service scene, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
In this embodiment, the description is given from the viewpoint of an in-game display control apparatus, which may be integrated in a computer device that has a storage unit, is fitted with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a schematic flow chart of a first method for controlling display in a game according to an embodiment of the present application. The display control method in the game comprises the following steps:
101. and controlling the position of the virtual character in the three-dimensional game scene in response to the movement control instruction for the virtual character.
When a user runs a First person shooter game (FPS) through the computer device, a three-dimensional game scene contained in the game and a virtual character in the three-dimensional game scene controlled by the user are displayed on a graphical user interface of the computer device. The three-dimensional game scene is a game scene provided by an application program when the application program runs on the computer equipment, and can be a simulation environment for the real world, a semi-simulation and semi-fictional scene or a pure fictional scene. When the application program runs on the computer device, the game picture content of the three-dimensional game scene can be displayed on the graphical user interface of the computer device, and the game picture content displayed on the graphical user interface of the computer device is the game picture content presented when the virtual character observes the three-dimensional game scene. The virtual character can observe the three-dimensional game scene through a camera model, taking FPS game as an example, when the virtual character is at a first person viewing angle, the camera model is positioned at the head or neck of the virtual character, and when the virtual character is at a third person viewing angle, the camera model is positioned at the rear of the virtual character, so that the viewing angles are different, and the displayed game picture content is different. In addition, when the position and/or the line of sight direction of the virtual character in the three-dimensional game scene is changed, the game screen content is correspondingly changed.
Specifically, referring to fig. 1c, fig. 1c is a first schematic diagram of a graphical user interface according to an embodiment of the present application. The graphical user interface is presented by a screen of terminal 1000, in which is included virtual character 110 manipulated by the user, and aiming identification 120 for prompting the aiming position of the virtual weapon in the graphical user interface, mobile control 130 for controlling the movement of virtual character 110 in the three-dimensional virtual scene, aiming control 140 that virtual character 110 can use when attacking, attack control 150 for controlling the attack operation of virtual character 110 in the virtual environment, and scene resource object 161 (scene ground), scene resource object 162 (house), and scene resource object 163 (car) corresponding to the three-dimensional virtual environment, etc.
The movement control instruction may be an instruction generated by the user's sliding operation on the movement control 130. Referring to fig. 1d, fig. 1d is a schematic diagram of a world coordinate system in a three-dimensional game scene according to an embodiment of the present application. The three-dimensional game scene has a world coordinate system constructed from an X-axis, a Y-axis, and a Z-axis, so a virtual character located in the three-dimensional game scene also has corresponding coordinates (X1, Y1, Z1). Specifically, the X-axis and the Y-axis may constitute the scene ground 20 of the three-dimensional game scene, and the user may control the virtual character to move on the scene ground 20 by a sliding operation on the movement control 130, thereby controlling the position of the virtual character in the three-dimensional game scene.
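By way of illustration only, the mapping from a slide on the movement control to a new position on the X-Y scene ground could be sketched as follows; the function name, the slide-vector convention, and the yaw convention are assumptions of this sketch, not the claimed implementation:

```python
import math

def move_on_ground(position, slide_vector, yaw_deg, speed=1.0):
    """Move the virtual character on the scene ground (the world X-Y plane).

    position:     current (x, y, z) world coordinates of the character
    slide_vector: (dx, dy) offset of the slide on the movement control,
                  with +dy meaning "forward" and +dx meaning "strafe right"
    yaw_deg:      character heading about the Z axis, measured from the X axis
    """
    x, y, z = position
    dx, dy = slide_vector
    yaw = math.radians(yaw_deg)
    # Rotate the slide vector from control space into world space.
    world_dx = (dy * math.cos(yaw) - dx * math.sin(yaw)) * speed
    world_dy = (dy * math.sin(yaw) + dx * math.cos(yaw)) * speed
    return (x + world_dx, y + world_dy, z)  # z unchanged: stays on the ground
```

Because the Z coordinate is left untouched, the character remains on the scene ground 20 regardless of the slide input.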
102. And controlling the sight direction of the virtual character in the three-dimensional game scene in response to the direction control instruction for the virtual character.
The direction control instruction of the virtual character is used for controlling the line-of-sight direction of the virtual character in the three-dimensional game scene. The line-of-sight direction is adjusted by changing the orientation of the lens in the camera model. Referring to fig. 1e, fig. 1e is a rotation schematic diagram of a rotation angle of a camera model according to an embodiment of the present application. The view angle direction of the lens 30 in the camera model can be adjusted by rotation about the U-axis and the R-axis, thereby controlling the view direction of the virtual character in the three-dimensional game scene.
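Purely as a sketch of how a rotation about the vertical (U) axis and the lateral (R) axis can yield a line-of-sight vector — the angle conventions and names are assumptions, not the embodiment's implementation:

```python
import math

def gaze_direction(yaw_deg, pitch_deg):
    """Unit line-of-sight vector from the camera rotation: yaw is rotation
    about the vertical (U) axis, pitch about the lateral (R) axis.
    pitch < 0 means the lens is tilted down toward the scene ground."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```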
Specifically, the direction control instruction may be an instruction generated by a sliding operation on the display screen of the terminal 1000, so as to achieve the effect of controlling the direction of the line of sight by sliding other areas of the graphical user interface except for the functional control (e.g. the movement control 130) that is already displayed.
103. The game screen content displayed in the graphical user interface of the computer device is determined based on the location and the gaze direction.
Wherein after determining the position of the virtual character in the three-dimensional game scene and the viewing direction of the virtual character in the three-dimensional game scene, game screen content presented when the three-dimensional game scene is observed in the viewing direction of the virtual character at the position of the virtual character can be displayed in the graphical user interface.
Taking fig. 1c as an example, the current position of the virtual character is in the vicinity of the car and the house. When the line-of-sight direction of the virtual character is controlled to be 30° east of north, the car and the house are positioned in that line-of-sight direction, so the part of the three-dimensional game scene containing the car and the house, that is, the game picture content, is displayed in the graphical user interface.
104. And displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line of sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition.
In the related art, when a scene map needs to be displayed, a map display interface is shown on the graphical user interface, and the scene map is then displayed on that interface. The map display interface generally occupies about half of the graphical user interface and blocks that portion of it. The graphical user interface is therefore excessively blocked while a scene map is displayed, most of the game picture content cannot be seen, and the user cannot react to events occurring in the current game scene while viewing the map.
In this embodiment, the scene map may be displayed in a first designated area on the scene ground, so that only that area is blocked while the other areas of the graphical user interface remain visible. This avoids excessively blocking the graphical user interface: the user can still see most of the game picture content and can therefore react to events occurring in the current game scene while viewing the map.
For example, when the included angle between the line-of-sight direction of the virtual character and the scene ground of the three-dimensional game scene satisfies a predetermined condition, a scene map of the three-dimensional game scene may be displayed in a first designated area of the scene ground. The predetermined condition can be preset by the computer device according to a certain rule; for example, it may be that the included angle between the line-of-sight direction of the virtual character and the scene ground of the three-dimensional game scene is between 20° and 90°, or that the included angle between the line-of-sight direction and the scene ground displayed on the graphical user interface is between 20° and 90°. Specifically, the first designated area is determined on the scene ground according to the position and/or the line-of-sight direction of the virtual character. For example, the first designated area may be an area containing the line-of-sight position where the line-of-sight direction of the virtual character intersects the scene ground; it may be an area containing any position whose distance from the line-of-sight position is within a preset distance range; or it may be any area of the scene ground. The preset distance range can be preset by the computer device according to a certain rule.
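As an illustration only, the predetermined-condition check and the line-of-sight intersection with the scene ground described above could be sketched as follows (all names and the 20°–90° interval follow this example; they are not the only claimed implementation):

```python
import math

def ground_angle_deg(gaze):
    """Included angle between the gaze vector and the scene ground (X-Y plane)."""
    gx, gy, gz = gaze
    return math.degrees(math.atan2(abs(gz), math.hypot(gx, gy)))

def line_of_sight_point(position, gaze):
    """Point where the gaze ray meets the scene ground (z = 0), or None
    when the character looks level or upward."""
    x, y, z = position
    gx, gy, gz = gaze
    if gz >= 0:
        return None
    t = -z / gz                     # ray parameter at which z reaches 0
    return (x + gx * t, y + gy * t, 0.0)

def should_show_map(position, gaze, lo=20.0, hi=90.0):
    """Predetermined condition from this example: angle within [20°, 90°]."""
    return (lo <= ground_angle_deg(gaze) <= hi
            and line_of_sight_point(position, gaze) is not None)
```

The returned intersection point can then serve as the center of the first designated area.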
Referring to fig. 1f, fig. 1f is a schematic view of a scene provided in an embodiment of the present application. The virtual character is located at a certain position in a three-dimensional game scene constructed from an X-axis, a Y-axis, and a Z-axis, and its line-of-sight direction forms an angle α, smaller than 90°, with the scene ground constructed from the X-axis and the Y-axis, so a scene map can be displayed in a first designated area of the scene ground.
Specifically, referring to fig. 1c and fig. 1g together, when the user controls the virtual character 110 in fig. 1c to look down, the graphical user interface displayed by the computer device switches from that of fig. 1c to that of fig. 1g: the complete scene resource object 162 and scene resource object 163 are replaced by partial views of these objects, and the aiming position of the aiming identification 120 also shifts down. Assuming that the line-of-sight direction of the virtual character 110 in fig. 1g forms an angle of less than 90° with the scene ground 161, a scene map of the three-dimensional game scene may be displayed in the first designated area of the scene ground 161, for example the scene map 170 in fig. 1g.
In some embodiments, when the angle between the gaze direction of the virtual character and the scene ground is small, for example less than 30°, displaying the scene map in an area containing the gaze location where the gaze direction intersects the scene ground may leave the user unable to see the map clearly. In that case, the scene map may instead be displayed in an area containing a location whose distance from the gaze location is within a preset distance range and whose distance from the virtual character is less than a preset distance. The preset distance can be set by the computer device according to a certain rule.
In other embodiments, when the scene map is located at a position far from the virtual character in the three-dimensional game scene, for example at the position where the angle between the line-of-sight direction and the scene ground is β, the user may again be unable to see the scene map. The predetermined condition may therefore be narrowed so that the included angle between the line-of-sight direction of the virtual character and the scene ground must be between 45° and 90°, thereby avoiding displaying the scene map at a position far from the virtual character.
It should be noted that the first designated area for displaying the scene map should itself be visible on the graphical user interface so that the user can see the scene map.
In the embodiment of the application, the position of the virtual character in the three-dimensional game scene is controlled in response to a movement control instruction for the virtual character; the line-of-sight direction of the virtual character in the three-dimensional game scene is controlled in response to an orientation control instruction for the virtual character; the game picture content displayed in the graphical user interface of the computer device is determined based on the position and the gaze direction; and a scene map of the three-dimensional game scene is displayed in a first designated area of the scene ground in response to the angle between the gaze direction and the scene ground meeting a predetermined condition, where the first designated area is determined on the scene ground according to the position and/or the gaze direction of the virtual character. Because the scene map is displayed in a designated area on the scene ground, other areas of the displayed three-dimensional game scene are not blocked, so the graphical user interface is not excessively blocked.
In some embodiments, displaying a scene map of the three-dimensional game scene at a first designated area of the scene ground in response to an angle between the gaze direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, comprises:
And displaying a scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the line of sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition and no interactable virtual object being present in the first designated area of the scene ground.
The interactable virtual object can be an article for the survival of the virtual character, such as a first-aid kit; an article for the attack of the virtual character, such as a weapon or a bullet; or an article assisting attack, such as a sight attachment (e.g. an 8x scope).
For example, in a first-person shooter game (e.g. a battle-royale game) application, interactable virtual objects are typically placed at intervals on the ground, on houses, or inside houses, and the user can control the virtual character to pick them up.
In this embodiment, when determining the first designated area in which the scene map can be displayed on the scene ground, an area where an interactable virtual object exists should not be chosen as the first designated area. This avoids the scene map being displayed over an interactable virtual object, which would prevent the user from picking up a needed object in time.
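A minimal sketch of that check — rejecting a candidate designated area that contains an interactable object — might look like the following (the rectangular-area model and names are assumptions of this sketch):

```python
def area_is_free(area_center, area_size, object_positions):
    """Return False if any interactable virtual object (first-aid kit,
    weapon, ammo, ...) lies inside the candidate rectangular area on the
    scene ground, so that another area can be chosen instead."""
    cx, cy = area_center
    w, h = area_size
    return not any(abs(ox - cx) <= w / 2 and abs(oy - cy) <= h / 2
                   for ox, oy in object_positions)
```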
In some embodiments, displaying a scene map of the three-dimensional game scene at a first designated area of the scene ground in response to an angle between the gaze direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, comprises:
and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the included angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition and receiving a scene map display instruction.
When the computer device is a mobile phone or a tablet computer, a map display control can be arranged on the graphical user interface, and the user can tap the map display control with a finger so that the computer device generates a scene map display instruction. Alternatively, physical keys of the mobile phone or tablet, such as volume keys, the power key, or shoulder keys, can be multiplexed as map display controls while the computer device runs the game application, and the user can press such a physical key to generate the scene map display instruction.
When the computer device is a notebook or desktop computer, a map display control can be arranged on the graphical user interface, and the user can click it with the mouse so that the computer device generates a scene map display instruction. A map display key may also be provided; for example, the M key of the keyboard may be designated as the map display key, and the user can press M so that the computer device generates the scene map display instruction.
When the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets the preset condition and the map display instruction is received, a scene map of the three-dimensional game scene can be displayed in a first designated area of the scene ground.
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the gaze direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, further comprising:
(1) When the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character in the three-dimensional game scene are changed, detecting whether the included angle between the changed sight direction and the scene ground of the three-dimensional game scene meets a preset condition or not;
(2) If so, displaying a scene map of the three-dimensional game scene in a second designated area of the scene ground, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed sight direction of the virtual character.
Wherein, after the scene map has been displayed in the first designated area of the scene ground, the user may control the virtual character to continue moving or to adjust the line-of-sight direction in the three-dimensional game scene via sliding, keyboard, or mouse operations. Therefore, when a change in the position and/or the line-of-sight direction of the virtual character in the three-dimensional game scene is detected, whether the included angle between the changed line-of-sight direction and the scene ground satisfies the predetermined condition is checked, and when it does, the scene map continues to be displayed, now in a second designated area. The second designated area is determined on the scene ground according to the changed position and/or the changed line-of-sight direction of the virtual character. For example, the second designated area may be an area containing the line-of-sight position where the changed line-of-sight direction intersects the scene ground; it may be an area containing any position whose distance from that line-of-sight position is within a preset distance range; or it may be any area of the scene ground. The preset distance range can be preset by the computer device according to a certain rule. The second designated area may be the same as or different from the first designated area.
Referring to fig. 1f, when the position and/or the viewing direction of the virtual character in the three-dimensional game scene changes, for example from the viewing direction forming angle α with the scene ground to the viewing direction forming angle γ, a scene map of the three-dimensional game scene may be displayed in a second designated area of the scene ground when the angle γ satisfies the predetermined condition. The second designated area may differ from the first designated area. For example, the first designated area may be a first area centered on the line-of-sight position where the direction at angle α intersects the scene ground; this first area can be determined from the size of the scene ground and the to-be-displayed size of the scene map, and its size is smaller than the size of the scene ground shown on the graphical user interface and greater than or equal to the to-be-displayed size of the scene map. Likewise, the second designated area may be a second area centered on the line-of-sight position where the direction at angle γ intersects the scene ground, determined in the same way and subject to the same size constraints.
In some embodiments, to make the occlusion range of the scene map to the graphical user interface smaller, the display size of the scene map may be controlled to be smaller, and the size of the first designated area may just accommodate the scene map.
In some embodiments, when the size of the first area or the second area is smaller than the size to be displayed of the scene map, the size to be displayed of the scene map may be adjusted so that the adjusted size to be displayed of the scene map is smaller than or equal to the size of the first area or the second area.
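The size adjustment described above can be sketched as a simple uniform shrink; the function name and the uniform-scaling choice are assumptions of this sketch:

```python
def fit_map_size(map_size, area_size):
    """Uniformly shrink the scene map's to-be-displayed size so that it is
    no larger than the designated area; an already-fitting map is unchanged."""
    mw, mh = map_size
    aw, ah = area_size
    scale = min(1.0, aw / mw, ah / mh)   # never enlarge here, only shrink
    return (mw * scale, mh * scale)
```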
In some embodiments, the method further comprises:
and if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene does not meet the preset condition, the scene map of the three-dimensional game scene is not displayed.
When the included angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene does not satisfy the predetermined condition, the scene map of the three-dimensional game scene can be dismissed. For example, assuming the predetermined condition is that the angle between the gaze direction of the virtual character and the scene ground is between 20° and 90°, then when the angle for the changed gaze direction is no longer in that interval, or when the changed gaze direction no longer forms any angle with the scene ground displayed on the graphical user interface, it can be determined that the changed gaze direction does not satisfy the predetermined condition. For example, if the user adjusts the viewing direction so that the virtual character raises its head to look at the sky, the included angle does not satisfy the predetermined condition, and the scene map can thus be dismissed by controlling the virtual character to raise its head.
In some embodiments, displaying a scene map of the three-dimensional game scene at a first designated area of the scene ground in response to an angle between the gaze direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, comprises:
(1) Acquiring a target included angle between the sight line direction and the scene ground in response to the included angle between the sight line direction and the scene ground of the three-dimensional game scene meeting a preset condition;
(2) Determining the size information of the scene map according to the target included angle;
(3) And displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground according to the size information.
The size information of the scene map displayed on the scene ground can be adaptively adjusted according to the target included angle between the line-of-sight direction and the scene ground. For example, in fig. 1f, angle α is 45°, angle β is 30°, and angle γ is 60°. Taking the size information at 45° as the baseline, the map displayed at angle α is a normal map that needs no resizing. Because angle β is smaller than 45°, displaying the scene map at normal size would make it hard to see, so the scene map can be stretched transversely and longitudinally, enlarging it so the user can see it clearly. Because angle γ is larger than 45°, displaying the scene map at normal size would waste user interface area, so the scene map can be reduced transversely and longitudinally. The above is merely an example; a mapping relationship between angles and scene map size information may be set, so that the size information corresponding to different angles can be determined from the mapping relationship, and the specific resizing means is not limited herein.
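Since the text leaves the mapping relationship open, here is one possible angle-to-scale mapping, sketched as an inverse-sine rule that reproduces the behavior of the example (normal at 45°, enlarged at 30°, reduced at 60°); this particular formula is an assumption, not the claimed mapping:

```python
import math

def map_scale_for_angle(angle_deg, base_angle_deg=45.0):
    """Scale factor for the scene map: 1.0 at the 45-degree baseline,
    above 1.0 (stretched) for shallower angles such as 30 degrees,
    below 1.0 (reduced) for steeper angles such as 60 degrees."""
    sin_a = math.sin(math.radians(angle_deg))
    sin_base = math.sin(math.radians(base_angle_deg))
    return sin_base / max(sin_a, 1e-6)   # guard against a zero angle
```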
In some embodiments, displaying a scene map of the three-dimensional game scene at a first designated area of the scene ground in response to an angle between the gaze direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, comprises:
(1) Acquiring parameter information of an orientation control instruction, wherein the parameter information comprises at least one of control speed, control direction and residence time after control is completed;
(2) When the parameter information meets the preset condition, a scene map of the three-dimensional game scene is displayed in a first designated area of the scene ground in response to the fact that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets the preset condition.
Wherein, the effect of automatically displaying the scene map on the scene ground can be realized by detecting the parameter information of the orientation control instruction. The parameter information includes at least one of a control speed, a control direction, and a residence time after the control is completed. Taking residence time as an example, when the residence time after the control is completed exceeds a preset time, for example 1 s, the virtual character has been looking at the scene ground for an extended period, so it can be determined that the user wants to view a scene map on the scene ground, and the scene map of the three-dimensional game scene can be automatically displayed in the first designated area of the scene ground.
Specifically, the direction control instruction may be an instruction generated according to a corresponding sliding operation, and the parameter information of the direction control instruction may include at least one of a sliding speed, a sliding direction, and a residence time after the sliding is completed.
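The residence-time trigger described above can be sketched as a small state holder; the class name, the 1-second default, and the timestamp interface are assumptions of this sketch:

```python
class DwellTrigger:
    """Auto-display the scene map once the gaze has rested on the scene
    ground longer than `threshold` seconds after an orientation slide ends."""
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.slide_end_time = None

    def on_slide_end(self, now):
        # Called when the orientation control (sliding) operation finishes.
        self.slide_end_time = now

    def should_auto_display(self, now, looking_at_ground):
        # Only trigger while the gaze actually rests on the scene ground.
        if not looking_at_ground or self.slide_end_time is None:
            return False
        return (now - self.slide_end_time) > self.threshold
```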
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the gaze direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, further comprising:
and in response to a closing instruction for the scene map, the scene map of the three-dimensional game scene is canceled to be displayed.
When the computer device is a mobile phone or a tablet computer, a map closing control can be arranged on the displayed scene map, and the user can tap it with a finger so that the computer device generates a closing instruction for the scene map. Alternatively, physical keys of the mobile phone or tablet, such as volume keys, the power key, or shoulder keys, can be multiplexed as map closing controls while the computer device runs the game application, and the user can press such a physical key to generate the closing instruction for the scene map.
When the computer device is a notebook or desktop computer, a map closing control can be arranged on the graphical user interface, and the user can click it with the mouse so that the computer device generates a closing instruction for the scene map. A map close key may also be provided; for example, the L key of the keyboard may be designated as the map close key, and the user can press L so that the computer device generates the closing instruction for the scene map. In response to the closing instruction for the scene map, the scene map of the three-dimensional game scene can be dismissed.
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the gaze direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, further comprising:
(1) Responding to a clicking instruction aiming at a scene map, and acquiring a clicking position of the clicking instruction in the scene map;
(2) Marking the clicking position in the scene map and generating prompt information;
(3) Displaying the mark, and broadcasting and/or displaying the prompt information.
By marking the scene map, the user can alert himself or his teammates that a specific situation exists at the marked location.
In particular, there are different marking modes for different marking operations. The first marking operation is for the user to click on the scene map with a mouse or tap it with a finger. When the user clicks or taps a certain position of the scene map, that position can be marked and prompt information generated, so that the mark is displayed and the prompt information broadcast and/or displayed. For example, when the clicked position is the position of an enemy, the message "there is an enemy" may be broadcast and/or displayed; when it is the position of a teammate, the message "protect the target" may be broadcast and/or displayed; when it is an empty position, the message "attack here" may be broadcast and/or displayed.
As shown in fig. 1h, fig. 1h is a schematic diagram of a scene map provided in an embodiment of the present application. The scene map 170 includes a position display identifier 171 for displaying the virtual character manipulated by the user and a mark 172 identified by the user. After any player in the same team marks, other players in the same team can see the mark in the scene map.
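The choice of prompt message for a click can be sketched as follows; the tolerance radius, message strings, and function name are assumptions of this sketch rather than the claimed implementation:

```python
def prompt_for_click(click_pos, enemy_positions, teammate_positions, tol=1.0):
    """Pick the broadcast message for a click on the scene map: an enemy at
    the clicked spot, a teammate there, or empty ground."""
    def near(points):
        return any(abs(px - click_pos[0]) <= tol and abs(py - click_pos[1]) <= tol
                   for px, py in points)
    if near(enemy_positions):
        return "there is an enemy"
    if near(teammate_positions):
        return "protect the target"
    return "attack here"
```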
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the gaze direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, further comprising:
(1) Responding to an orientation control instruction aiming at the virtual character, controlling the sight direction of the virtual character in the three-dimensional game scene, and obtaining an adjusted sight direction;
(2) When the falling point of the adjusted sight direction is positioned on the scene map, marking the position corresponding to the falling point, and generating prompt information based on the falling point;
(3) Displaying the mark, and broadcasting and/or displaying the prompt information.
Wherein the second marking operation is for the user to control the line-of-sight direction of the virtual character via the orientation control instruction, so that the mark is determined from the landing point of the line-of-sight direction on the scene map. For example, the intersection point of the line-of-sight direction and the scene map is determined; this intersection point is the landing point and can be taken as the mark the user requires on the scene map, and prompt information is generated so that the mark is displayed and the prompt information broadcast and/or displayed.
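A minimal sketch of finding that landing point — intersecting the adjusted gaze ray with the ground plane and testing whether it falls inside the displayed map rectangle — might look like this (the axis-aligned rectangle model is an assumption):

```python
def drop_point_on_map(position, gaze, map_rect):
    """Return the landing point of the gaze ray on the ground if it falls
    inside the displayed map rectangle (x0, y0, x1, y1); otherwise None."""
    x, y, z = position
    gx, gy, gz = gaze
    if gz >= 0:                       # looking level or upward: no landing point
        return None
    t = -z / gz                       # ray parameter where the ray reaches z = 0
    px, py = x + gx * t, y + gy * t
    x0, y0, x1, y1 = map_rect
    return (px, py) if x0 <= px <= x1 and y0 <= py <= y1 else None
```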
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the line of sight direction and the scene ground of the three-dimensional game scene satisfying the predetermined condition, the method further includes:
the display size of the scene map is adjusted in response to a resizing operation for the scene map.
After the scene map of the three-dimensional game scene is displayed in the first designated area of the scene ground, the user can adjust the size of the scene map as required. The resizing operation may be a sliding operation of at least two fingers, with the size of the scene map determined by the distance between the two fingers.
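The two-finger resize can be sketched as a ratio of finger spacings; the clamp bounds and names are assumptions of this sketch:

```python
def pinch_scale(initial_size, start_distance, current_distance,
                min_size=50.0, max_size=400.0):
    """Two-finger resize: the map's display size follows the ratio of the
    current finger spacing to the spacing when the pinch began, clamped
    to keep the map usable on screen."""
    scaled = initial_size * current_distance / start_distance
    return max(min_size, min(max_size, scaled))
```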
Referring to fig. 2, fig. 2 is a schematic flow chart of a second method for controlling display in a game according to an embodiment of the present application. The method flow may include:
201. and controlling the position of the virtual character in the three-dimensional game scene in response to the movement control instruction for the virtual character.
202. And controlling the sight direction of the virtual character in the three-dimensional game scene in response to the direction control instruction for the virtual character.
203. The game screen content displayed in the graphical user interface of the computer device is determined based on the location and the gaze direction.
204. And displaying a scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the line of sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition and no interactable virtual object being present in the first designated area of the scene ground.
205. When the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character in the three-dimensional game scene are changed, whether the included angle between the changed sight direction and the scene ground of the three-dimensional game scene meets the preset condition or not is detected.
206. And if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene meets the preset condition, displaying a scene map of the three-dimensional game scene in a second designated area of the scene ground.
207. And if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene does not meet the preset condition, the scene map of the three-dimensional game scene is not displayed.
For the implementation of steps 201 to 207, reference may be made to the previous embodiments; details are not repeated here.
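As an illustrative sketch (not part of the patent), the angle test that drives steps 204 to 207 can be expressed as follows; the y-up coordinate convention and the 45° threshold are assumptions standing in for the unspecified predetermined condition:

```python
import math

def gaze_ground_angle(gaze_dir):
    # Angle, in degrees, between the line-of-sight direction and a
    # horizontal scene ground (y-up); gaze_dir is an (x, y, z) vector.
    x, y, z = gaze_dir
    norm = math.sqrt(x * x + y * y + z * z)
    if norm == 0:
        raise ValueError("zero-length gaze direction")
    return math.degrees(math.asin(abs(y) / norm))

def should_show_map(gaze_dir, threshold_deg=45.0):
    # The scene map is shown only while the angle between the gaze and
    # the scene ground meets the condition, here assumed to be
    # "at least threshold_deg".
    return gaze_ground_angle(gaze_dir) >= threshold_deg
```

Each time the position or gaze changes (step 205), the test is re-run; a passing result redraws the map in the newly determined designated area (step 206), and a failing result dismisses it (step 207).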
To facilitate implementation of the in-game display control method provided by the embodiments of the present application, the embodiments of the present application further provide an apparatus based on the in-game display control method. The terms used below have the same meaning as in the in-game display control method described above; for specific implementation details, refer to the description in the method embodiments.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an in-game display control apparatus according to an embodiment of the present application. The apparatus may include a first control module 301, a second control module 302, a determining module 303, a first display module 304, and the like.
In some embodiments, the first display module 304 includes:
a first display sub-module, configured to display a scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition and no interactable virtual object being present in the first designated area.
In some embodiments, the first display module 304 includes:
a second display sub-module, configured to display a scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition and a scene map display instruction being received.
In some embodiments, the apparatus further comprises:
a detection module, configured to detect, when the position of the virtual character in the three-dimensional game scene and/or the line-of-sight direction of the virtual character changes, whether the angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene meets the predetermined condition;
a second display module, configured to display a scene map of the three-dimensional game scene in a second designated area of the scene ground if the angle between the changed line-of-sight direction and the scene ground meets the predetermined condition, where the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed line-of-sight direction of the virtual character.
In some embodiments, the apparatus further comprises:
a first dismissal module, configured to dismiss the display of the scene map of the three-dimensional game scene if the angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene does not meet the predetermined condition.
In some embodiments, the first display module 304 includes:
a first acquisition sub-module, configured to acquire, in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, the target angle between the line-of-sight direction and the scene ground;
a determining sub-module, configured to determine the size information of the scene map according to the target angle;
a third display sub-module, configured to display the scene map of the three-dimensional game scene in the first designated area of the scene ground according to the size information.
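The determining sub-module's mapping from target angle to size information is not fixed by the patent; a linear interpolation is one plausible sketch (all numeric bounds below are assumptions):

```python
def map_size_from_angle(angle_deg, min_deg=45.0, max_deg=90.0,
                        min_size=200, max_size=600):
    # The more steeply the player looks down, the larger the map can be
    # drawn on the ground without occluding the rest of the scene.
    t = (angle_deg - min_deg) / (max_deg - min_deg)
    t = max(0.0, min(1.0, t))            # clamp to the supported range
    return round(min_size + t * (max_size - min_size))
```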
In some embodiments, the first display module 304 includes:
a second acquisition sub-module, configured to acquire parameter information of the direction control instruction, where the parameter information includes at least one of a control speed, a control direction, and a dwell time after the control is completed;
a fourth display sub-module, configured to display, when the parameter information meets a preset condition, the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition.
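The parameter gate used by the fourth display sub-module can be sketched as below; treating a slow swipe plus a sufficiently long dwell as a deliberate look at the ground is an assumption, as are the threshold values:

```python
def orientation_params_ok(control_speed, dwell_time_s,
                          max_speed=0.5, min_dwell_s=0.3):
    # A fast sweep through the ground, or a gaze that does not linger,
    # is treated as incidental and does not trigger the map display.
    return control_speed <= max_speed and dwell_time_s >= min_dwell_s
```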
In some embodiments, the apparatus further comprises:
a second dismissal module, configured to dismiss the display of the scene map of the three-dimensional game scene in response to a closing instruction for the scene map.
In some embodiments, the apparatus further comprises:
an acquisition module, configured to acquire, in response to a click instruction for the scene map, the click position of the click instruction in the scene map;
a first marking module, configured to mark the click position in the scene map and generate prompt information;
a third display module, configured to display the mark and broadcast and/or display the prompt information.
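The marking flow above can be sketched by mapping a click in the scene map back to scene-ground coordinates and producing the broadcast prompt; the linear map-to-scene transform and the message format are illustrative assumptions:

```python
def map_click_to_scene(click_uv, map_origin, map_scale):
    # click_uv: normalized (u, v) click position within the scene map.
    # map_origin / map_scale: assumed linear transform from map space
    # to scene-ground (x, z) coordinates.
    u, v = click_uv
    x = map_origin[0] + u * map_scale
    z = map_origin[1] + v * map_scale
    mark = {"x": x, "z": z}                         # mark drawn on the map
    prompt = f"Marked position ({x:.1f}, {z:.1f})"  # broadcast text
    return mark, prompt
```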
In some embodiments, the apparatus further comprises:
a control module, configured to control, in response to a direction control instruction for the virtual character, the line-of-sight direction of the virtual character in the three-dimensional game scene to obtain an adjusted line-of-sight direction;
a second marking module, configured to, when the landing point of the adjusted line-of-sight direction is located on the scene map, mark the position corresponding to the landing point and generate prompt information based on the landing point;
a fourth display module, configured to display the mark and broadcast and/or display the prompt information.
In some embodiments, the apparatus further comprises:
an adjusting module, configured to adjust the display size of the scene map in response to a resizing operation for the scene map.
In some embodiments, the first designated area is an area including the sight position where the line-of-sight direction intersects the scene ground; or the first designated area is an area including any position whose distance from the sight position is within a preset distance range.
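Both variants of the first designated area hinge on the sight position where the line of sight meets the ground. As an illustrative sketch, assuming a horizontal ground plane y = 0 in a y-up coordinate system:

```python
import math

def sight_point_on_ground(eye, gaze_dir, ground_y=0.0):
    # Intersect the line of sight with the plane y = ground_y; returns
    # None when the gaze is level or upward and never hits the ground.
    ex, ey, ez = eye
    dx, dy, dz = gaze_dir
    if dy >= 0:
        return None
    t = (ground_y - ey) / dy
    return (ex + t * dx, ground_y, ez + t * dz)

def in_first_designated_area(point, sight_pos, radius=3.0):
    # Second variant: any ground position within a preset distance of
    # the sight position belongs to the first designated area.
    return math.hypot(point[0] - sight_pos[0],
                      point[2] - sight_pos[2]) <= radius
```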
As can be seen from the foregoing, in the embodiments of the present application, the first control module 301 controls the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; the second control module 302 controls the line-of-sight direction of the virtual character in the three-dimensional game scene in response to a direction control instruction for the virtual character; the determining module 303 determines the game screen content displayed in the graphical user interface of the computer device according to the position and the line-of-sight direction; and the first display module 304 displays a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground meeting a predetermined condition, where the first designated area is determined on the scene ground according to the position and/or the line-of-sight direction of the virtual character. Because the scene map is displayed in a designated area on the scene ground, other areas of the displayed three-dimensional game scene are not blocked, and the graphical user interface is not excessively occluded.
The specific implementation of each of the above operations may refer to the previous embodiments and is not repeated here.
Correspondingly, the embodiments of the present application further provide a computer device, which may be a terminal or a server. The terminal may be a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), a personal digital assistant (PDA), or another terminal device. As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device structure shown in the figure does not limit the computer device; the device may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The processor 401 is the control center of the computer device 400 and connects the various parts of the entire computer device 400 using various interfaces and lines. By running or loading the software programs and/or modules stored in the memory 402 and invoking the data stored in the memory 402, it performs the various functions of the computer device 400 and processes data, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads the instructions corresponding to the processes of one or more application programs into the memory 402, and executes the application programs stored in the memory 402, thereby implementing the following functions:
controlling the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character;
controlling the line-of-sight direction of the virtual character in the three-dimensional game scene in response to a direction control instruction for the virtual character;
determining the game screen content displayed in a graphical user interface of the computer device according to the position and the line-of-sight direction; and
displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, where the first designated area is determined on the scene ground according to the position and/or the line-of-sight direction of the virtual character.
The specific implementation of each of the above operations may refer to the previous embodiments and is not repeated here.
Optionally, as shown in fig. 4, the computer device 400 further includes a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. Those skilled in the art will appreciate that the computer device structure shown in fig. 4 does not limit the computer device; the device may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect the user's touch operations on or near it (such as operations performed on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions that trigger the corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401; it can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it passes the operation to the processor 401 to determine the type of touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions; that is, the touch display 403 may also implement an input function as part of the input unit 406.
In the embodiment of the present application, the game application executed by the processor 401 generates a user interface, i.e., a graphical user interface, on the touch display 403, and the virtual environment on the graphical user interface contains the scene resource objects. The touch display 403 is used to present the graphical user interface and receive the operation instructions generated by the user acting on it.
The radio frequency circuit 404 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may transmit an electrical signal, converted from received audio data, to the speaker, which converts it into a sound signal for output. Conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 405 receives and converts into audio data; the audio data is output to the processor 401 for processing and is then sent via the radio frequency circuit 404 to, for example, another computer device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral earphone and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., and will not be described herein.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of the other embodiments.
As can be seen from the above, the computer device provided in this embodiment controls the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; controls the line-of-sight direction of the virtual character in the three-dimensional game scene in response to a direction control instruction for the virtual character; determines the game screen content displayed in the graphical user interface of the computer device according to the position and the line-of-sight direction; and displays a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground meeting a predetermined condition, where the first designated area is determined on the scene ground according to the position and/or the line-of-sight direction of the virtual character. Because the scene map is displayed in a designated area on the scene ground, other areas of the displayed three-dimensional game scene are not blocked, and the graphical user interface is not excessively occluded.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the above embodiments may be completed by instructions, or by instructions controlling associated hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium in which a plurality of computer programs are stored; the computer programs can be loaded by a processor to perform the steps of any of the in-game display control methods provided by the embodiments of the present application. For example, the computer program may perform the following steps:
controlling the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; controlling the line-of-sight direction of the virtual character in the three-dimensional game scene in response to a direction control instruction for the virtual character; determining the game screen content displayed in a graphical user interface of the computer device according to the position and the line-of-sight direction; and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, where the first designated area is determined on the scene ground according to the position and/or the line-of-sight direction of the virtual character.
The specific implementation of each of the above operations may refer to the previous embodiments and is not repeated here.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
Since the computer program stored in the storage medium can execute the steps of any of the in-game display control methods provided in the embodiments of the present application, it can achieve the beneficial effects that any of those methods can achieve; see the previous embodiments for details, which are not repeated here.
The in-game display control method, apparatus, storage medium, and computer device provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in light of the ideas of the present application. In summary, the content of this description should not be construed as limiting the present application.

Claims (15)

1. A display control method in a game, wherein the game includes a three-dimensional game scene and a virtual character located in the three-dimensional game scene, the method comprising:
in response to a movement control instruction for the virtual character, controlling the position of the virtual character in the three-dimensional game scene;
in response to a direction control instruction for the virtual character, controlling the line-of-sight direction of the virtual character in the three-dimensional game scene;
determining the game screen content displayed in a graphical user interface of a computer device according to the position and the line-of-sight direction; and
in response to an angle between the line-of-sight direction and a scene ground of the three-dimensional game scene meeting a predetermined condition, displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the line-of-sight direction of the virtual character.
2. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition comprises:
in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition and no interactable virtual object being present in the first designated area of the scene ground, displaying a scene map of the three-dimensional game scene in the first designated area of the scene ground.
3. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition comprises:
in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition and a scene map display instruction being received, displaying a scene map of the three-dimensional game scene in the first designated area of the scene ground.
4. The method of claim 1, further comprising, after displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition:
when the position of the virtual character in the three-dimensional game scene and/or the line-of-sight direction of the virtual character in the three-dimensional game scene changes, detecting whether the angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene meets the predetermined condition; and
if so, displaying a scene map of the three-dimensional game scene in a second designated area of the scene ground, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed line-of-sight direction of the virtual character.
5. The method according to claim 4, further comprising:
if the angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene does not meet the predetermined condition, dismissing the display of the scene map of the three-dimensional game scene.
6. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition comprises:
in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, acquiring a target angle between the line-of-sight direction and the scene ground;
determining size information of the scene map according to the target angle; and
displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground according to the size information.
7. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition comprises:
acquiring parameter information of the direction control instruction, wherein the parameter information comprises at least one of a control speed, a control direction, and a dwell time after the control is completed; and
when the parameter information meets a preset condition, in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting the predetermined condition, displaying a scene map of the three-dimensional game scene in the first designated area of the scene ground.
8. The method of claim 1, further comprising, after displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition:
in response to a closing instruction for the scene map, dismissing the display of the scene map of the three-dimensional game scene.
9. The method of claim 1, further comprising, after displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition:
in response to a click instruction for the scene map, acquiring the click position of the click instruction in the scene map;
marking the click position in the scene map and generating prompt information; and
displaying the mark, and broadcasting and/or displaying the prompt information.
10. The method of claim 1, further comprising, after displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition:
in response to a direction control instruction for the virtual character, controlling the line-of-sight direction of the virtual character in the three-dimensional game scene to obtain an adjusted line-of-sight direction;
when the landing point of the adjusted line-of-sight direction is located on the scene map, marking the position corresponding to the landing point and generating prompt information based on the landing point; and
displaying the mark, and broadcasting and/or displaying the prompt information.
11. The method of claim 1, further comprising, after displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition:
in response to a resizing operation for the scene map, adjusting the display size of the scene map.
12. The method according to any one of claims 1 to 11, wherein the first designated area is an area including the sight position where the line-of-sight direction intersects the scene ground; or the first designated area is an area including any position whose distance from the sight position is within a preset distance range.
13. A display control apparatus in a game, wherein the game comprises a three-dimensional game scene and a virtual character located in the three-dimensional game scene, the apparatus comprising:
a first control module, configured to control the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character;
a second control module, configured to control the line-of-sight direction of the virtual character in the three-dimensional game scene in response to a direction control instruction for the virtual character;
a determining module, configured to determine the game screen content displayed in a graphical user interface of a computer device according to the position and the line-of-sight direction; and
a first display module, configured to display a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to an angle between the line-of-sight direction and the scene ground of the three-dimensional game scene meeting a predetermined condition, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the line-of-sight direction of the virtual character.
14. A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the display control method in a game according to any one of claims 1 to 12.
15. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the display control method in a game according to any one of claims 1 to 12 when executing the program.
CN202110736323.7A 2021-06-30 2021-06-30 Display control method and device in game, storage medium and computer equipment Active CN113426124B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110736323.7A CN113426124B (en) 2021-06-30 2021-06-30 Display control method and device in game, storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN113426124A CN113426124A (en) 2021-09-24
CN113426124B true CN113426124B (en) 2024-03-12

Family

ID=77758292

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113797543B (en) * 2021-09-27 2023-06-23 腾讯科技(深圳)有限公司 Game processing method, game processing device, computer device, storage medium and program product
CN114225416A (en) * 2021-12-16 2022-03-25 网易(杭州)网络有限公司 Game control method and device
CN114253401A (en) * 2021-12-27 2022-03-29 郑州捷安高科股份有限公司 Method and device for determining position in virtual scene, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014195715A (en) * 2014-06-10 2014-10-16 任天堂株式会社 Game program and game device
CN108854068A (en) * 2018-06-27 2018-11-23 网易(杭州)网络有限公司 Display control method and device, storage medium and terminal in game
CN109692477A (en) * 2019-02-01 2019-04-30 网易(杭州)网络有限公司 A kind of method and apparatus that interface is shown
CN110180168A (en) * 2019-05-31 2019-08-30 网易(杭州)网络有限公司 A kind of display methods and device, storage medium and processor of game picture
CN112827170A (en) * 2021-02-08 2021-05-25 网易(杭州)网络有限公司 Information processing method in game, electronic device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4297804B2 (en) * 2004-02-19 2009-07-15 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
US10807001B2 (en) * 2017-09-12 2020-10-20 Netease (Hangzhou) Network Co., Ltd. Information processing method, apparatus and computer readable storage medium

Also Published As

Publication number Publication date
CN113426124A (en) 2021-09-24

Similar Documents

Publication Publication Date Title
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113082707B (en) Virtual object prompting method and device, storage medium and computer equipment
CN114522423A (en) Virtual object control method and device, storage medium and computer equipment
CN113398566A (en) Game display control method and device, storage medium and computer equipment
US20230271087A1 (en) Method and apparatus for controlling virtual character, device, and storage medium
CN111530075B (en) Method, device, equipment and medium for displaying picture of virtual environment
CN115193064A (en) Virtual object control method and device, storage medium and computer equipment
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
CN113577781B (en) Non-player character NPC control method, device, equipment and medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN115212572A (en) Control method and device of game props, computer equipment and storage medium
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
US20240131434A1 (en) Method and apparatus for controlling put of virtual resource, computer device, and storage medium
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN116999825A (en) Game control method, game control device, computer equipment and storage medium
CN116139484A (en) Game function control method, game function control device, storage medium and computer equipment
CN116328301A (en) Information prompting method, device, computer equipment and storage medium
CN115970284A (en) Attack method and device of virtual weapon, storage medium and computer equipment
CN116036589A (en) Attack perception method and device of virtual weapon, storage medium and computer equipment
CN116870472A (en) Game view angle switching method and device, computer equipment and storage medium
CN116474367A (en) Virtual lens control method and device, storage medium and computer equipment
CN116672714A (en) Virtual character control method, device, storage medium and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant