CN113426124A - Display control method and device in game, storage medium and computer equipment


Info

Publication number
CN113426124A
Authority
CN
China
Prior art keywords
scene
dimensional game
map
virtual character
ground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110736323.7A
Other languages
Chinese (zh)
Other versions
CN113426124B (en)
Inventor
卢振宇
胡志鹏
程龙
刘勇成
袁思思
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110736323.7A
Publication of CN113426124A
Application granted
Publication of CN113426124B
Legal status: Active

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals involving aspects of the displayed game scene
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/837 - Shooting of targets
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 - Features of games specially adapted for executing a specific type of game
    • A63F2300/8076 - Shooting

Abstract

The embodiments of the present application disclose a display control method and device in a game, a storage medium, and computer equipment. The game includes a three-dimensional game scene and a virtual character located in the three-dimensional game scene, and the method includes: controlling the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; controlling the sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character; determining the game picture content displayed in a graphical user interface of the computer device according to the position and the sight direction; and displaying, in response to an included angle between the sight direction and the scene ground of the three-dimensional game scene meeting a preset condition, a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position and/or the sight direction of the virtual character.

Description

Display control method and device in game, storage medium and computer equipment
Technical Field
The present application relates to the field of computers, and in particular, to a method and an apparatus for controlling display in a game, a computer-readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, more and more applications with virtual environments have emerged, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, multiplayer online battle arena (MOBA) games, and the like.
In the related art, taking an FPS game as an example, a user generally needs to trigger display of a map in order to view his or her own position, indicate enemy positions to teammates, and so on. When an operation triggered by the user to display the map is received, a map display interface can be shown in the current user interface, and the map can then be displayed on that interface. However, in the related art, the current user interface is excessively occluded when the map is displayed.
Disclosure of Invention
The embodiments of the present application provide a display control method and device in a game, a computer-readable storage medium, and computer equipment, which can improve the flexibility of map display.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
a display control method in a game, the game including a three-dimensional game scene and a virtual character located in the three-dimensional game scene, the method comprising:
controlling the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character;
controlling the sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character;
determining game picture content displayed in a graphical user interface of the computer device according to the position and the sight direction;
and displaying, in response to an included angle between the sight direction and a scene ground of the three-dimensional game scene meeting a preset condition, a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position and/or the sight direction of the virtual character.
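For illustration only, the angle condition in the last step can be sketched as follows. This is a minimal sketch rather than the claimed implementation: it assumes the scene ground is the world X-Y plane with Z pointing up, and the function names and the 45-degree threshold are hypothetical.

```python
import math

def angle_with_ground(sight_dir):
    """Included angle (degrees) between the sight-direction vector and
    the scene ground plane (the X-Y plane), via arcsin(|dz| / |d|)."""
    dx, dy, dz = sight_dir
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return math.degrees(math.asin(abs(dz) / norm))

def meets_preset_condition(sight_dir, threshold_deg=45.0):
    """Example preset condition: the character is looking downward and
    the included angle with the scene ground reaches the threshold."""
    looking_down = sight_dir[2] < 0  # negative Z component points at the ground
    return looking_down and angle_with_ground(sight_dir) >= threshold_deg
```

For example, looking straight down gives a 90-degree included angle and satisfies this example condition, while any level or upward sight direction does not.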
A display control apparatus in a game including a three-dimensional game scene and virtual characters located in the three-dimensional game scene, the apparatus comprising:
the first control module is used for responding to a movement control instruction aiming at the virtual character and controlling the position of the virtual character in the three-dimensional game scene;
the second control module is used for responding to an orientation control instruction aiming at the virtual character and controlling the sight direction of the virtual character in the three-dimensional game scene;
the determining module is used for determining game picture content displayed in a graphical user interface of the computer equipment according to the position and the sight direction;
and the first display module is used for responding that an included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition, and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight line direction of the virtual character.
In some embodiments, the first display module comprises:
and the first display sub-module is used for displaying, in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene meeting a preset condition and no interactive virtual object existing in a first designated area of the scene ground, a scene map of the three-dimensional game scene in the first designated area of the scene ground.
In some embodiments, the first display module comprises:
and the second display sub-module is used for displaying, in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene meeting a preset condition and a scene map display instruction being received, the scene map of the three-dimensional game scene in the first designated area of the scene ground.
In some embodiments, the apparatus further comprises:
the detection module is used for detecting, when the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character in the three-dimensional game scene changes, whether the included angle between the changed sight direction and the scene ground of the three-dimensional game scene meets the preset condition;
and, if so, for displaying a scene map of the three-dimensional game scene in a second designated area of the scene ground, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed sight direction of the virtual character.
In some embodiments, the apparatus further comprises:
and the first canceling module is used for canceling display of the scene map of the three-dimensional game scene if the included angle between the changed sight direction and the scene ground of the three-dimensional game scene does not meet the preset condition.
In some embodiments, the first display module comprises:
the first obtaining sub-module is used for obtaining, in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene meeting a preset condition, a target included angle between the sight direction and the scene ground;
the determining sub-module is used for determining size information of the scene map according to the target included angle;
and the third display sub-module is used for displaying, according to the size information, the scene map of the three-dimensional game scene in the first designated area of the scene ground.
In some embodiments, the first display module comprises:
the second obtaining sub-module is used for obtaining parameter information of the orientation control instruction, the parameter information including at least one of a control speed, a control direction, and a dwell time after the control ends;
and the fourth display sub-module is used for displaying, when the parameter information meets its preset condition and the included angle between the sight direction and the scene ground of the three-dimensional game scene meets its preset condition, the scene map of the three-dimensional game scene in the first designated area of the scene ground.
In some embodiments, the apparatus further comprises:
and the second canceling module is used for canceling display of the scene map of the three-dimensional game scene in response to a closing instruction for the scene map.
In some embodiments, the apparatus further comprises:
the obtaining module is used for obtaining, in response to a click instruction for the scene map, the click position of the click instruction in the scene map;
the first marking module is used for marking the click position in the scene map and generating prompt information;
and the third display module is used for displaying the mark and broadcasting and/or displaying the prompt information.
In some embodiments, the apparatus further comprises:
the control module is used for controlling, in response to an orientation control instruction for the virtual character, the sight direction of the virtual character in the three-dimensional game scene to obtain an adjusted sight direction;
the second marking module is used for marking, when the landing point of the adjusted sight direction is located on the scene map, the position corresponding to the landing point, and generating prompt information based on the landing point;
and the fourth display module is used for displaying the mark and broadcasting and/or displaying the prompt information.
In some embodiments, the apparatus further comprises:
and the adjusting module is used for adjusting the display size of the scene map in response to a size adjustment operation on the scene map.
In some embodiments, the first designated area is an area including the sight position at which the sight direction intersects the scene ground; or
the first designated area is an area including any position whose distance from the sight position is within a preset distance range.
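As a sketch of how the sight position (the point where the sight direction intersects the scene ground) and a surrounding designated area might be computed, again assuming the scene ground is the plane z = 0 with Z up; all names and the default radius are illustrative, not taken from the patent:

```python
def sight_position(cam_pos, sight_dir, ground_z=0.0):
    """Point where the sight ray from the camera meets the scene ground
    plane z = ground_z; returns None when looking level or upward."""
    px, py, pz = cam_pos
    dx, dy, dz = sight_dir
    if dz >= 0:
        return None
    t = (ground_z - pz) / dz  # ray parameter at which the ray hits the plane
    return (px + t * dx, py + t * dy, ground_z)

def designated_area(cam_pos, sight_dir, radius=3.0):
    """Example first designated area: a circle on the scene ground
    centred on the sight position, covering both alternatives (the area
    includes the sight position, or any point within a preset distance
    of it). Returned as (centre, radius)."""
    centre = sight_position(cam_pos, sight_dir)
    return None if centre is None else (centre, radius)
```

For a camera 2 units above the ground looking forward and down at 45 degrees, the sight position lands 2 units ahead on the ground.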
A computer-readable storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the steps of the display control method in a game described above.
A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the display control method in a game described above when executing the program.
In the embodiments of the present application, the position of the virtual character in the three-dimensional game scene is controlled in response to a movement control instruction for the virtual character; the sight direction of the virtual character in the three-dimensional game scene is controlled in response to an orientation control instruction for the virtual character; the game picture content displayed in a graphical user interface of the computer device is determined according to the position and the sight direction; and, in response to an included angle between the sight direction and a scene ground of the three-dimensional game scene meeting a preset condition, a scene map of the three-dimensional game scene is displayed in a first designated area of the scene ground, the first designated area being determined on the scene ground according to the position and/or the sight direction of the virtual character. Because the scene map is displayed in a designated area of the scene ground, other areas of the displayed three-dimensional game scene are not occluded, so the graphical user interface is not excessively obscured.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1a is a system schematic diagram of a map display system according to an embodiment of the present application.
Fig. 1b is a first flowchart illustrating a display control method in a game according to an embodiment of the present application.
Fig. 1c is a first schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1d is a schematic diagram of a world coordinate system in a three-dimensional game scene according to an embodiment of the present application.
Fig. 1e is a schematic rotation diagram of a camera model rotation angle provided in this embodiment of the present application.
Fig. 1f is a scene schematic diagram provided in the embodiment of the present application.
Fig. 1g is a second schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1h is a schematic diagram of a scene map provided in the embodiment of the present application.
Fig. 2 is a second flowchart of a display control method in a game according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a display control device in a game according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the present application provide a display control method and device in a game, a storage medium, and computer equipment. Specifically, the display control method in the game according to the embodiments of the present application may be executed by a computer device, which may be a terminal, a server, or another device. The terminal may be a device such as a smartphone, tablet computer, notebook computer, touch screen, game machine, personal computer (PC), or personal digital assistant (PDA), and the terminal device may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
For example, when the in-game display control method is executed on a terminal, the terminal device stores a game application program and presents part of game scenes in the game through a display component. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the display control method in the game is executed on a server, the game may be a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the cloud game running mode, the entity that runs the game application is separated from the entity that presents the game picture: the storage and execution of the display control method in the game are completed on the cloud game server, while the game picture is presented by a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting the game picture; for example, it may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, but the device that executes the display control method in the game is the cloud game server in the cloud. When playing a game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game picture.
Referring to fig. 1a, fig. 1a is a system schematic diagram of a map display system according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to the servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch-sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on the screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and different servers 2000. The network 4000 may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, or a 5G network. In addition, different terminals 1000 may be connected to other terminals or to a server using their own Bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 and connected and synchronized with each other through a suitable network to support multiplayer gaming. In addition, the system may include a plurality of databases 3000 coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play a multiplayer game online.
The embodiment of the application provides a display control method in a game, which can be executed by a terminal or a server. The embodiment of the present application will be described with an example in which a display control method in a game is executed by a terminal. The terminal comprises a display component and a processor, wherein the display component is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the display component. When the user operates the graphical user interface through the display component, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operation instruction generated by the user acting on the graphical user interface comprises an instruction for starting a game application, and the processor is configured to start the game application after receiving the instruction provided by the user for starting the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed at a plurality of points on the screen at the same time. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role playing game, a strategy game, a sports game, a game for developing intelligence, a First Person Shooter (FPS) game, and the like. 
Wherein the game may include a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Additionally, one or more obstacles, such as railings, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual objects, e.g., to limit movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve the goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
It should be noted that the system schematic diagram of the map display system shown in fig. 1a is merely an example. The map display system and scenario described in the embodiments of the present application are intended to illustrate the technical solutions more clearly and do not constitute a limitation on them; as those skilled in the art will appreciate, with the evolution of map display systems and the emergence of new service scenarios, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
In the present embodiment, the description is given from the perspective of the in-game display control apparatus, which may be integrated in a computer device that has a storage unit, is equipped with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a first flowchart illustrating a display control method in a game according to an embodiment of the present application. The display control method in the game comprises the following steps:
101. In response to a movement control instruction for the virtual character, control the position of the virtual character in the three-dimensional game scene.
When a user runs a first-person shooter (FPS) game on a computer device, the three-dimensional game scene included in the game and the user-controlled virtual character within that scene are displayed on a graphical user interface of the computer device. The three-dimensional game scene is the game scene provided by the application program when it runs on the computer device; it may be a simulated environment of the real world, a semi-simulated and semi-fictional scene, or a purely fictional scene. When the application program runs on the computer device, the game picture content of the three-dimensional game scene can be displayed on the graphical user interface, and the displayed game picture content is the content presented when the virtual character observes the three-dimensional game scene. The virtual character observes the three-dimensional game scene through a camera model. Taking an FPS game as an example, when the virtual character is at a first-person view angle, the camera model is located at the head or neck of the virtual character; when the virtual character is at a third-person view angle, the camera model is located behind the virtual character. The displayed game picture content therefore differs according to the view angle of the virtual character. In addition, when the position and/or the sight direction of the virtual character in the three-dimensional game scene changes, the game picture content changes correspondingly.
Specifically, please refer to fig. 1c, in which fig. 1c is a first schematic diagram of a graphical user interface provided in an embodiment of the present application. The graphical user interface is presented by a screen of terminal 1000, and the graphical user interface includes a virtual character 110 manipulated by a user, an aiming identifier 120 for prompting an aiming position of a virtual weapon in the graphical user interface, a movement control 130 for controlling the virtual character 110 to move in a three-dimensional virtual scene, an aiming control 140 that can be used by the virtual character 110 during an attack, an attack control 150 for controlling the virtual character 110 to perform an attack operation in a virtual environment, and a scene resource object 161 (scene ground), a scene resource object 162 (house), a scene resource object 163 (vehicle), and the like corresponding to the three-dimensional virtual environment.
The movement control instruction may be an instruction generated by a user's sliding operation on the movement control 130. Referring to fig. 1d, fig. 1d is a schematic view of a world coordinate system in a three-dimensional game scene according to an embodiment of the present disclosure. The three-dimensional game scene has a world coordinate system constructed from an X axis, a Y axis and a Z axis, so that a virtual character located in the three-dimensional game scene also has corresponding coordinates (X1, Y1, Z1). Specifically, the X axis and the Y axis may constitute the scene ground 20 of the three-dimensional game scene, and the user may control the virtual character to move on the scene ground 20 through the sliding operation on the movement control 130, thereby controlling the position of the virtual character in the three-dimensional game scene.
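As an illustrative sketch (not part of the claimed embodiments), movement on the scene ground formed by the X axis and the Y axis can be modeled as updating only the X/Y components of the character's world coordinates, leaving the Z (height) component untouched; the function name and joystick parameters below are assumptions:

```python
import math

def move_on_ground(position, joystick_dx, joystick_dy, speed, dt):
    """Apply a movement-control input to a character's world coordinates.

    The X and Y axes span the scene ground, so the joystick only changes
    the X/Y components; Z (height) is left unchanged.
    """
    x, y, z = position
    # Normalise the joystick vector so diagonal input is not faster.
    length = math.hypot(joystick_dx, joystick_dy)
    if length > 0:
        joystick_dx, joystick_dy = joystick_dx / length, joystick_dy / length
    return (x + joystick_dx * speed * dt,
            y + joystick_dy * speed * dt,
            z)
```

For example, half a second of full-forward input at speed 2 moves the character one unit along the X axis while keeping its height.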
102. And responding to the orientation control instruction aiming at the virtual character, and controlling the sight direction of the virtual character in the three-dimensional game scene.
The orientation control instruction of the virtual character is used for controlling the line-of-sight direction of the virtual character in the three-dimensional game scene. The line-of-sight direction is controlled by adjusting the orientation of the lens in the camera model. Referring to fig. 1e, fig. 1e is a rotation schematic of the camera model according to an embodiment of the present disclosure. The view direction of the lens 30 in the camera model can be adjusted by rotation about the U axis and the R axis, thereby controlling the line-of-sight direction of the virtual character in the three-dimensional game scene.
Specifically, the orientation control instruction may be an instruction generated by a sliding operation on a display screen of the terminal 1000, and the effect of controlling the direction of the line of sight is achieved by sliding other areas of the graphical user interface except the displayed function control (e.g., the movement control 130).
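The relationship between the camera's rotation angles and the resulting line-of-sight direction can be sketched as follows; the yaw/pitch convention (yaw about the vertical U axis, pitch about the horizontal R axis) and the function name are illustrative assumptions, not taken from the embodiment:

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Unit sight-direction vector from camera rotation angles.

    Pitch 0 looks parallel to the scene ground; pitch -90 looks
    straight down at it.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```

Sliding the screen would then map finger displacement to yaw/pitch deltas before recomputing the direction vector.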
103. Game picture content displayed in a graphical user interface of a computer device is determined based on the position and gaze direction.
After the position of the virtual character in the three-dimensional game scene and the sight line direction of the virtual character in the three-dimensional game scene are determined, game picture content which is displayed when the three-dimensional game scene is observed in the sight line direction of the virtual character at the position of the virtual character can be displayed in the graphical user interface.
Taking fig. 1c as an example, the current position of the virtual character is near the car and the house. When the line of sight of the virtual character is controlled to a direction 30° from north, the car and the house are located within that line of sight, so the part of the three-dimensional game scene containing the car and the house, namely the game picture content, is displayed in the graphical user interface.
104. And displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to an included angle between the sight line direction and the scene ground of the three-dimensional game scene meeting a preset condition.
In the related art, when a scene map needs to be displayed, a map display interface is shown on the graphical user interface, and the scene map is then displayed on that interface. Such a map display interface generally occupies half of the graphical user interface and shields that area, so the graphical user interface is excessively shielded while the scene map is displayed: the user cannot see most of the game picture content displayed in the graphical user interface, and therefore cannot react to events occurring in the current game scene while viewing the map.
In this embodiment, the scene map may be displayed in the first designated area of the scene ground, so that only the first designated area of the scene ground may be shielded, and other areas of the graphical user interface may not be shielded, thereby preventing the graphical user interface from being excessively shielded, allowing the user to see most of the game picture content displayed in the graphical user interface, and allowing the user to react to an event occurring in the current game scene when viewing the map.
For example, when an angle between the visual line direction of the virtual character and the scene ground of the three-dimensional game scene satisfies a predetermined condition, a scene map of the three-dimensional game scene may be displayed in a first designated area of the scene ground. The preset condition can be preset by the computer equipment according to a certain rule, for example, the preset condition can be that the included angle between the sight line direction of the virtual character and the scene ground of the three-dimensional game scene is between 20 degrees and 90 degrees; alternatively, the predetermined condition may be that the angle between the direction of the virtual character's line of sight and the scene floor displayed on the graphical user interface is between 20 ° and 90 °. Specifically, the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight line direction of the virtual character. For example, the first designated area may be an area including a gaze location where a gaze direction of the virtual character intersects the scene ground. The first designated area may also be an area including an arbitrary position within a preset distance range from the sight line position. The first designated area may also be any area in the scene floor. Wherein the preset distance range can be preset by the computer device according to a certain rule.
Referring to fig. 1f, fig. 1f is a schematic view of a scene according to an embodiment of the present disclosure. The virtual character is located at a certain position in a three-dimensional game scene constructed by an X axis, a Y axis and a Z axis, the sight line direction is an alpha angle which is smaller than 90 degrees with the scene ground constructed by the X axis and the Y axis, and the scene map can be displayed in a first appointed area of the scene ground because the alpha angle is smaller than 90 degrees.
Specifically, referring to fig. 1c and fig. 1g together, when the user controls the virtual character 110 in fig. 1c to lower its head, the graphical user interface displayed by the computer device switches from that of fig. 1c to that of fig. 1g, i.e., the complete scene resource object 162 and the complete scene resource object 163 are displayed, and the position aligned by the aiming identifier 120 shifts downward. Assuming that the angle between the line-of-sight direction of the virtual character 110 in fig. 1g and the scene ground 161 is less than 90°, a scene map of the three-dimensional game scene may be displayed in a first designated area of the scene ground 161, for example the scene map 170 in fig. 1g.
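The check that the angle between the line-of-sight direction and the scene ground satisfies the predetermined condition (for example, between 20° and 90°) can be sketched as follows; the vector convention and the default thresholds are illustrative, not the claimed parameters:

```python
import math

def ground_angle_deg(direction):
    """Downward angle between a sight-direction vector and the X-Y scene ground."""
    dx, dy, dz = direction
    horiz = math.hypot(dx, dy)
    # Positive when the character looks down toward the ground.
    return math.degrees(math.atan2(-dz, horiz))

def should_show_map(direction, lo=20.0, hi=90.0):
    """Predetermined condition: downward angle within [lo, hi] degrees."""
    return lo <= ground_angle_deg(direction) <= hi
```

Looking down at 45° satisfies the condition; looking parallel to the ground (angle 0°) does not.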
In some embodiments, when the angle between the line-of-sight direction of the virtual character and the scene ground is small, for example less than 30°, the sight line position where the line-of-sight direction intersects the scene ground is far from the virtual character, and a scene map displayed in an area containing that position might not be seen clearly by the user; the scene map may therefore instead be displayed in an area within a preset distance of the virtual character. The preset distance can be set by the computer device according to a certain rule.
In other embodiments, when the scene map is located far from the virtual character in the three-dimensional game scene, for example at a position where the angle between the virtual character's line of sight and the scene ground is β, the user may be unable to see the scene map clearly. The predetermined condition may therefore be narrowed to an angle between 45° and 90° between the line-of-sight direction and the scene ground of the three-dimensional game scene, so as to avoid displaying the scene map at a position far from the virtual character.
It should be noted that the first designated area for displaying the scene map may be displayed on the graphical user interface, so that the user can see the scene map.
In the embodiment of the application, the position of the virtual character in the three-dimensional game scene can be controlled by responding to the movement control instruction aiming at the virtual character; responding to an orientation control instruction aiming at the virtual character, and controlling the sight direction of the virtual character in the three-dimensional game scene; determining game picture content displayed in a graphical user interface of the computer device according to the position and the sight direction; and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to an included angle between the sight line direction and the scene ground of the three-dimensional game scene meeting a preset condition, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight line direction of the virtual character. Therefore, the scene map is displayed in the designated area of the scene ground, and other areas in the displayed three-dimensional game scene can be prevented from being shielded, so that the graphical user interface is not shielded too much.
In some embodiments, displaying a scene map of a three-dimensional game scene in a first designated area of a scene floor in response to an angle between a gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the fact that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition and no interactive virtual object exists in the first designated area of the scene ground.
The interactive virtual object can be an item that keeps the virtual character alive, such as a first-aid kit; an item for the virtual character to attack with, such as a weapon or ammunition; or an item that assists an attack, such as an 8x scope.
For example, when a graphical user interface of a first-person shooter game application (e.g., a battle-royale game) is displayed, interactive virtual objects are typically placed at intervals on the ground, on houses, and inside houses, and the user may control the virtual character to pick up such virtual objects.
In this embodiment, when determining the first designated area on the scene ground in which the scene map is to be displayed, an area in which an interactive virtual object exists should not be chosen as the first designated area, so as to prevent the scene map from being displayed on top of the interactive virtual object and thereby preventing the user from picking up a needed interactive virtual object in time.
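The avoidance of areas containing interactive virtual objects when choosing the first designated area can be sketched as a simple rectangle-containment test; the area representation and function names are illustrative assumptions:

```python
def pick_designated_area(candidate_areas, object_positions):
    """Return the first candidate area containing no interactive object.

    Areas are axis-aligned rectangles (xmin, ymin, xmax, ymax) on the
    scene ground; object_positions are (x, y) pickup locations.
    """
    def contains(area, p):
        xmin, ymin, xmax, ymax = area
        return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax

    for area in candidate_areas:
        if not any(contains(area, p) for p in object_positions):
            return area
    return None  # no clear area: fall back to not displaying the map here
```

A candidate area overlapping a dropped item is skipped in favor of the next clear area.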
In some embodiments, displaying a scene map of a three-dimensional game scene in a first designated area of a scene floor in response to an angle between a gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the fact that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition and a scene map display instruction is received.
When the computer equipment is a mobile phone or a tablet computer, a map display control can be arranged on a graphical user interface, and a user can click the map display control through fingers, so that the computer equipment can generate a scene map display instruction; and when the computer equipment runs the game application program, physical keys of the mobile phone or the tablet computer, such as a volume key, a power key or other shoulder keys, can be multiplexed into the map display control, and a user can press the physical keys through fingers, so that a scene map display instruction is generated.
When the computer equipment is a notebook computer or a desktop computer, a map display control can be arranged on the graphical user interface, and a user can control a mouse to click the map display control, so that the computer equipment can generate a scene map display instruction; a map display key may also be provided, for example, designating the M key of the keyboard as a map display button, and the user may press the M key on the keyboard by a finger, so that the computer device may generate a scene map display instruction.
When the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets the preset condition and the map display instruction is received, the scene map of the three-dimensional game scene can be displayed in the first appointed area of the scene ground.
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene floor in response to the angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying the predetermined condition, the method further comprises:
(1) when the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character in the three-dimensional game scene are/is changed, whether an included angle between the changed sight direction and the scene ground of the three-dimensional game scene meets a preset condition or not is detected;
(2) and if so, displaying a scene map of the three-dimensional game scene in a second designated area of the scene ground, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed sight direction of the virtual character.
After the scene map is displayed in the first designated area of the scene ground, the user may, through sliding, keyboard, or mouse operations, control the virtual character to continue moving in the three-dimensional game scene or to adjust the line-of-sight direction. Therefore, when a change in the position of the virtual character in the three-dimensional game scene and/or in its line-of-sight direction is detected, whether the angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene satisfies the predetermined condition is detected, and when it does, the scene map continues to be displayed, now in the second designated area. The second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed line-of-sight direction of the virtual character. For example, the second designated area may be an area containing the sight line position where the changed line-of-sight direction of the virtual character intersects the scene ground; it may also be an area containing any position within a preset distance range of that sight line position, or any area of the scene ground. The preset distance range can be preset by the computer device according to a certain rule. The second designated area may be the same as or different from the first designated area.
Referring to fig. 1f, when the position of the virtual character in the three-dimensional game scene and/or its line-of-sight direction changes, for example from the direction at angle α to the direction at angle γ, a scene map of the three-dimensional game scene may be displayed in a second designated area of the scene ground when angle γ satisfies the predetermined condition. The second designated area may differ from the first designated area. For example, the first designated area may be a first area centered on the sight line position where the direction at angle α intersects the scene ground; the first area may be determined according to the size of the scene ground and the size of the scene map to be displayed, its size being smaller than the size of the scene ground displayed on the graphical user interface and greater than or equal to the size of the scene map to be displayed. Similarly, the second designated area may be a second area centered on the sight line position where the direction at angle γ intersects the scene ground, determined in the same way and subject to the same size constraints.
In some embodiments, in order to reduce the extent to which the scene map shields the graphical user interface, the display size of the scene map may be controlled to be small, with the first designated area just large enough to accommodate the scene map.
In some embodiments, when the size of the first area or the second area is smaller than the size of the scene map to be displayed, the size of the scene map to be displayed may be adjusted, so that the size of the adjusted scene map to be displayed is smaller than or equal to the size of the first area or the second area.
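The adjustment of the scene map so that it fits within the first or second area can be sketched as a uniform down-scaling; the function below is an illustrative assumption rather than the claimed implementation:

```python
def fit_map_size(map_w, map_h, area_w, area_h):
    """Uniformly scale the map down so it fits inside the designated area.

    Never scales up (factor capped at 1.0), preserving the aspect ratio.
    """
    scale = min(1.0, area_w / map_w, area_h / map_h)
    return map_w * scale, map_h * scale
```

A 4x2 map placed in a 2x2 area becomes 2x1; a map that already fits is left unchanged.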
In some embodiments, the method further comprises:
and if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene does not meet the preset condition, canceling to display the scene map of the three-dimensional game scene.
When the angle between the changed line-of-sight direction and the scene ground of the three-dimensional game scene does not satisfy the predetermined condition, the display of the scene map of the three-dimensional game scene may be cancelled. For example, assuming the predetermined condition is that the angle between the line-of-sight direction of the virtual character and the scene ground is between 20° and 90°, when the changed angle is not in that interval, or there is no intersection between the changed line-of-sight direction and the scene ground displayed on the graphical user interface, it may be determined that the predetermined condition is not satisfied. For example, if the user adjusts the line-of-sight direction so that the virtual character raises its head to look at the sky, the angle no longer satisfies the predetermined condition, and the display of the scene map is cancelled by controlling the virtual character to raise its head.
In some embodiments, displaying a scene map of a three-dimensional game scene in a first designated area of a scene floor in response to an angle between a gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
(1) responding that an included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition, and acquiring a target included angle between the sight line direction and the scene ground;
(2) determining the size information of the scene map according to the target included angle;
(3) and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground according to the size information.
The size information of the scene map displayed on the scene ground can be adaptively adjusted according to the target angle between the line-of-sight direction and the scene ground. For example, in fig. 1f, angle α is 45°, angle β is 30°, and angle γ is 60°. Taking the size information of the scene map displayable at 45° as a reference, the map displayed at angle α is a normal map without resizing. Because angle β is less than 45°, displaying the scene map at its normal size would make it hard to see clearly, so the scene map can be stretched transversely and longitudinally, i.e., enlarged, so that the user can see it clearly. Because angle γ is greater than 45°, displaying the scene map at its normal size would waste interface area, so the scene map can be reduced transversely and longitudinally. The above is merely an example; similarly, a mapping relationship between the angle and the size information of the scene map may be set, so that the size information corresponding to different angles is determined according to the mapping relationship. The specific size-adjustment means is not limited here.
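One possible mapping relationship between the target angle and the map's display scale, using 45° as the reference angle as in the example above, could be sketched as follows; the inverse-proportional rule is an assumption for illustration, not the claimed mapping:

```python
def map_scale_for_angle(angle_deg, reference_deg=45.0):
    """Scale factor for the displayed scene map, 1.0 at the reference angle.

    Shallow angles (e.g. 30 degrees) enlarge the map so it stays legible
    when viewed obliquely; steep angles (e.g. 60 degrees) shrink it to
    avoid wasting interface area.
    """
    return reference_deg / angle_deg
```

So a 30° angle enlarges the map by 1.5x, while a 60° angle reduces it to 0.75x.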
In some embodiments, displaying a scene map of a three-dimensional game scene in a first designated area of a scene floor in response to an angle between a gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
(1) acquiring parameter information facing to the control instruction, wherein the parameter information comprises at least one of control speed, control direction and residence time after control is finished;
(2) and when the parameter information meets the preset condition, responding that an included angle between the sight line direction and the scene ground of the three-dimensional game scene meets the preset condition, and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground.
The effect of automatically displaying the scene map on the scene ground can be achieved by detecting the parameter information of the orientation control instruction. The parameter information includes at least one of a control speed, a control direction, and a stay time after the control is completed. Taking stay time as an example, when the stay time after the control is completed exceeds a preset duration, for example 1 s, the virtual character has looked at the scene ground for a relatively long time, so it can be determined that the user needs to view the scene map, and the scene map of the three-dimensional game scene can be automatically displayed in the first designated area of the scene ground.
Specifically, the orientation control command may be a command generated according to a corresponding sliding operation, and the parameter information of the orientation control command may include at least one of a sliding speed, a sliding direction, and a staying time after the sliding is completed.
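The automatic display based on the stay time after the orientation control is completed can be sketched as a small trigger object; the 1 s threshold follows the example above, while the class and method names are assumptions:

```python
class DwellTrigger:
    """Show the map once the orientation swipe has finished and the line
    of sight has stayed on the scene ground for `threshold` seconds."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.rest_since = None  # timestamp when the swipe ended

    def update(self, swiping, angle_ok, now):
        """Call once per frame; returns True when the map should appear."""
        if swiping or not angle_ok:
            self.rest_since = None  # reset: gaze moved or left the ground
            return False
        if self.rest_since is None:
            self.rest_since = now
        return now - self.rest_since >= self.threshold
```

The trigger resets whenever the user keeps swiping or looks away from the ground, so only an uninterrupted dwell fires it.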
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene floor in response to the angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying the predetermined condition, the method further comprises:
and in response to a closing instruction aiming at the scene map, canceling the display of the scene map of the three-dimensional game scene.
When the computer device is a mobile phone or a tablet computer, a map closing control can be provided on the displayed scene map, and the user can tap the map closing control with a finger so that the computer device generates a closing instruction for the scene map; when the computer device runs the game application program, a physical key of the mobile phone or tablet computer, such as a volume key, a power key, or a shoulder key, can also be multiplexed as the map closing control, and the user can press that physical key to generate the closing instruction for the scene map.
When the computer device is a notebook or desktop computer, a map closing control can be provided on the graphical user interface, and the user can click the map closing control with the mouse so that the computer device generates a closing instruction for the scene map; a map closing key may also be provided, for example designating the L key of the keyboard as the map closing key, and the user can press the L key so that the computer device generates the closing instruction for the scene map. In response to the closing instruction for the scene map, the display of the scene map of the three-dimensional game scene can be cancelled.
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene floor in response to the angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying the predetermined condition, the method further comprises:
(1) responding to a click instruction aiming at the scene map, and acquiring a click position of the click instruction in the scene map;
(2) marking the click position in the scene map and generating prompt information;
(3) and displaying the mark, and broadcasting and/or displaying prompt information.
By marking the scene map, a user can prompt himself or his teammates that a specific situation exists at the marked position.
Specifically, different marking modes exist for different marking operations. The first marking operation is a manner in which a user clicks a scene map by a mouse or clicks the scene map by a finger. When a user clicks the scene map through a mouse or clicks a certain position of the scene map through a finger, the click position can be marked and prompt information can be generated, so that the mark is displayed, and the prompt information is broadcasted and/or displayed. For example, when the click position is the position of an enemy, the information of 'enemy exists here' can be broadcasted and/or displayed; when the clicking position is the position of the teammate, the information of the 'protection target' can be broadcasted and/or displayed; when the click position is a blank position, information of "attack here" can be broadcasted and/or displayed.
As shown in fig. 1h, fig. 1h is a schematic diagram of a scene map provided in the embodiment of the present application. The scene map 170 includes a position display identifier 171 for displaying the virtual character manipulated by the user and a mark 172 identified by the user. After any player in the same team marks, other players in the same team can see the mark in the scene map.
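The generation of prompt information according to what is located at the clicked map position can be sketched as follows; the wording of the prompts follows the examples above, while the data structures and function name are illustrative assumptions:

```python
def prompt_for_mark(click_pos, enemy_positions, teammate_positions):
    """Broadcast text for a map mark, chosen by what is at the click position."""
    if click_pos in enemy_positions:
        return "enemy here"
    if click_pos in teammate_positions:
        return "protect target"
    return "attack here"  # blank position: rally the team toward it
```

The returned string would then be broadcast to and/or displayed for every player on the marking player's team.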
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene floor in response to the angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying the predetermined condition, the method further comprises:
(1) responding to a direction control instruction aiming at the virtual character, and controlling the sight direction of the virtual character in the three-dimensional game scene to obtain an adjusted sight direction;
(2) when the drop point of the adjusted sight line direction is on the scene map, marking the position corresponding to the drop point, and generating prompt information based on the drop point;
(3) and displaying the mark, and broadcasting and/or displaying prompt information.
The second type of marking operation is that the user controls the line-of-sight direction of the virtual character through the orientation control instruction, so that the mark is determined based on the drop point of the line-of-sight direction on the scene map. For example, the intersection point of the line-of-sight direction and the scene map is determined as the drop point; that drop point can be taken as the mark the user requires on the scene map, and prompt information is generated, so that the mark is displayed and the prompt information is broadcast and/or displayed.
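The drop point of the line-of-sight direction on the scene map can be computed as a ray-plane intersection, assuming for illustration that the displayed map lies in a plane of constant height; this sketch is not taken from the embodiment:

```python
def sight_drop_point(origin, direction, ground_z=0.0):
    """Intersection of the sight ray with the plane z = ground_z.

    Returns the (x, y) landing point, or None when the ray never reaches
    the plane (looking up or parallel to the ground).
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz >= 0:  # not looking downward
        return None
    t = (ground_z - oz) / dz
    return (ox + t * dx, oy + t * dy)
```

The (x, y) result would then be tested against the map's displayed bounds before placing the mark.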
In some embodiments, after displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the angle between the gaze direction and the scene ground of the three-dimensional game scene satisfying the predetermined condition, the method further comprises:
and responding to the size adjustment operation aiming at the scene map, and adjusting the display size of the scene map.
After the scene map of the three-dimensional game scene is displayed in the first designated area of the scene ground, the user can adjust the size of the scene map as needed. The resizing operation may be a sliding operation of at least two fingers, the size of the scene map being determined from the distance between the two fingers.
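The two-finger resizing operation can be sketched as computing the ratio between the finger distance at the end and at the start of the pinch gesture; the function is an illustrative assumption (math.dist requires Python 3.8+):

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end):
    """Scale factor for the scene map from a two-finger pinch gesture.

    The map grows when the fingers move apart and shrinks when they
    move together, in proportion to the change in finger distance.
    """
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_end, p2_end)
    return d1 / d0 if d0 > 0 else 1.0
```

Doubling the finger separation doubles the displayed map size; the result would typically be clamped so the map still fits its designated area.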
Referring to fig. 2, fig. 2 is a second flowchart illustrating a display control method in a game according to an embodiment of the present application. The method flow can comprise the following steps:
201. and responding to the movement control instruction aiming at the virtual character, and controlling the position of the virtual character in the three-dimensional game scene.
202. And responding to the orientation control instruction aiming at the virtual character, and controlling the sight direction of the virtual character in the three-dimensional game scene.
203. Game picture content displayed in a graphical user interface of a computer device is determined based on the position and gaze direction.
204. And displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the fact that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition and no interactive virtual object exists in the first designated area of the scene ground.
205. When the position of the virtual character in the three-dimensional game scene and/or the sight line direction of the virtual character in the three-dimensional game scene are/is changed, whether an included angle between the changed sight line direction and the scene ground of the three-dimensional game scene meets a preset condition or not is detected.
206. And if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene meets a preset condition, displaying a scene map of the three-dimensional game scene in a second specified area of the scene ground.
207. And if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene does not meet the preset condition, canceling to display the scene map of the three-dimensional game scene.
The specific implementation of step 201 to step 207 can refer to the foregoing embodiments, and will not be described herein.
In order to better implement the display control method in the game provided by the embodiment of the application, the embodiment of the application also provides a device based on the display control method in the game. The meaning of the noun is the same as that in the display control method in the game, and the details of the specific implementation can be referred to the description in the method embodiment.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an in-game display control apparatus according to an embodiment of the present application. The apparatus may include a first control module 301, a second control module 302, a determining module 303, a first display module 304, and the like.
In some embodiments, the first display module 304 includes:
a first display sub-module, configured to display a scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene satisfying a predetermined condition and no interactive virtual object existing in the first designated area.
In some embodiments, the first display module 304 includes:
and the second display submodule is used for responding to the fact that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition and receiving a scene map display instruction, and displaying the scene map of the three-dimensional game scene in the first designated area of the scene ground.
In some embodiments, the apparatus further comprises:
a detection module, configured to detect, when the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character change, whether the included angle between the changed sight direction and the scene ground of the three-dimensional game scene satisfies a predetermined condition;
a second display module, configured to display a scene map of the three-dimensional game scene in a second designated area of the scene ground if the included angle between the changed sight direction and the scene ground satisfies the predetermined condition, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed sight direction of the virtual character.
In some embodiments, the apparatus further comprises:
a first canceling module, configured to cancel the display of the scene map of the three-dimensional game scene if the included angle between the changed sight direction and the scene ground of the three-dimensional game scene does not satisfy the predetermined condition.
In some embodiments, the first display module 304 includes:
a first acquisition sub-module, configured to acquire, in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene satisfying a predetermined condition, a target included angle between the sight direction and the scene ground;
a determining sub-module, configured to determine size information of the scene map according to the target included angle;
a third display sub-module, configured to display the scene map of the three-dimensional game scene in the first designated area of the scene ground according to the size information.
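The patent leaves the mapping from the target included angle to the map's size information open. The sketch below shows one plausible choice, a linear interpolation in which a steeper downward gaze yields a larger map; all numeric ranges are invented for illustration:

```python
def map_size_from_angle(target_angle_deg, min_size=2.0, max_size=6.0,
                        min_angle=30.0, max_angle=90.0):
    """Linearly interpolate the map's ground-plane size from the gaze/ground angle."""
    t = (target_angle_deg - min_angle) / (max_angle - min_angle)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return min_size + t * (max_size - min_size)

print(map_size_from_angle(90.0))  # steepest gaze -> largest map: 6.0
print(map_size_from_angle(30.0))  # threshold angle -> smallest map: 2.0
```

Any monotonic mapping would serve equally well; the point is only that the size information is derived from the target included angle before the map is drawn.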
In some embodiments, the first display module 304 includes:
a second acquisition sub-module, configured to acquire parameter information of the orientation control instruction, the parameter information including at least one of a control speed, a control direction, and a dwell time after the control ends;
a fourth display sub-module, configured to, when the parameter information satisfies a preset condition, display the scene map of the three-dimensional game scene in the first designated area of the scene ground in response to the included angle between the sight direction and the scene ground satisfying the predetermined condition.
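Of the listed parameters, the dwell time after the control ends is the simplest to illustrate. A minimal sketch of such a gate, with the 0.5-second threshold and the timestamp convention chosen arbitrarily for illustration:

```python
class OrientationGate:
    """Allow the scene map only after the camera has been still for `dwell_s`
    seconds following the last orientation-control input (timestamps in seconds)."""

    def __init__(self, dwell_s=0.5):
        self.dwell_s = dwell_s
        self.last_input = None

    def on_orientation_input(self, now):
        # Called whenever an orientation control instruction arrives.
        self.last_input = now

    def dwell_satisfied(self, now):
        # True once the dwell time after the last control input has elapsed.
        return self.last_input is not None and now - self.last_input >= self.dwell_s

gate = OrientationGate(dwell_s=0.5)
gate.on_orientation_input(now=10.0)
print(gate.dwell_satisfied(now=10.2))  # still turning -> False
print(gate.dwell_satisfied(now=10.6))  # held still long enough -> True
```

Gating on dwell time prevents the map from flickering into view while the player is merely sweeping the camera past the ground.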
In some embodiments, the apparatus further comprises:
a second canceling module, configured to cancel the display of the scene map of the three-dimensional game scene in response to a closing instruction for the scene map.
In some embodiments, the apparatus further comprises:
an acquisition module, configured to acquire, in response to a click instruction for the scene map, the click position of the click instruction in the scene map;
a first marking module, configured to mark the click position in the scene map and generate prompt information;
a third display module, configured to display the mark and broadcast and/or display the prompt information.
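As a hypothetical sketch of the click-marking flow described by these modules (the normalized-coordinate convention, the field names, and the prompt wording are all invented for illustration, not taken from the patent):

```python
def handle_map_click(click_uv, map_origin, map_size):
    """Convert a normalized click position on the displayed map into scene
    coordinates, record a marker, and build a prompt string for broadcast."""
    u, v = click_uv
    x = map_origin[0] + u * map_size
    z = map_origin[1] + v * map_size
    marker = {"scene_pos": (x, z), "map_uv": (u, v)}
    prompt = f"Marked position ({x:.1f}, {z:.1f}) on the scene map"
    return marker, prompt

marker, prompt = handle_map_click((0.5, 0.25), map_origin=(100.0, 200.0), map_size=40.0)
print(prompt)  # Marked position (120.0, 210.0) on the scene map
```

The returned marker would then be rendered on the map while the prompt is broadcast and/or displayed, as the third display module describes.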
In some embodiments, the apparatus further comprises:
a control module, configured to control, in response to an orientation control instruction for the virtual character, the sight direction of the virtual character in the three-dimensional game scene to obtain an adjusted sight direction;
a second marking module, configured to, when the landing point of the adjusted sight direction falls on the scene map, mark the position corresponding to the landing point and generate prompt information based on the landing point;
a fourth display module, configured to display the mark and broadcast and/or display the prompt information.
In some embodiments, the apparatus further comprises:
an adjusting module, configured to adjust the display size of the scene map in response to a size adjustment operation for the scene map.
In some embodiments, the first designated area is an area including the gaze position where the sight direction intersects the scene ground; alternatively, the first designated area is an area including any position whose distance from the gaze position is within a preset distance range.
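Both variants depend on the gaze position, i.e. the point where the sight ray meets the ground. A geometric sketch, assuming a y-up world with a horizontal ground plane; the coordinate conventions and the 3-unit radius are illustrative only:

```python
def gaze_ground_point(eye, sight_dir, ground_y=0.0):
    """Intersection of the sight ray with the horizontal plane y = ground_y.
    Returns None when the ray does not hit the ground ahead of the eye."""
    ex, ey, ez = eye
    dx, dy, dz = sight_dir
    if dy >= 0:  # looking level or upward: no intersection ahead
        return None
    t = (ground_y - ey) / dy
    return (ex + t * dx, ground_y, ez + t * dz)

def in_first_designated_area(point, gaze_point, radius=3.0):
    """Second variant: any position within `radius` of the gaze position."""
    px, _, pz = point
    gx, _, gz = gaze_point
    return (px - gx) ** 2 + (pz - gz) ** 2 <= radius ** 2

hit = gaze_ground_point(eye=(0.0, 1.7, 0.0), sight_dir=(0.0, -1.0, 1.0))
print(hit)  # (0.0, 0.0, 1.7)
```

The first variant simply centers the designated area on `hit`; the second accepts any candidate area whose position passes `in_first_designated_area`.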
As can be seen from the above, in the embodiment of the present application, the first control module 301 controls the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; the second control module 302 controls the sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character; the determining module 303 determines the game picture content displayed in the graphical user interface of the computer device according to the position and the sight direction; and the first display module 304 displays a scene map of the three-dimensional game scene in a first designated area of the scene ground in response to the included angle between the sight direction and the scene ground satisfying a predetermined condition, the first designated area being determined on the scene ground according to the position and/or the sight direction of the virtual character. Displaying the scene map in a designated area of the scene ground thus avoids occluding other areas of the displayed three-dimensional game scene, so that the graphical user interface is not excessively blocked.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Correspondingly, an embodiment of the present application further provides a computer device. The computer device may be a terminal or a server, where the terminal may be a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a personal computer (PC), a personal digital assistant (PDA), or the like. As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the configuration illustrated in the figure does not constitute a limitation on the computer device, which may include more or fewer components than those illustrated, combine certain components, or adopt a different arrangement of components.
The processor 401 is the control center of the computer device 400. It connects the various parts of the entire computer device 400 using various interfaces and lines, and performs the various functions of the computer device 400 and processes data by running or loading the software programs and/or modules stored in the memory 402 and calling the data stored in the memory 402, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to the processes of one or more application programs into the memory 402, and runs the application programs stored in the memory 402, thereby implementing the following functions:
responding to a movement control instruction for the virtual character, controlling the position of the virtual character in the three-dimensional game scene;
responding to an orientation control instruction for the virtual character, controlling the sight direction of the virtual character in the three-dimensional game scene;
determining the game picture content displayed in a graphical user interface of the computer device according to the position and the sight direction;
and in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene satisfying a predetermined condition, displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight direction of the virtual character.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 4, the computer device 400 further includes: a touch display screen 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407. Those skilled in the art will appreciate that the configuration illustrated in fig. 4 does not constitute a limitation on the computer device, which may include more or fewer components than those illustrated, combine certain components, or adopt a different arrangement of components.
The touch display screen 403 may be used to display a graphical user interface and to receive operation instructions generated by the user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions that trigger the corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller. The touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401; it can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to that type.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 403 may also serve as a part of the input unit 406 to implement an input function.
In the embodiment of the present application, a game application executed by the processor 401 generates a graphical user interface on the touch display screen 403, and the virtual environment on the graphical user interface includes scene resource objects. The touch display screen 403 is used to present the graphical user interface and to receive operation instructions generated by the user acting on the graphical user interface.
The radio frequency circuit 404 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another computer device and exchange signals with it.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 405 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data. After being processed by the processor 401, the audio data is sent, for example, to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Optionally, the power source 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, power consumption management, and the like through the power management system. The power supply 407 may also include one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, or any other component.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a Bluetooth module, and the like, which are not described in detail here.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment controls the position of the virtual character in the three-dimensional game scene in response to a movement control instruction for the virtual character; controls the sight direction of the virtual character in the three-dimensional game scene in response to an orientation control instruction for the virtual character; determines the game picture content displayed in the graphical user interface of the computer device according to the position and the sight direction; and, in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene satisfying a predetermined condition, displays a scene map of the three-dimensional game scene in a first designated area of the scene ground, the first designated area being determined on the scene ground according to the position and/or the sight direction of the virtual character. Displaying the scene map in a designated area of the scene ground thus avoids occluding other areas of the displayed three-dimensional game scene, so that the graphical user interface is not excessively blocked.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions, or by related hardware controlled by instructions, and the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium storing a plurality of computer programs that can be loaded by a processor to execute the steps of any in-game display control method provided in the embodiments of the present application. For example, the computer program may perform the following steps:
responding to a movement control instruction for the virtual character, controlling the position of the virtual character in the three-dimensional game scene; responding to an orientation control instruction for the virtual character, controlling the sight direction of the virtual character in the three-dimensional game scene; determining the game picture content displayed in a graphical user interface of the computer device according to the position and the sight direction; and in response to the included angle between the sight direction and the scene ground of the three-dimensional game scene satisfying a predetermined condition, displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight direction of the virtual character.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
The storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
Since the computer programs stored in the storage medium can execute the steps of any in-game display control method provided in the embodiments of the present application, they can achieve the beneficial effects achievable by any such method; see the foregoing embodiments for details, which are not repeated here.
The in-game display control method, apparatus, storage medium, and computer device provided in the embodiments of the present application have been described in detail above. Specific examples have been used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (15)

1. A display control method in a game, wherein the game includes a three-dimensional game scene and a virtual character located in the three-dimensional game scene, the method comprising:
responding to a movement control instruction aiming at the virtual character, and controlling the position of the virtual character in the three-dimensional game scene;
responding to an orientation control instruction aiming at the virtual character, and controlling the sight direction of the virtual character in the three-dimensional game scene;
determining game picture content displayed in a graphical user interface of the computer device according to the position and the sight direction;
and in response to an included angle between the sight line direction and a scene ground of the three-dimensional game scene meeting a preset condition, displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight line direction of the virtual character.
2. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of a scene floor in response to an angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
and in response to the included angle between the sight line direction and the scene ground of the three-dimensional game scene meeting a preset condition and no interactive virtual object existing in the first designated area of the scene ground, displaying a scene map of the three-dimensional game scene in the first designated area of the scene ground.
3. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of a scene floor in response to an angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
and responding to the fact that the included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition and receiving a scene map display instruction, and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground.
4. The method of claim 1, wherein, in response to the angle between the gaze direction and a scene floor of the three-dimensional gaming scene satisfying a predetermined condition, after displaying a scene map of the three-dimensional gaming scene at a first designated area of the scene floor, further comprising:
when the position of the virtual character in the three-dimensional game scene and/or the sight direction of the virtual character in the three-dimensional game scene are changed, detecting whether an included angle between the changed sight direction and the scene ground of the three-dimensional game scene meets a preset condition;
if so, displaying a scene map of the three-dimensional game scene in a second designated area of the scene ground, wherein the second designated area is determined on the scene ground according to the changed position of the virtual character and/or the changed sight line direction of the virtual character.
5. The method of claim 4, further comprising:
and if the included angle between the changed sight line direction and the scene ground of the three-dimensional game scene does not meet the preset condition, canceling to display the scene map of the three-dimensional game scene.
6. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of a scene floor in response to an angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
in response to the included angle between the sight line direction and the scene ground of the three-dimensional game scene meeting a preset condition, acquiring a target included angle between the sight line direction and the scene ground;
determining the size information of the scene map according to the target included angle;
and displaying a scene map of the three-dimensional game scene in a first designated area on the scene ground according to the size information.
7. The method of claim 1, wherein displaying a scene map of the three-dimensional game scene in a first designated area of a scene floor in response to an angle between the gaze direction and the scene floor of the three-dimensional game scene satisfying a predetermined condition comprises:
acquiring parameter information of the orientation control instruction, wherein the parameter information comprises at least one of control speed, control direction and residence time after control is finished;
and when the parameter information meets the preset condition, in response to the included angle between the sight line direction and the scene ground of the three-dimensional game scene meeting a preset condition, displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground.
8. The method of claim 1, wherein, in response to the angle between the gaze direction and a scene floor of the three-dimensional gaming scene satisfying a predetermined condition, after displaying a scene map of the three-dimensional gaming scene at a first designated area of the scene floor, further comprising:
and in response to a closing instruction aiming at the scene map, canceling the display of the scene map of the three-dimensional game scene.
9. The method of claim 1, wherein, in response to the angle between the gaze direction and a scene floor of the three-dimensional gaming scene satisfying a predetermined condition, after displaying a scene map of the three-dimensional gaming scene at a first designated area of the scene floor, further comprising:
responding to a click command aiming at the scene map, and acquiring a click position of the click command in the scene map;
marking the click position in the scene map and generating prompt information;
and displaying the mark, and broadcasting and/or displaying the prompt message.
10. The method of claim 1, wherein, in response to the angle between the gaze direction and a scene floor of the three-dimensional gaming scene satisfying a predetermined condition, after displaying a scene map of the three-dimensional gaming scene at a first designated area of the scene floor, further comprising:
responding to a direction control instruction aiming at the virtual character, and controlling the sight direction of the virtual character in the three-dimensional game scene to obtain an adjusted sight direction;
when the drop point of the adjusted sight line direction is located on the scene map, marking the position corresponding to the drop point, and generating prompt information based on the drop point;
and displaying the mark, and broadcasting and/or displaying the prompt message.
11. The method of claim 1, wherein, in response to the angle between the gaze direction and a scene floor of the three-dimensional gaming scene satisfying a predetermined condition, after displaying a scene map of the three-dimensional gaming scene at a first designated area of the scene floor, further comprising:
and responding to the size adjustment operation aiming at the scene map, and adjusting the display size of the scene map.
12. The method of any one of claims 1 to 11, wherein the first designated area is an area including a gaze position where the sight line direction intersects the scene ground; or the first designated area is an area including any position whose distance from the gaze position is within a preset distance range.
13. A display control apparatus in a game including a three-dimensional game scene and virtual characters located in the three-dimensional game scene, the apparatus comprising:
the first control module is used for responding to a movement control instruction aiming at the virtual character and controlling the position of the virtual character in the three-dimensional game scene;
the second control module is used for responding to an orientation control instruction aiming at the virtual character and controlling the sight direction of the virtual character in the three-dimensional game scene;
the determining module is used for determining game picture content displayed in a graphical user interface of the computer equipment according to the position and the sight direction;
and the first display module is used for responding that an included angle between the sight line direction and the scene ground of the three-dimensional game scene meets a preset condition, and displaying a scene map of the three-dimensional game scene in a first designated area of the scene ground, wherein the first designated area is determined on the scene ground according to the position of the virtual character and/or the sight line direction of the virtual character.
14. A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the in-game display control method according to any one of claims 1 to 12.
15. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps in the in-game display control method according to any one of claims 1 to 12 when executing the program.
CN202110736323.7A 2021-06-30 2021-06-30 Display control method and device in game, storage medium and computer equipment Active CN113426124B (en)

Publications (2)

Publication Number Publication Date
CN113426124A 2021-09-24
CN113426124B 2024-03-12


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113797543A (en) * 2021-09-27 2021-12-17 腾讯科技(深圳)有限公司 Game processing method, game processing device, computer device, storage medium, and program product
CN114253401A (en) * 2021-12-27 2022-03-29 郑州捷安高科股份有限公司 Method and device for determining position in virtual scene, electronic equipment and storage medium
WO2023109328A1 (en) * 2021-12-16 2023-06-22 网易(杭州)网络有限公司 Game control method and apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050187015A1 (en) * 2004-02-19 2005-08-25 Nintendo Co., Ltd. Game machine and data storage medium having stored therein game program
JP2014195715A (en) * 2014-06-10 2014-10-16 任天堂株式会社 Game program and game device
CN108854068A (en) * 2018-06-27 2018-11-23 网易(杭州)网络有限公司 Display control method and device, storage medium and terminal in game
US20190076739A1 (en) * 2017-09-12 2019-03-14 Netease (Hangzhou) Network Co.,Ltd. Information processing method, apparatus and computer readable storage medium
CN109692477A (en) * 2019-02-01 2019-04-30 网易(杭州)网络有限公司 A kind of method and apparatus that interface is shown
CN110180168A (en) * 2019-05-31 2019-08-30 网易(杭州)网络有限公司 A kind of display methods and device, storage medium and processor of game picture
CN112827170A (en) * 2021-02-08 2021-05-25 网易(杭州)网络有限公司 Information processing method in game, electronic device and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113797543A (en) * 2021-09-27 2021-12-17 腾讯科技(深圳)有限公司 Game processing method, game processing device, computer device, storage medium, and program product
CN113797543B (en) * 2021-09-27 2023-06-23 腾讯科技(深圳)有限公司 Game processing method, game processing device, computer device, storage medium and program product
WO2023109328A1 (en) * 2021-12-16 2023-06-22 网易(杭州)网络有限公司 Game control method and apparatus
CN114253401A (en) * 2021-12-27 2022-03-29 郑州捷安高科股份有限公司 Method and device for determining position in virtual scene, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN113426124B (en) 2024-03-12

Similar Documents

Publication Publication Date Title
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113082707B (en) Virtual object prompting method and device, storage medium and computer equipment
CN114522423A (en) Virtual object control method and device, storage medium and computer equipment
CN113398566A (en) Game display control method and device, storage medium and computer equipment
US20230271087A1 (en) Method and apparatus for controlling virtual character, device, and storage medium
WO2024051116A1 (en) Control method and apparatus for virtual character, and storage medium and terminal device
WO2024011894A1 (en) Virtual-object control method and apparatus, and storage medium and computer device
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
CN115212572A (en) Control method and device of game props, computer equipment and storage medium
CN115382201A (en) Game control method and device, computer equipment and storage medium
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
US20240131434A1 (en) Method and apparatus for controlling put of virtual resource, computer device, and storage medium
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN116999825A (en) Game control method, game control device, computer equipment and storage medium
CN115970284A (en) Attack method and device of virtual weapon, storage medium and computer equipment
CN116139484A (en) Game function control method, game function control device, storage medium and computer equipment
CN115212566A (en) Virtual object display method and device, computer equipment and storage medium
CN117482516A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN115430150A (en) Game skill release method and device, computer equipment and storage medium
CN115518375A (en) Game word skipping display method and device, computer equipment and storage medium
CN116474367A (en) Virtual lens control method and device, storage medium and computer equipment
CN117160031A (en) Game skill processing method, game skill processing device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant