CN113398566A - Game display control method and device, storage medium and computer equipment


Info

Publication number
CN113398566A
Authority
CN
China
Prior art keywords
virtual
game
virtual scene
user interface
character
Prior art date
Legal status
Pending
Application number
CN202110808611.9A
Other languages
Chinese (zh)
Inventor
罗书翰
林�智
胡志鹏
程龙
刘勇成
袁思思
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110808611.9A
Publication of CN113398566A
Legal status: Pending


Classifications

    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F 13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen, for displaying an additional top view, e.g. radar screens or maps
    • A63F 2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad, using a touch screen
    • A63F 2300/307 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display, for displaying an additional window with a view from the top of the game field, e.g. radar screen
    • A63F 2300/308 Details of the user interface

Abstract

The embodiment of the application discloses a display control method and apparatus for a game, a storage medium and computer equipment. The method comprises the following steps: displaying a first virtual scene picture in the graphical user interface, wherein the first virtual scene picture is a virtual scene picture determined according to the position of the player virtual character in the virtual scene; and in response to a preset game event, providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene. With the scheme of the embodiment of the application, the game user corresponding to the user terminal can learn the situation of other teammates in the same game camp in time and, when a preset game event occurs, can provide help for those teammates promptly.

Description

Game display control method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling display of a game, a storage medium, and a computer device.
Background
With the development and popularization of computer device technology, more and more terminal games have emerged. At present, most games offer a team-based mode of play in which several game users form a game team to fight together. In current team-based games, however, when a teammate is out of a player's sight the player can hardly notice that teammate's situation in time, and the two can only exchange information by voice over a microphone or by signal prompts. Voice communication cannot accurately convey a teammate's combat situation, and a teammate who is in the middle of a fight may find it difficult to operate the interface and send a prompt, so it is hard for players to provide timely help according to a teammate's situation.
Disclosure of Invention
The embodiment of the application provides a display control method and apparatus for a game, a storage medium and computer equipment, which enable a player to learn the combat situation of teammates in a game team in a timely manner.
In a first aspect, an embodiment of the present application provides a display control method for a game,
wherein a graphical user interface of the game is displayed on a user terminal, the game comprises a virtual scene and virtual characters positioned in the virtual scene, the virtual characters comprise a player virtual character and a first virtual character in the same camp as the player virtual character, and the method comprises the following steps:
displaying a first virtual scene picture in the graphical user interface, wherein the first virtual scene picture is a virtual scene picture determined according to the position of the player virtual character in the virtual scene;
and responding to a preset game event, providing an observation window in the graphical user interface, and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
In some embodiments, the second virtual scene picture is a virtual scene picture determined according to the position and orientation of the first virtual character in the virtual scene.
In some embodiments, the providing a viewing window in the graphical user interface and displaying a second virtual scene screen in the viewing window in response to a preset game event includes:
responding to a preset game event, and determining a target virtual character corresponding to the preset game event from the first virtual character;
providing observation windows on the graphical user interface according to the number of the target virtual characters, wherein one observation window corresponds to one target virtual character;
and for each observation window, displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the target virtual character corresponding to the observation window in the virtual scene.
In some embodiments, the providing a viewing window in the graphical user interface and displaying a second virtual scene screen in the viewing window in response to a preset game event includes:
responding to a preset game event, and determining a target virtual character corresponding to the preset game event from the first virtual character;
acquiring the distances between the target virtual characters, and determining two or more target virtual characters whose distances are smaller than a preset threshold as a target virtual character group;
and providing observation windows on the graphical user interface according to the number of the target virtual character groups and the number of target virtual characters that are not grouped into any target virtual character group, wherein one observation window corresponds to one target virtual character or one target virtual character group; for each observation window, a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character or target virtual character group corresponding to the observation window.
In some embodiments, the predetermined game event is that the first virtual character is in a fighting state, and/or a distance between the first virtual character and the player virtual character in the virtual scene is greater than a predetermined distance threshold.
In some embodiments, the preset game event is that a relationship between a game user operating the first virtual character and a game user operating the player virtual character in the game belongs to a preset relationship.
In some embodiments, the preset game event is that the interactive information issued by the first virtual character through the graphical user interface includes preset help seeking information.
In some embodiments, after displaying the first virtual scene screen in the graphical user interface, the method further comprises:
and responding to the preset game event, and marking the first virtual character as a fighting state on a small map of the graphical user interface, wherein the preset game event is that the first virtual character is in the fighting state.
In some embodiments, marking the first virtual character as a battle state on a minimap of the graphical user interface in response to the preset game event comprises:
responding to the preset game event, and determining a marking parameter corresponding to the first virtual character in the fighting state;
and marking the first virtual character as a fighting state according to the marking parameters on a small map of the graphical user interface.
In some embodiments, after providing a viewing window in the graphical user interface in response to a preset game event and displaying a second virtual scene picture in the viewing window, the method further includes:
in response to a virtual item viewing operation triggered by the graphical user interface, displaying a virtual item list on the graphical user interface, the virtual item list including item identifications of virtual items held by the player virtual character;
responding to a drag operation on an item identifier, and determining a first target observation window where the end position of the drag operation is located;
and giving the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window.
In some embodiments, the target virtual item is a virtual drug;
the giving the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window includes:
treating the first virtual character corresponding to the first target observation window based on the target virtual item corresponding to the dragged item identifier.
In some embodiments, after providing a viewing window in the graphical user interface and displaying a second virtual scene picture in the viewing window in response to a preset game event, the method further includes:
and responding to a window closing event corresponding to a second target observation window in the observation windows, and closing the second target observation window on the graphical user interface.
In some embodiments, the window closing event includes that the continuous duration of the first virtual character corresponding to the observation window in the non-engagement state is not lower than a preset duration threshold.
In a second aspect, an embodiment of the present application further provides a display control apparatus for a game, where a graphical user interface of the game is displayed on a user terminal, the game includes a virtual scene and virtual characters located in the virtual scene, and the virtual characters include a player virtual character and a first virtual character in the same camp as the player virtual character, the apparatus including:
a first display module, configured to display a first virtual scene picture in the graphical user interface, where the first virtual scene picture is a virtual scene picture determined according to a position of the player virtual character in the virtual scene;
and the second display module is used for responding to a preset game event, providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the display control method of a game provided in any embodiment of the present application.
In a fourth aspect, an embodiment of the present application further provides a computer device, including a processor and a memory, where the memory has a computer program, and the processor is configured to execute the display control method of the game provided in any embodiment of the present application by calling the computer program.
According to the technical scheme provided by the embodiment of the application, a graphical user interface of a game is displayed on a user terminal, wherein the game comprises a virtual scene and virtual characters located in the virtual scene, and the virtual characters comprise a player virtual character and a first virtual character in the same camp as the player virtual character. A first virtual scene picture is displayed on the graphical user interface, the first virtual scene picture being a virtual scene picture determined according to the position of the player virtual character in the virtual scene. In response to a preset game event, an observation window is provided in the graphical user interface and a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position of the first virtual character in the virtual scene. In this way, the game user corresponding to the user terminal can learn the situation of other teammates in the game team in time, and if a preset game event occurs, the player can provide help to those teammates in a timely manner.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1a is a system schematic diagram of a display control device of a game according to an embodiment of the present application.
Fig. 1b is a first flowchart of a display control method of a game according to an embodiment of the present application.
Fig. 1c is a schematic view of a first game screen of a display control method of a game according to an embodiment of the present application.
Fig. 1d is a schematic view of a second game screen of the display control method of the game according to the embodiment of the present application.
Fig. 1e is a third game screen diagram of a display control method of a game according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a display control device of a game according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without inventive step, are within the scope of the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiment of the present application provides a display control method of a game, and the execution subject of the display control method of the game may be the display control device of the game provided in the embodiment of the present application, or a computer device integrated with the display control device of the game, where the display control device of the game may be implemented in a hardware or software manner. The computer device may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a Personal Computer (PC) or a Personal Digital Assistant (PDA), and the terminal device may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The computer device can be in remote communication with a server of the game, where the server can be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
When the display control method of the game runs on the server, the terminal device acquires game data through remote communication with the server and displays the game data on the graphical user interface; for example, the terminal device stores a game application program and is used for presenting a virtual scene in a game picture. The terminal device is used for interacting with a user through the graphical user interface, for example, by downloading, installing and running the game application program on the terminal device. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways; for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting the graphical user interface, which includes a game screen, and for receiving operation instructions generated by the user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
When the display control method of the game runs on the server, the game can also be a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the running mode of a cloud game, the entity that runs the game application program is separated from the entity that presents the game picture, and the storage and execution of the display control method of the game are completed on the cloud server. The game picture is presented at a cloud game client, which is mainly used for receiving and sending game data and presenting the game picture; for example, the cloud game client may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palm computer or a personal digital assistant, while the device executing the display control method of the game is the cloud server in the cloud. When a game is played, the user operates the cloud game client to send an operation instruction to the cloud server; the cloud server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game picture.
Referring to fig. 1a, fig. 1a is a system schematic diagram of a display control device of a game according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any computer device with computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play the multiplayer game online.
The embodiment of the application provides a display control method of a game, which can be executed by computer equipment. The computer equipment comprises a processor and a memory, wherein the memory is used for storing game data of each game user, and a computer program, and the processor executes each step of the display control method of the game by calling the computer program. The game may be any one of a leisure game, an action game, a role playing game, a strategy game, a sports game, a game for developing intelligence, a First Person Shooter (FPS) game, and the like. Wherein the game may include a virtual scene of the game drawn on a graphical user interface. In addition, one or more virtual characters, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Additionally, one or more obstacles, such as railings, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual character, e.g., to limit movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual character and one or more other virtual characters (such as enemy characters). In one embodiment, one or more other virtual characters are controlled by other players of the game. For example, one or more other virtual characters may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine engagement mode. For example, virtual characters possess various skills or abilities that a game user uses to achieve a goal. For example, a virtual character has one or more weapons, props, tools, etc. that can be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user. In addition, a plurality of game users can form a team to form a game team to match with other game users or game teams.
It should be noted that the system schematic diagram of the display control system of the game shown in fig. 1a is only an example, the display control system and the scene of the game described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not form a limitation on the technical solution provided in the embodiment of the present application, and as a person having ordinary skill in the art knows, the technical solution provided in the embodiment of the present application is also applicable to similar technical problems along with the evolution of the display control system of the game and the appearance of new service scenes.
In the present embodiment, the description will be made from the viewpoint of a display control device of a game that can be specifically integrated in a computer apparatus having a storage unit and a microprocessor mounted thereon and having an arithmetic capability.
Referring to fig. 1b, fig. 1b is a first flowchart illustrating a game display control method according to an embodiment of the present disclosure. The specific flow of the game display control method provided by the embodiment of the application can be as follows:
101. and displaying a first virtual scene picture in the graphical user interface, wherein the first virtual scene picture is a virtual scene picture determined according to the position of the virtual character of the player in the virtual scene.
In the embodiment of the application, a graphical user interface of a game is displayed on a user terminal, wherein the game comprises a virtual scene and virtual characters located in the virtual scene, and the virtual characters comprise a player virtual character and a first virtual character in the same camp as the player virtual character. The player virtual character and the first virtual character in the embodiments of the present application are both virtual characters controlled by game users. Being in the same camp here means that the player virtual character and the first virtual character constitute one game team, and the number of first virtual characters may be one or more.
The player virtual character and the first virtual character are relative concepts. For example, suppose game user A (corresponding to virtual character ①), game user B (corresponding to virtual character ②) and game user C (corresponding to virtual character ③) form a team and play together. In this game team, for game user A, the virtual character ① operated by game user A is the player virtual character, and virtual character ② and virtual character ③ are both first virtual characters; for game user B, virtual character ② is the player virtual character, and virtual character ① and virtual character ③ are both first virtual characters. Therefore, the player virtual character may be the virtual character of any game user in a game team, and relative to the player virtual character, the other virtual characters in the game team (controlled by other game users) are the first virtual characters.
Taking virtual character ① as the player virtual character as an example, a graphical user interface of the game is displayed on the user terminal of game user A, wherein the game comprises a virtual scene and virtual characters located in the virtual scene, and the virtual characters comprise the player virtual character and a first virtual character in the same camp as the player virtual character.
A game user logs in to the game application program on the user terminal through an account, and the user terminal, by executing the game application program, renders and generates a graphical user interface on the touch display screen; the virtual scene on the graphical user interface includes the game scene, and the game scene includes virtual characters. When the game user controls the corresponding virtual character through interaction with the graphical user interface to move through the game scene, battle against other players, and so on, the game picture may change in real time. The user terminal may acquire new game scene data from the server and display the corresponding virtual scene picture.
For game user A, a first virtual scene picture from game user A's viewpoint is displayed on the user terminal of game user A, and the first virtual scene picture is a virtual scene picture determined according to the position of the player virtual character in the virtual scene. For example, the first virtual scene picture displays the player virtual character at the center of the screen, together with the virtual scene around the position of the player virtual character. As the player virtual character moves in the virtual scene, the first virtual scene picture is always displayed with the player virtual character as the center of the game picture.
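As an illustration only (not from the patent), the player-centered rule described above can be sketched as follows; the 2D coordinate convention and the viewport size are assumptions.

    # Minimal sketch, assuming a 2D scene with a fixed viewport size;
    # not an implementation from the patent.
    def first_scene_viewport(player_pos, view_w=1920, view_h=1080):
        """Return the (x, y, width, height) of the area rendered as the first
        virtual scene picture, keeping the player virtual character centered."""
        px, py = player_pos
        return (px - view_w / 2, py - view_h / 2, view_w, view_h)

    # Example: as the player moves, the rendered area follows the player.
    print(first_scene_viewport((1000.0, 400.0)))   # (40.0, -140.0, 1920, 1080)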
102. And responding to a preset game event, providing an observation window in the graphical user interface, and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
After the player virtual character and the first virtual character are grouped, they may be in different areas of the virtual scene, and thus, for a game user operating the player virtual character, there may be a case where the first virtual character is not seen in the first virtual scene picture from the viewpoint thereof.
Therefore, when a preset game event occurs, the user terminal provides an observation window in the graphical user interface, and displays a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
For example, the user terminal receives data of the second virtual scene picture transmitted by the server, and displays the second virtual scene picture in the observation window according to the data. The second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene. For example, the second virtual scene picture includes a first virtual character and a virtual scene within a preset range of a position where the first virtual character is located. The effect achieved is that the first avatar and the position of the first avatar in the virtual scene are presented in the viewing window, and if the first avatar has only changed its orientation but has not changed its position information, the second virtual scene picture does not need to be updated. And when the position of the first virtual character in the virtual scene is changed, updating a second virtual scene picture displayed in the observation window according to the changed position.
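A minimal sketch of the update rule just described, refreshing the observation window only when the tracked character's position (not merely its orientation) changes; the class and field names below are assumptions for illustration, not identifiers from the patent.

    # Sketch only: refresh the second virtual scene picture on a position change,
    # ignore orientation-only changes (per the behavior described above).
    class ObservationWindow:
        def __init__(self, tracked_character_id):
            self.tracked_character_id = tracked_character_id
            self.last_position = None

        def needs_refresh(self, position, orientation):
            """position: (x, y) of the first virtual character; orientation is unused here."""
            if position != self.last_position:
                self.last_position = position
                return True    # position changed: update the displayed picture
            return False       # orientation-only change: keep the current picture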
In an embodiment, the second virtual scene picture is a virtual scene picture determined according to the position and orientation of the first virtual character in the virtual scene.
In this embodiment, the second virtual scene picture may be the virtual scene picture displayed on the user terminal of the game user corresponding to the first virtual character; that is, the game picture viewed by the game user operating the first virtual character is displayed in real time in the observation window on the graphical user interface of the user terminal of the game user operating the player virtual character. The display content of the observation window may be updated in real time as that game picture is updated; that is, when the position or orientation of the first virtual character changes, the second virtual scene picture changes accordingly, so that the game user operating the player virtual character can learn the teammate's situation in real time.
Referring to fig. 1c, fig. 1c is a schematic view of a first game screen of a display control method of a game according to an embodiment of the present application. Still take the example in which game user A (corresponding to virtual character ①), game user B (corresponding to virtual character ②) and game user C (corresponding to virtual character ③) form a team and play together. In this scenario, game user A corresponds to the player virtual character, and game user B and game user C correspond to first virtual characters. Fig. 1c is the graphical user interface on the user terminal of the game user operating the player virtual character: a first virtual scene picture from that game user's viewpoint is displayed, the virtual character located at the center of the picture is the player virtual character, and the game user operating the player virtual character can control the player virtual character to move in the virtual scene. For the player virtual character, virtual character ② and virtual character ③ corresponding to the other two game users in the game team are both first virtual characters. Taking game user C's virtual character ③ as an example, when a preset game event occurs, the user terminal corresponding to the player virtual character provides an observation window on the graphical user interface, and a virtual scene picture determined according to the position of game user C's virtual character in the virtual scene is displayed in the observation window in real time.
It can be understood that, when game user A plays the game, the main game picture of game user A still needs to display the player virtual character ① and the virtual scene where the player virtual character is located, so the area occupied by the observation window on the graphical user interface is generally small, and in practical applications the size of the observation window can be set as required. In addition, parameters such as the display position, shape and border color of the observation window can be set according to actual needs; for example, the observation window can be displayed on the game interface in the form of a small window, and its shape can be a square, a circle and so on.
There are various embodiments of the preset game event described above. A few of them are described below as examples.
The operation of judging whether the preset game event occurs can be executed in the server or the user terminal. And once the preset game event occurs, the user terminal responds to the preset game event and provides a viewing window on the graphical user interface, and the second game scene picture is displayed in the viewing window. Wherein, the data of the second game scene picture can be sent to the user terminal by the server.
In one embodiment, the predetermined game event is that the first virtual character is in a fighting state, and/or the distance between the first virtual character and the player virtual character in the virtual scene is greater than a predetermined distance threshold.
In this embodiment, the preset game event may be that the first virtual character is in a fighting state. For example, the first virtual character may be considered to be in a fighting state when it is attacked by other virtual characters in the game, or when it actively attacks other virtual characters, and so on. When the event that the first virtual character is in a fighting state occurs, the user terminal acquires the second virtual scene picture, for example by receiving it from the server, provides an observation window on the graphical user interface, and displays the second virtual scene picture in the observation window, so that the game user can learn the fighting situation of the first virtual character in time and can provide help for the first virtual character in time.
Alternatively, the preset game event may be that the distance between the first virtual character and the player virtual character in the virtual scene is greater than a preset distance threshold. When the first virtual character is far away from the player virtual character and needs the player virtual character's help, the game user operating the player virtual character can learn the situation of the first virtual character in time through the observation window, and can then control the player virtual character to provide help for the first virtual character in time.
Alternatively, the preset game event may be that the first virtual character is in a fighting state and the distance between the first virtual character and the player virtual character in the virtual scene is greater than a preset distance threshold. That is, it can be determined that the preset game event occurs when the first virtual character both is in a fighting state and is farther from the player virtual character in the virtual scene than the preset distance threshold.
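The "and/or" condition above can be expressed as a simple predicate; the following sketch is illustrative only, and the threshold value and field names are assumptions rather than values from the patent.

    import math

    PRESET_DISTANCE_THRESHOLD = 50.0   # assumed value, in scene units

    def preset_game_event_occurred(first_char, player_char, require_both=False):
        """first_char / player_char: dicts with 'pos' = (x, y) and 'in_battle' = bool."""
        far = math.dist(first_char["pos"], player_char["pos"]) > PRESET_DISTANCE_THRESHOLD
        fighting = first_char["in_battle"]
        # "and/or": either condition alone, or both combined, depending on configuration
        return (fighting and far) if require_both else (fighting or far)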
It can be understood that, in the above application scenarios, if only some of the first virtual characters belonging to the same camp as the player virtual character satisfy the preset game event, only the observation windows corresponding to the first virtual characters satisfying the preset game event need to be displayed.
In an embodiment, the preset game event may also be that a relationship between a game user operating the first virtual character and a game user operating the player virtual character in the game belongs to a preset relationship.
In this embodiment, if the relationship in the game between the game user operating the first virtual character and the game user operating the player virtual character belongs to a preset relationship, it is determined that a preset game event occurs, and the user terminal of the game user operating the player virtual character displays the observation window in response to the preset game event. For example, the preset relationship may be a couple relationship, a teacher-apprentice relationship, or the like, and whether the relationship in the game between the two game users belongs to the preset relationship may be determined according to the information of the game users in the game.
In an embodiment, the preset game event may also be that the interactive information issued by the first virtual character through the graphical user interface includes preset help seeking information.
In this embodiment, the interactive information may be a dialog message sent by the first virtual character in a dialog box on the graphical user interface; for example, if it is detected that the dialog message contains a specific keyword, it may be determined that a preset game event occurs. For example, if the dialog message contains keywords related to seeking help, such as "help", it may be determined that a preset game event occurs. The operation of detecting the keywords can be executed by the server or by the user terminal.
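A simple keyword check of the kind described above is sketched below; the keyword list is an assumption for illustration and would normally be configurable.

    HELP_KEYWORDS = ("help", "save me", "assist")   # assumed preset help-seeking keywords

    def contains_help_request(message):
        text = message.lower()
        return any(keyword in text for keyword in HELP_KEYWORDS)

    # Example: a chat message from the first virtual character triggers the event.
    print(contains_help_request("Help! Two enemies on me"))   # True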
In the above embodiments, when a preset game event occurs, second virtual scene pictures corresponding to all of the first virtual characters may be displayed in corresponding observation windows, or only the second virtual scene picture of the first virtual character to which the preset game event corresponds may be displayed in the corresponding observation window.
In one embodiment, the step of providing an observation window in the graphical user interface and displaying the second virtual scene picture in the observation window in response to the preset game event may include: in response to a preset game event, determining a target virtual character corresponding to the preset game event from the first virtual characters; providing observation windows on the graphical user interface according to the number of the target virtual characters, wherein one observation window corresponds to one target virtual character; and for each observation window, displaying a second virtual scene picture in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character corresponding to the observation window.
In this embodiment, when a preset game event occurs, the target virtual characters corresponding to the preset game event are determined from all the first virtual characters. Taking the preset game event being that the first virtual character is in a fighting state as an example, suppose there are three first virtual characters and only one of them is in a fighting state, that is, the preset game event occurs only for that one first virtual character; that first virtual character is determined as the target virtual character. After the target virtual characters are determined, observation windows are provided on the graphical user interface according to the number of the target virtual characters, wherein one observation window corresponds to one target virtual character; and for each observation window, a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character corresponding to the observation window.
Still taking the foregoing scene as an example, of game user A's two teammates only the virtual character of game user B is in the fighting state, and fig. 1c shows the effect of displaying, on the user terminal of game user A, the second virtual scene picture corresponding to game user B.
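A sketch of the step above, illustrative only: filter the first virtual characters down to those affected by the preset game event and open one observation window per target. The helper names and data layout are assumptions.

    def open_observation_windows(first_character_ids, event_occurred):
        """event_occurred(char_id) -> bool tells whether the preset game event
        applies to that first virtual character."""
        targets = [c for c in first_character_ids if event_occurred(c)]
        # one observation window per target virtual character
        return {target: {"tracks": [target]} for target in targets}

    # Example: only character "B" is in a fighting state, so one window is opened.
    print(open_observation_windows(["B", "C"], lambda c: c == "B"))   # {'B': {'tracks': ['B']}}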
In one embodiment, the step of providing an observation window in the graphical user interface and displaying the second virtual scene picture in the observation window in response to the preset game event may include: in response to a preset game event, determining target virtual characters corresponding to the preset game event from the first virtual characters; acquiring the distances between the target virtual characters, and determining two or more target virtual characters whose distances are smaller than a preset threshold as a target virtual character group; and providing observation windows on the graphical user interface according to the number of target virtual character groups and the number of target virtual characters that are not grouped into any target virtual character group, wherein one observation window corresponds to one target virtual character or one target virtual character group, and for each observation window, displaying a second virtual scene picture in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character or target virtual character group corresponding to the observation window.
In this embodiment, when a preset game event occurs, the target virtual characters corresponding to the preset game event are determined from all the first virtual characters. Taking the preset game event being that the first virtual character is in a fighting state as an example, suppose there are four first virtual characters and three of them are in a fighting state, that is, the preset game event occurs only for those three first virtual characters; those three are determined as target virtual characters. The distance between every two of the three target virtual characters is calculated, and two or more target virtual characters whose distances are smaller than a preset threshold are determined as a target virtual character group. For example, according to the distances between the three target virtual characters, two of them are determined as a target virtual character group, and the remaining target virtual character is not grouped into any target virtual character group. Observation windows are then provided on the graphical user interface according to the number of target virtual character groups and the number of target virtual characters not grouped into any group, that is, two observation windows are provided: the target virtual character group corresponds to one observation window, and the target virtual character not grouped into any group corresponds to the other observation window. Referring to fig. 1d, fig. 1d is a schematic diagram of a second game screen of a display control method of a game according to an embodiment of the present application. Two observation windows are shown on the graphical user interface of fig. 1d, in which two virtual characters form a target virtual character group corresponding to one observation window, and another virtual character, which is not grouped into the target virtual character group, corresponds to the other observation window. For each observation window, a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character or target virtual character group corresponding to the observation window.
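One way to realize the grouping rule above, offered purely as an illustration, is to treat "pairwise distance below the preset threshold" as a link and merge linked targets into one group; the threshold value and data layout are assumptions.

    import math

    def group_target_characters(positions, threshold=30.0):
        """positions: {character_id: (x, y)} for the target virtual characters.
        Returns a list of groups; a singleton list is a target not merged into any
        target virtual character group, and each returned entry gets its own
        observation window."""
        ids = list(positions)
        parent = {i: i for i in ids}

        def find(i):
            # union-find with path compression
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for index, a in enumerate(ids):
            for b in ids[index + 1:]:
                if math.dist(positions[a], positions[b]) < threshold:
                    parent[find(a)] = find(b)   # merge the two groups

        groups = {}
        for i in ids:
            groups.setdefault(find(i), []).append(i)
        return list(groups.values())

    # Example: characters "B" and "C" are close together, "D" is far away.
    print(group_target_characters({"B": (0, 0), "C": (10, 0), "D": (200, 0)}))
    # [['B', 'C'], ['D']]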
In an embodiment, after displaying the first virtual scene screen in the graphical user interface, the method further comprises: in response to a preset game event, the first virtual character is marked as a fighting state on a small map of the graphical user interface, and the preset game event is that the first virtual character is in the fighting state.
In this embodiment, while the second virtual scene picture is displayed in the observation window, the user terminal may further mark the first virtual character as being in a fighting state on a minimap of the graphical user interface in response to the preset game event. The user terminal displays a minimap on the graphical user interface, and the minimap, centered on the position of the player virtual character, displays that position and a certain range of the area around it. At the same time, the position of the first virtual character can also be marked on the minimap. When the preset game event is that the first virtual character is in a fighting state, the first virtual character in the fighting state can be marked as such on the minimap. Referring to fig. 1e, fig. 1e is a schematic view of a third game screen of a display control method of a game according to an embodiment of the present application. In the figure, virtual character ② is marked as being in a fighting state by a red-light special effect. Alternatively, in other embodiments, the fighting state may also be marked by a light effect of another color.
In one embodiment, marking the first virtual character as a battle state on a minimap of the graphical user interface in response to a preset game event includes: responding to a preset game event, and determining a marking parameter corresponding to a first virtual character in a fighting state; and marking the first virtual character as a fighting state according to the marking parameters on a small map of the graphical user interface.
In this embodiment, for different first virtual characters, different marking parameters may be displayed on the minimap, wherein the marking parameters include the shape, color, special effect, and the like of the mark of the battle state in the minimap.
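An illustrative mapping from each first virtual character to its battle-state mark parameters on the minimap is sketched below; the concrete shapes, colors and effects are assumptions, not values from the patent.

    # Assumed per-character mark parameters (shape, color, special effect).
    MARK_PARAMS = {
        "B": {"shape": "dot", "color": "red", "effect": "glow"},
        "C": {"shape": "dot", "color": "orange", "effect": "pulse"},
    }
    DEFAULT_MARK = {"shape": "dot", "color": "red", "effect": "glow"}

    def update_minimap_mark(minimap_marks, character_id, in_battle):
        """minimap_marks: {character_id: mark parameters currently shown on the minimap}."""
        if in_battle:
            minimap_marks[character_id] = MARK_PARAMS.get(character_id, DEFAULT_MARK)
        else:
            minimap_marks.pop(character_id, None)   # clear the mark once the battle ends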
In some embodiments, different observation windows may be displayed according to different display parameters, including but not limited to the shape of the observation window, the border color, and so on. For example, the border color of one observation window may be red and the border color of another observation window may be blue.
In one embodiment, after providing an observation window in the graphical user interface in response to a preset game event and displaying the second virtual scene picture in the observation window, the method further includes: in response to a virtual item viewing operation triggered through the graphical user interface, displaying a virtual item list on the graphical user interface, the virtual item list including item identifiers of virtual items held by the player virtual character; in response to a drag operation on an item identifier, determining a first target observation window where the end position of the drag operation is located; and giving the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window.
In this embodiment, the game user operating the player virtual character may further trigger a virtual item viewing operation through the graphical user interface, and the user terminal, in response to the virtual item viewing operation, displays a virtual item list on the graphical user interface, the virtual item list including the item identifiers of the virtual items held by the player virtual character. The game user can also give a virtual item to the first virtual character corresponding to an observation window by dragging the item identifier corresponding to that virtual item onto the observation window. When the user terminal detects a drag operation based on an item identifier, it determines the first target observation window where the end position of the drag operation is located, and gives the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window.
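A sketch of the drag-and-give flow above, for illustration only: find the observation window whose on-screen rectangle contains the end position of the drag operation, then give the dragged virtual item to that window's first virtual character. The data structures and helper names are assumptions.

    def window_at(windows, point):
        """windows: {window_id: {"rect": (x, y, w, h), "character": char_id}};
        point: (x, y) end position of the drag operation."""
        px, py = point
        for window_id, info in windows.items():
            x, y, w, h = info["rect"]
            if x <= px <= x + w and y <= py <= y + h:
                return window_id
        return None

    def on_drag_end(windows, item_id, end_position, give_item):
        """give_item(item_id, character_id) applies the item to the character,
        e.g. healing when the dragged item is a virtual drug."""
        first_target_window = window_at(windows, end_position)
        if first_target_window is not None:
            give_item(item_id, windows[first_target_window]["character"])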
In one embodiment, the target virtual item is a virtual drug, and giving the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window includes: treating the first virtual character corresponding to the first target observation window based on the target virtual item corresponding to the dragged item identifier.
In this embodiment, the target virtual item may be a virtual drug. For example, the game user operating the player virtual character may notice from the observation window that a first virtual character is under attack and that its virtual life value has dropped significantly, and may then give a virtual drug to that first virtual character. The user terminal treats the first virtual character corresponding to the first target observation window based on the target virtual item corresponding to the dragged item identifier, so as to raise the virtual life value of that first virtual character.
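As a minimal sketch of the treating step, assuming a numeric virtual life value with an upper limit (both assumptions of this example, not details given by the embodiment):

```python
# Illustrative sketch only: the numeric life value and its upper limit are assumptions.
def apply_virtual_drug(current_hp, max_hp, heal_amount):
    """Raise the recipient's virtual life value, capped at its maximum."""
    return min(max_hp, current_hp + heal_amount)


print(apply_virtual_drug(current_hp=35, max_hp=100, heal_amount=50))  # -> 85
```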
In one embodiment, after providing an observation window in the graphical user interface in response to a preset game event and displaying the second virtual scene picture in the observation window, the method further includes: in response to a window closing event corresponding to a second target observation window among the observation windows, closing the second target observation window on the graphical user interface.
In this embodiment, a window closing event may be triggered in some cases. When the user terminal detects a window closing event, it determines, from all the observation windows, the second target observation window corresponding to the window closing event and closes that second target observation window on the graphical user interface. For example, in some embodiments, the window closing event includes that the duration for which the first virtual character corresponding to the observation window has remained in the non-fighting state reaches a preset duration threshold. Alternatively, the window closing event may be the detection of a window closing operation triggered through a window closing control on the observation window.
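A minimal sketch of the duration-based closing check follows. The per-window timestamp of when the character last left the fighting state and the threshold value are assumptions of this example.

```python
# Illustrative sketch only: the per-window timestamp and the threshold value
# are assumptions of this example.
CLOSE_AFTER_SECONDS = 10.0  # preset duration threshold


def windows_to_close(windows, now):
    """Return ids of observation windows whose character has stayed out of the
    fighting state for at least the preset duration threshold."""
    return [w["id"] for w in windows
            if not w["in_battle"] and now - w["left_battle_at"] >= CLOSE_AFTER_SECONDS]


windows = [
    {"id": "w1", "in_battle": False, "left_battle_at": 100.0},
    {"id": "w2", "in_battle": True,  "left_battle_at": 0.0},
]
print(windows_to_close(windows, now=112.0))  # -> ['w1']
```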
In specific implementations, the present application is not limited to the execution sequence of the described steps; where no conflict arises, some steps may be performed in other sequences or simultaneously.
From the above, in the display control method for a game provided in the embodiments of the present application, a graphical user interface of the game is displayed at a user terminal, the game including a virtual scene and virtual characters located in the virtual scene, and the virtual characters including a player virtual character and a first virtual character in the same battle as the player virtual character. The graphical user interface displays a first virtual scene picture, which is a virtual scene picture determined according to the position of the player virtual character in the virtual scene. In response to a preset game event, an observation window is provided in the graphical user interface, and a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position of the first virtual character in the virtual scene. In this way, the game user corresponding to the user terminal can learn the situation of other teammates in the game team in time and, when a preset game event occurs, provide assistance to those teammates promptly.
In one embodiment, a display control device for a game is also provided. Referring to fig. 2, fig. 2 is a schematic structural diagram of a display control device 300 of a game according to an embodiment of the present application. The display control device 300 of the game is applied to a user terminal, and a graphical user interface of the game is displayed on the user terminal, wherein the game includes a virtual scene and virtual characters located in the virtual scene, the virtual characters include a player virtual character and a first virtual character in the same battle as the player virtual character, and the display control device 300 of the game includes a first display module 301 and a second display module 302, as follows:
a first display module 301, configured to display a first virtual scene picture in the graphical user interface, where the first virtual scene picture is a virtual scene picture determined according to a position of the player virtual character in the virtual scene;
a second display module 302, configured to provide an observation window in the graphical user interface in response to a preset game event, and display a second virtual scene picture in the observation window, where the second virtual scene picture is a virtual scene picture determined according to a position of the first virtual character in the virtual scene.
In some embodiments, the second virtual scene picture is a virtual scene picture determined according to the position and orientation of the first virtual character in the virtual scene.
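One way to obtain a picture determined by both position and orientation is a chase-style observation camera placed relative to the first virtual character. The following sketch is illustrative only; the offsets and the yaw-based facing model are assumptions not taken from the embodiment.

```python
# Illustrative sketch only: the chase-camera offsets and the yaw-based facing
# model are assumptions, not taken from the embodiment.
import math


def observation_camera(char_x, char_y, char_z, yaw_degrees, distance=6.0, height=3.0):
    """Place a camera behind and above the first virtual character, looking the
    way the character faces, so the window tracks both position and orientation."""
    yaw = math.radians(yaw_degrees)
    forward_x, forward_y = math.cos(yaw), math.sin(yaw)
    return {
        "position": (char_x - forward_x * distance,
                     char_y - forward_y * distance,
                     char_z + height),
        "look_at": (char_x, char_y, char_z),
    }


print(observation_camera(10.0, 20.0, 0.0, yaw_degrees=90.0))
```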
In some embodiments, the second display module 302 is further configured to determine, in response to a preset game event, a target virtual character corresponding to the preset game event from the first virtual character;
providing observation windows on the graphical user interface according to the number of the target virtual characters, where one observation window corresponds to one target virtual character;
and for each observation window, displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the target virtual character corresponding to the observation window in the virtual scene.
In some embodiments, the second display module 302 is further configured to determine, in response to a preset game event, a target virtual character corresponding to the preset game event from the first virtual character;
acquiring the distances between the target virtual characters, and determining two or more target virtual characters whose mutual distances are smaller than a preset threshold as a target virtual character group;
and providing observation windows on the graphical user interface according to the number of target virtual character groups and the number of target virtual characters not grouped into any target virtual character group, where one observation window corresponds to one target virtual character or one target virtual character group; for each observation window, a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character or target virtual character group corresponding to that observation window.
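The grouping step described above can be sketched, for illustration only, as a distance-threshold clustering over the target virtual characters; the union-find approach, the character records, and the threshold value below are assumptions of this example rather than details fixed by the embodiment.

```python
# Illustrative sketch only: the union-find clustering, the character records and
# the threshold value are assumptions made for this example.
import math


def group_targets(targets, threshold):
    """Merge target virtual characters whose pairwise distance is below the
    threshold; each resulting group (possibly of size one) gets one window."""
    parent = list(range(len(targets)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(targets)):
        for j in range(i + 1, len(targets)):
            (xi, yi), (xj, yj) = targets[i]["pos"], targets[j]["pos"]
            if math.hypot(xi - xj, yi - yj) < threshold:
                parent[find(i)] = find(j)

    groups = {}
    for i, target in enumerate(targets):
        groups.setdefault(find(i), []).append(target["name"])
    return list(groups.values())


targets = [{"name": "A", "pos": (0, 0)}, {"name": "B", "pos": (3, 0)},
           {"name": "C", "pos": (50, 50)}]
print(group_targets(targets, threshold=10))  # -> [['A', 'B'], ['C']]
```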
In some embodiments, the preset game event is that the first virtual character is in a fighting state, and/or that the distance between the first virtual character and the player virtual character in the virtual scene is greater than a preset distance threshold.
In some embodiments, the preset game event is that a relationship between a game user operating the first virtual character and a game user operating the player virtual character in the game belongs to a preset relationship.
In some embodiments, the preset game event is that the interactive information issued by the first virtual character through the graphical user interface includes preset help seeking information.
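Taken together, the preset game events described in the embodiments above amount to a set of predicates evaluated per teammate. The following sketch is illustrative only; the field names, keyword list, relationship set, and threshold are assumptions, not values given by the embodiments.

```python
# Illustrative sketch only: the field names, keyword list, relationship set and
# distance threshold are assumptions, not values given by the embodiments.
import math

HELP_KEYWORDS = ("help", "save me")            # preset help seeking information
PRESET_RELATIONS = {"friend", "guild_member"}  # preset relationship set
DISTANCE_THRESHOLD = 200.0                     # preset distance threshold


def preset_event_triggered(teammate, player_pos):
    """Return True if any of the preset game events holds for this teammate."""
    dx = teammate["pos"][0] - player_pos[0]
    dy = teammate["pos"][1] - player_pos[1]
    far_away = math.hypot(dx, dy) > DISTANCE_THRESHOLD
    asked_for_help = any(k in teammate["last_message"].lower() for k in HELP_KEYWORDS)
    related = teammate["relation"] in PRESET_RELATIONS
    return teammate["in_battle"] or far_away or related or asked_for_help


teammate = {"pos": (500.0, 0.0), "in_battle": False,
            "relation": "stranger", "last_message": "on my way"}
print(preset_event_triggered(teammate, player_pos=(0.0, 0.0)))  # True: too far away
```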
In some embodiments, the second display module 302 is further configured to mark, in response to the preset game event, the first virtual character as being in a fighting state on a small map of the graphical user interface.
In some embodiments, the second display module 302 is further configured to determine, in response to the preset game event, a marking parameter corresponding to the first virtual character in the fighting state;
and marking the first virtual character as a fighting state according to the marking parameters on a small map of the graphical user interface.
In some embodiments, the first display module 301 is further configured to display a virtual item list on the graphical user interface in response to a virtual item viewing operation triggered by the graphical user interface, where the virtual item list includes item identifiers of virtual items held by the player virtual character;
the second display module 302 is further configured to determine, in response to a drag operation on an item identifier, a first target observation window where an end position of the drag operation is located; and
the apparatus 300 further comprises:
an item giving module, configured to give the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window.
In some embodiments, the target virtual item is a virtual drug, and the item giving module is further configured to treat the first virtual character corresponding to the first target observation window based on the target virtual item corresponding to the dragged item identifier.
In some embodiments, the second display module 302 is further configured to close a second target observation window on the graphical user interface in response to a window closing event corresponding to the second target observation window.
In some embodiments, the window closing event includes that the duration for which the first virtual character corresponding to the observation window has remained in the non-fighting state reaches a preset duration threshold.
It should be noted that the display control device for a game provided in the embodiments of the present application and the display control method for a game in the foregoing embodiments belong to the same concept. Any method provided in the display control method for a game may be implemented by the display control device for a game; the specific implementation process is described in detail in the method embodiments and is not repeated here.
As can be seen from the above, the display control device for a game provided in the embodiments of the present application displays a graphical user interface of the game on a user terminal, the game including a virtual scene and virtual characters located in the virtual scene, and the virtual characters including a player virtual character and a first virtual character in the same battle as the player virtual character. A first virtual scene picture is displayed on the graphical user interface, the first virtual scene picture being a virtual scene picture determined according to the position of the player virtual character in the virtual scene. In response to a preset game event, an observation window is provided in the graphical user interface, and a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position of the first virtual character in the virtual scene. In this way, the game user corresponding to the user terminal can learn the situation of other teammates in the game team in real time and, when a preset game event occurs, provide assistance to those teammates promptly.
The embodiments of the present application further provide a computer device. The computer device may be a user terminal, such as a smartphone, a tablet computer, a notebook computer, a touch-screen device, a game console, a personal computer (PC), or a personal digital assistant (PDA). As shown in fig. 3, fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor 401. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device configuration illustrated in the figure does not limit the computer device, which may include more or fewer components than those illustrated, combine some components, or use a different arrangement of components.
The processor 401 is the control center of the computer device 400 and connects the various parts of the entire computer device 400 through various interfaces and lines. By running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, it performs the various functions of the computer device 400 and processes data, thereby monitoring the computer device 400 as a whole.
In this embodiment, a graphical user interface of the game is displayed on the computer device 400, where the game includes a virtual scene and virtual characters located in the virtual scene, the virtual characters include a player virtual character and a first virtual character in the same battle as the player virtual character, and the processor 401 in the computer device 400 loads instructions corresponding to processes of one or more applications into the memory 402 according to the following steps, and the processor 401 runs the applications stored in the memory 402, so as to implement various functions:
displaying a first virtual scene picture in the graphical user interface, wherein the first virtual scene picture is a virtual scene picture determined according to the position of the player virtual character in the virtual scene;
and responding to a preset game event, providing an observation window in the graphical user interface, and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 3, the computer device 400 further includes a touch display screen 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407. Those skilled in the art will appreciate that the computer device configuration illustrated in fig. 3 does not limit the computer device, which may include more or fewer components than those illustrated, combine some components, or use a different arrangement of components.
The touch display screen 403 may be used for displaying the graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, which are executed by corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 401, and can receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiments of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to realize input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions; that is, the touch display screen 403 may also serve as a part of the input unit 406 to implement an input function.
The radio frequency circuit 404 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or another computer device, and to exchange signals with that network device or computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 405 receives and converts into audio data. The audio data is then processed by the processor 401 and, for example, sent to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to allow a peripheral headset to communicate with the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to supply power to the various components of the computer device 400. Optionally, the power supply 407 may be logically connected to the processor 401 through a power management system, so that charging, discharging, and power consumption management functions are implemented through the power management system. The power supply 407 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, or any other such component.
Although not shown in fig. 3, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment displays a graphical user interface of a game, the game including a virtual scene and virtual characters located in the virtual scene, and the virtual characters including a player virtual character and a first virtual character in the same battle as the player virtual character. A first virtual scene picture is displayed on the graphical user interface, the first virtual scene picture being a virtual scene picture determined according to the position of the player virtual character in the virtual scene. In response to a preset game event, an observation window is provided in the graphical user interface, and a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position of the first virtual character in the virtual scene. In this way, the game user corresponding to the user terminal can learn the situation of other teammates in the game team in time and, when a preset game event occurs, provide assistance to those teammates promptly.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the embodiments of the present application provide a computer-readable storage medium in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any display control method for a game provided in the embodiments of the present application. For example, the computer program may display a graphical user interface of the game on a user terminal, where the game includes a virtual scene and virtual characters located in the virtual scene, the virtual characters including a player virtual character and a first virtual character in the same battle as the player virtual character, and execute the following steps:
displaying a first virtual scene picture in the graphical user interface, wherein the first virtual scene picture is a virtual scene picture determined according to the position of the player virtual character in the virtual scene;
and responding to a preset game event, providing an observation window in the graphical user interface, and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like. Since the computer program stored in the storage medium can execute the steps in any display control method for a game provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any such method, as detailed in the foregoing embodiments and not repeated here.
The foregoing describes in detail a display control method, device, storage medium, and computer device for a game provided in the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (16)

1. A display control method for a game, wherein a graphical user interface of the game is displayed on a user terminal, wherein the game includes a virtual scene and virtual characters located in the virtual scene, and the virtual characters include a player virtual character and a first virtual character in a same battle as the player virtual character, the method comprising:
displaying a first virtual scene picture in the graphical user interface, wherein the first virtual scene picture is a virtual scene picture determined according to the position of the player virtual character in the virtual scene;
and responding to a preset game event, providing an observation window in the graphical user interface, and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
2. The method of claim 1, wherein the second virtual scene picture is a virtual scene picture determined according to a position and an orientation of the first virtual character in the virtual scene.
3. The method of claim 1, wherein said providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window in response to a preset game event comprises:
responding to a preset game event, and determining a target virtual character corresponding to the preset game event from the first virtual character;
providing observation windows on the graphical user interface according to the number of the target virtual roles, wherein one observation window corresponds to one target virtual role;
and for each observation window, displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the target virtual character corresponding to the observation window in the virtual scene.
4. The method of claim 1, wherein said providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window in response to a preset game event comprises:
responding to a preset game event, and determining a target virtual character corresponding to the preset game event from the first virtual character;
acquiring the distances between the target virtual characters, and determining two or more target virtual characters whose mutual distances are smaller than a preset threshold as a target virtual character group;
and providing observation windows on the graphical user interface according to the number of target virtual character groups and the number of target virtual characters not grouped into any target virtual character group, wherein one observation window corresponds to one target virtual character or one target virtual character group, and for each observation window, a second virtual scene picture is displayed in the observation window, the second virtual scene picture being a virtual scene picture determined according to the position, in the virtual scene, of the target virtual character or target virtual character group corresponding to that observation window.
5. The method of claim 1, wherein the preset game event is that the first virtual character is in a fighting state and/or that the distance between the first virtual character and the player virtual character in the virtual scene is greater than a preset distance threshold.
6. The method of claim 1, wherein the preset game event is that a relationship in the game between a game user operating the first virtual character and a game user operating the player virtual character belongs to a preset relationship.
7. The method of claim 1, wherein the preset game event is that the interactive information issued by the first virtual character through the graphical user interface includes preset help seeking information.
8. The method of claim 1, wherein after displaying the first virtual scene picture in the graphical user interface, the method further comprises:
and responding to the preset game event, and marking the first virtual character as a fighting state on a small map of the graphical user interface, wherein the preset game event is that the first virtual character is in the fighting state.
9. The method of claim 8, wherein marking the first virtual character as a fighting state on a small map of the graphical user interface in response to the preset game event comprises:
responding to the preset game event, and determining a marking parameter corresponding to the first virtual character in the fighting state;
and marking the first virtual character as a fighting state according to the marking parameters on a small map of the graphical user interface.
10. The method of any of claims 1-7, wherein, after providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window in response to a preset game event, the method further comprises:
in response to a virtual item viewing operation triggered through the graphical user interface, displaying a virtual item list on the graphical user interface, the virtual item list including item identifiers of virtual items held by the player virtual character;
in response to a drag operation on an item identifier, determining a first target observation window where the end position of the drag operation is located;
and giving the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window.
11. The method of claim 10, wherein the target virtual item is a virtual drug;
the giving the target virtual item corresponding to the dragged item identifier to the first virtual character corresponding to the first target observation window comprises:
and treating the first virtual character corresponding to the first target observation window based on the target virtual item corresponding to the dragged item identifier.
12. The method of any of claims 1 to 7, wherein, after providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window in response to a preset game event, the method further comprises:
and responding to a window closing event corresponding to a second target observation window in the observation windows, and closing the second target observation window on the graphical user interface.
13. The method of claim 12, wherein the window closing event comprises that the duration for which the first virtual character corresponding to the observation window has remained in the non-fighting state is not lower than a preset duration threshold.
14. A display control apparatus for a game, wherein a graphical user interface of the game is displayed on a user terminal, wherein the game includes a virtual scene and virtual characters located in the virtual scene, and the virtual characters include a player virtual character and a first virtual character in a same battle as the player virtual character, the apparatus comprising:
a first display module, configured to display a first virtual scene picture in the graphical user interface, where the first virtual scene picture is a virtual scene picture determined according to a position of the player virtual character in the virtual scene;
and the second display module is used for responding to a preset game event, providing an observation window in the graphical user interface and displaying a second virtual scene picture in the observation window, wherein the second virtual scene picture is a virtual scene picture determined according to the position of the first virtual character in the virtual scene.
15. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when run on a computer, causes the computer to execute the display control method for a game according to any one of claims 1 to 13.
16. A computer device comprising a processor and a memory, the memory storing a computer program, wherein the processor is configured to execute the display control method of a game according to any one of claims 1 to 13 by calling the computer program.
CN202110808611.9A 2021-07-16 2021-07-16 Game display control method and device, storage medium and computer equipment Pending CN113398566A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110808611.9A CN113398566A (en) 2021-07-16 2021-07-16 Game display control method and device, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110808611.9A CN113398566A (en) 2021-07-16 2021-07-16 Game display control method and device, storage medium and computer equipment

Publications (1)

Publication Number Publication Date
CN113398566A true CN113398566A (en) 2021-09-17

Family

ID=77686772

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110808611.9A Pending CN113398566A (en) 2021-07-16 2021-07-16 Game display control method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN113398566A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335064A (en) * 2015-09-29 2016-02-17 腾讯科技(深圳)有限公司 Information processing method, terminal, and computer storage medium
US20200353355A1 (en) * 2018-04-17 2020-11-12 Tencent Technology (Shenzhen) Company Limited Information object display method and apparatus in virtual scene, and storage medium
CN109331468A (en) * 2018-09-26 2019-02-15 网易(杭州)网络有限公司 Display methods, display device and the display terminal at game visual angle
CN110624248A (en) * 2019-09-18 2019-12-31 网易(杭州)网络有限公司 Game control method, device, electronic equipment and storage medium
CN112619167A (en) * 2020-12-21 2021-04-09 网易(杭州)网络有限公司 Information processing method and device, computer equipment and medium
CN113069767A (en) * 2021-04-09 2021-07-06 腾讯科技(深圳)有限公司 Virtual interaction method, device, terminal and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023087903A1 (en) * 2021-11-19 2023-05-25 腾讯科技(深圳)有限公司 Game screen display method and apparatus, storage medium, and electronic device
WO2023173643A1 (en) * 2022-03-17 2023-09-21 网易(杭州)网络有限公司 Picture display method and apparatus, and electronic device and readable storage medium

Similar Documents

Publication Publication Date Title
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113082688A (en) Method and device for controlling virtual character in game, storage medium and equipment
CN113398566A (en) Game display control method and device, storage medium and computer equipment
CN113398590A (en) Sound processing method, sound processing device, computer equipment and storage medium
CN113082707A (en) Virtual object prompting method and device, storage medium and computer equipment
WO2024011894A1 (en) Virtual-object control method and apparatus, and storage medium and computer device
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN114159789A (en) Game interaction method and device, computer equipment and storage medium
CN115212572A (en) Control method and device of game props, computer equipment and storage medium
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN117160031A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
CN116870472A (en) Game view angle switching method and device, computer equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN116999835A (en) Game control method, game control device, computer equipment and storage medium
CN116115991A (en) Aiming method, aiming device, computer equipment and storage medium
CN116999825A (en) Game control method, game control device, computer equipment and storage medium
CN116271791A (en) Game control method, game control device, computer equipment and storage medium
CN115645912A (en) Game element display method and device, computer equipment and storage medium
CN116328315A (en) Virtual model processing method, device, terminal and storage medium based on block chain
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN117462949A (en) Game skill processing method, game skill processing device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination